SQLObject-3.4.0/0000755000175000017500000000000013141371614012655 5ustar phdphd00000000000000SQLObject-3.4.0/setup.cfg0000644000175000017500000000023113141371614014472 0ustar phdphd00000000000000[egg_info] tag_build = tag_date = 0 tag_svn_revision = 0 [flake8] exclude = .git,.tox,docs/europython/*.py ignore = E305 [bdist_wheel] universal = 1 SQLObject-3.4.0/scripts/0000755000175000017500000000000013141371614014344 5ustar phdphd00000000000000SQLObject-3.4.0/scripts/sqlobject-admin0000755000175000017500000000177112475125544017364 0ustar phdphd00000000000000#!/usr/bin/env python from __future__ import print_function import sys import os try: import pkg_resources pkg_resources.require('SQLObject>0.6.1') except (ImportError, pkg_resources.DistributionNotFound): # Oh well, we tried... pass try: import sqlobject.manager except ImportError: try: here = __file__ except NameError: here = sys.argv[0] updir = os.path.join( os.path.dirname(os.path.dirname(os.path.abspath(here))), 'sqlobject') if os.path.exists(updir): sys.path.insert(0, os.path.dirname(updir)) else: print('I cannot find the sqlobject module') print('If SQLObject is installed, you may need to set $PYTHONPATH') sys.exit(3) # Now we have to get rid of possibly stale modules from that import # up there for name, value in sys.modules.items(): if name.startswith('sqlobject'): del sys.modules[name] from sqlobject.manager import command command.the_runner.run(sys.argv) SQLObject-3.4.0/scripts/sqlobject-convertOldURI0000755000175000017500000000064112475125544020766 0ustar phdphd00000000000000#!/usr/bin/env python from __future__ import print_function import sys try: uri = sys.argv[1] except IndexError: sys.exit("Usage: %s old-style-URI" % sys.argv[0]) try: import pkg_resources pkg_resources.require('SQLObject>=1.0.0') except (ImportError, pkg_resources.DistributionNotFound): pass from sqlobject import connectionForURI conn = connectionForURI(uri, oldUri=True) print(conn.uri()) SQLObject-3.4.0/README.rst0000644000175000017500000000101613141371273014343 0ustar phdphd00000000000000SQLObject 3.4.0 =============== Thanks for looking at SQLObject. SQLObject is an object-relational mapper, i.e., a library that will wrap your database tables in Python classes, and your rows in Python instances. It currently supports MySQL through the `MySQLdb` package, PostgreSQL through the `psycopg` package, SQLite, Firebird, MaxDB (SAP DB), MS SQL, Sybase and Rdbhost. Python 2.7 or 3.4+ is required. For more information please see the documentation in ``_, or online at http://sqlobject.org/ SQLObject-3.4.0/ANNOUNCE.rst0000644000175000017500000000612013141371273014615 0ustar phdphd00000000000000Hello! I'm pleased to announce version 3.4.0, the first stable release of branch 3.4 of SQLObject. What's new in SQLObject ======================= Contributor for this release is Dr. Neil Muller. Features -------- * Python 2.6 is no longer supported. The minimal supported version is Python 2.7. Drivers (work in progress) -------------------------- * Encode binary values for py-postgresql driver. This fixes the last remaining problems with the driver. * Encode binary values for PyGreSQL driver using the same encoding as for py-postgresql driver. This fixes the last remaining problems with the driver. Our own encoding is needed because unescape_bytea(escape_bytea()) is not idempotent. 
See the comment for PQunescapeBytea at https://www.postgresql.org/docs/9.6/static/libpq-exec.html: This conversion is not exactly the inverse of PQescapeBytea, because the string is not expected to be "escaped" when received from PQgetvalue. In particular this means there is no need for string quoting considerations. * List all drivers in extras_require in setup.py. Minor features -------------- * Use base64.b64encode/b64decode instead of deprecated encodestring/decodestring. Tests ----- * Fix a bug with sqlite-memory: rollback transaction and close connection. The solution was found by Dr. Neil Muller. * Use remove-old-files.py from ppu to cleanup pip cache at Travis and AppVeyor. * Add test_csvimport.py more as an example how to use load_csv from sqlobject.util.csvimport. For a more complete list, please see the news: http://sqlobject.org/News.html What is SQLObject ================= SQLObject is an object-relational mapper. Your database tables are described as classes, and rows are instances of those classes. SQLObject is meant to be easy to use and quick to get started with. SQLObject supports a number of backends: MySQL, PostgreSQL, SQLite, Firebird, Sybase, MSSQL and MaxDB (also known as SAPDB). Python 2.7 or 3.4+ is required. Where is SQLObject ================== Site: http://sqlobject.org Development: http://sqlobject.org/devel/ Mailing list: https://lists.sourceforge.net/mailman/listinfo/sqlobject-discuss Download: https://pypi.python.org/pypi/SQLObject/3.4.0 News and changes: http://sqlobject.org/News.html StackOverflow: https://stackoverflow.com/questions/tagged/sqlobject Example ======= Create a simple class that wraps a table:: >>> from sqlobject import * >>> >>> sqlhub.processConnection = connectionForURI('sqlite:/:memory:') >>> >>> class Person(SQLObject): ... fname = StringCol() ... mi = StringCol(length=1, default=None) ... lname = StringCol() ... 
>>> Person.createTable() Use the object:: >>> p = Person(fname="John", lname="Doe") >>> p >>> p.fname 'John' >>> p.mi = 'Q' >>> p2 = Person.get(1) >>> p2 >>> p is p2 True Queries:: >>> p3 = Person.selectBy(lname="Doe")[0] >>> p3 >>> pc = Person.select(Person.q.lname=="Doe").count() >>> pc 1 SQLObject-3.4.0/tox.ini0000644000175000017500000004016713134764127014207 0ustar phdphd00000000000000[tox] minversion = 1.8 envlist = py27-{mysqldb,mysql-oursql},py{34,35,36}-{mysqlclient,pypostgresql},py{27,34,35,36}-{mysql-connector,pymysql,mysql-pyodbc,mysql-pypyodbc,postgres-psycopg,postgres-pygresql,postgres-pyodbc,postgres-pypyodbc,sqlite,sqlite-memory},py{27,34,35,36}-{firebird-fdb,firebirdsql},py{27,34}-flake8,py{27,34,35,36}-{mssql-pyodbc,mysql-connector,mysql-pyodbc,mysql-pypyodbc,postgres-psycopg,postgres-pyodbc,postgres-pypyodbc,sqlite,sqlite-memory}-w32 # Base test environment settings [testenv] # Ensure we cd into sqlobject before running the tests changedir = ./sqlobject/ basepython = py27: {env:TOXPYTHON:python2.7} py34: {env:TOXPYTHON:python3.4} py35: {env:TOXPYTHON:python3.5} py36: {env:TOXPYTHON:python3.6} commands = {envpython} --version {envpython} -c "import struct; print(struct.calcsize('P') * 8)" deps = -rdevscripts/requirements/requirements_tests.txt py27: egenix-mx-base mssql-pyodbc: pytest-timeout mysqldb: mysql-python mysqlclient: mysqlclient mysql-connector: mysql-connector <= 2.2.2 mysql-oursql: oursql pymysql: pymysql postgres-psycopg: psycopg2 postgres-pygresql: pygresql pypostgresql: py-postgresql pyodbc: pyodbc pypyodbc: pypyodbc firebird-fdb: fdb firebirdsql: firebirdsql passenv = CI TRAVIS TRAVIS_* APPVEYOR DISTUTILS_USE_SDK MSSdk INCLUDE LIB PGPASSWORD WINDIR # Don't fail or warn on uninstalled commands whitelist_externals = mysql createdb dropdb rm sudo isql-fb sqlcmd # MySQL test environments [mysqldb] commands = {[testenv]commands} -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=mysqldb&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py27-mysqldb] commands = {[mysqldb]commands} [mysqlclient] commands = {[testenv]commands} -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=mysqldb&charset=utf8&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py34-mysqlclient] commands = {[mysqlclient]commands} [testenv:py35-mysqlclient] commands = {[mysqlclient]commands} [testenv:py36-mysqlclient] commands = {[mysqlclient]commands} [mysql-connector] commands = {[testenv]commands} -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=connector&charset=utf8&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py27-mysql-connector] commands = {[mysql-connector]commands} [testenv:py34-mysql-connector] commands = {[mysql-connector]commands} [testenv:py35-mysql-connector] commands = {[mysql-connector]commands} [testenv:py36-mysql-connector] commands = {[mysql-connector]commands} [oursql] commands = {[testenv]commands} -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=oursql&charset=utf8&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py27-mysql-oursql] 
commands = {[oursql]commands} [pymysql] commands = {[testenv]commands} -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=pymysql&charset=utf8&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py27-pymysql] commands = {[pymysql]commands} [testenv:py34-pymysql] commands = {[pymysql]commands} [testenv:py35-pymysql] commands = {[pymysql]commands} [testenv:py36-pymysql] commands = {[pymysql]commands} [mysql-pyodbc] commands = {[testenv]commands} {envpython} -c "import pyodbc; print(pyodbc.drivers())" -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=pyodbc&odbcdrv=MySQL&charset=utf8&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py27-mysql-pyodbc] commands = {[mysql-pyodbc]commands} [testenv:py34-mysql-pyodbc] commands = {[mysql-pyodbc]commands} [testenv:py35-mysql-pyodbc] commands = {[mysql-pyodbc]commands} [testenv:py36-mysql-pyodbc] commands = {[mysql-pyodbc]commands} [mysql-pypyodbc] commands = {[testenv]commands} -mysql -uroot -e 'drop database sqlobject_test;' mysql -uroot -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:@localhost/sqlobject_test?driver=pypyodbc&odbcdrv=MySQL&charset=utf8&debug=1 mysql -uroot -e 'drop database sqlobject_test;' [testenv:py27-mysql-pypyodbc] commands = {[mysql-pypyodbc]commands} [testenv:py34-mysql-pypyodbc] commands = {[mysql-pypyodbc]commands} [testenv:py35-mysql-pypyodbc] commands = {[mysql-pypyodbc]commands} [testenv:py36-mysql-pypyodbc] commands = {[mysql-pypyodbc]commands} # PostgreSQL test environments [psycopg] commands = {[testenv]commands} -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D postgres://postgres:@localhost/sqlobject_test?driver=psycopg&charset=utf-8&debug=1 tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-psycopg] commands = {[psycopg]commands} [testenv:py34-postgres-psycopg] commands = {[psycopg]commands} [testenv:py35-postgres-psycopg] commands = {[psycopg]commands} [testenv:py36-postgres-psycopg] commands = {[psycopg]commands} [pygresql] commands = {[testenv]commands} -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D postgres://postgres:@localhost/sqlobject_test?driver=pygresql&charset=utf-8&debug=1 tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-pygresql] commands = {[pygresql]commands} [testenv:py34-postgres-pygresql] commands = {[pygresql]commands} [testenv:py35-postgres-pygresql] commands = {[pygresql]commands} [testenv:py36-postgres-pygresql] commands = {[pygresql]commands} [pypostgresql] commands = {[testenv]commands} -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D postgres://postgres:@localhost/sqlobject_test?driver=pypostgresql&charset=utf-8&debug=1 tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py34-pypostgresql] commands = {[pypostgresql]commands} [testenv:py35-pypostgresql] commands = {[pypostgresql]commands} [testenv:py36-pypostgresql] commands = {[pypostgresql]commands} [postgres-pyodbc] commands = {[testenv]commands} {envpython} -c "import pyodbc; print(pyodbc.drivers())" -dropdb -U 
postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D postgres://postgres:@localhost/sqlobject_test?driver=pyodbc&odbcdrv=PostgreSQL%20ANSI&charset=utf-8&debug=1 tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-pyodbc] commands = {[postgres-pyodbc]commands} [testenv:py34-postgres-pyodbc] commands = {[postgres-pyodbc]commands} [testenv:py35-postgres-pyodbc] commands = {[postgres-pyodbc]commands} [testenv:py36-postgres-pyodbc] commands = {[postgres-pyodbc]commands} [postgres-pypyodbc] commands = {[testenv]commands} -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D postgres://postgres:@localhost/sqlobject_test?driver=pypyodbc&odbcdrv=PostgreSQL%20ANSI&charset=utf-8&debug=1 tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-pypyodbc] commands = {[postgres-pypyodbc]commands} [testenv:py34-postgres-pypyodbc] commands = {[postgres-pypyodbc]commands} [testenv:py35-postgres-pypyodbc] commands = {[postgres-pypyodbc]commands} [testenv:py36-postgres-pypyodbc] commands = {[postgres-pypyodbc]commands} # SQLite test environments [sqlite] commands = {[testenv]commands} -rm /tmp/sqlobject_test.sqdb pytest --cov=sqlobject -D sqlite:///tmp/sqlobject_test.sqdb?debug=1 rm /tmp/sqlobject_test.sqdb [testenv:py27-sqlite] commands = {[sqlite]commands} [testenv:py34-sqlite] commands = {[sqlite]commands} [testenv:py35-sqlite] commands = {[sqlite]commands} [testenv:py36-sqlite] commands = {[sqlite]commands} [sqlite-memory] commands = {[testenv]commands} pytest --cov=sqlobject -D sqlite:/:memory:?debug=1 [testenv:py27-sqlite-memory] commands = {[sqlite-memory]commands} [testenv:py34-sqlite-memory] commands = {[sqlite-memory]commands} [testenv:py35-sqlite-memory] commands = {[sqlite-memory]commands} [testenv:py36-sqlite-memory] commands = {[sqlite-memory]commands} # Firebird database test environments [fdb] commands = {[testenv]commands} sudo rm -f /tmp/test.fdb isql-fb -u test -p test -i /var/lib/firebird/create_test_db pytest --cov=sqlobject -D 'firebird://test:test@localhost/tmp/test.fdb?debug=1' sudo rm /tmp/test.fdb [testenv:py27-firebird-fdb] commands = {[fdb]commands} [testenv:py34-firebird-fdb] commands = {[fdb]commands} [testenv:py35-firebird-fdb] commands = {[fdb]commands} [testenv:py36-firebird-fdb] commands = {[fdb]commands} [firebirdsql] commands = {[testenv]commands} sudo rm -f /tmp/test.fdb isql-fb -u test -p test -i /var/lib/firebird/create_test_db pytest --cov=sqlobject -D 'firebird://test:test@localhost:3050/tmp/test.fdb?driver=firebirdsql&charset=utf8&debug=1' sudo rm /tmp/test.fdb [testenv:py27-firebirdsql] commands = {[firebirdsql]commands} [testenv:py34-firebirdsql] commands = {[firebirdsql]commands} [testenv:py35-firebirdsql] commands = {[firebirdsql]commands} [testenv:py36-firebirdsql] commands = {[firebirdsql]commands} # Special test environments [testenv:py27-flake8] changedir = ./ deps = flake8 commands = {[testenv]commands} flake8 . [testenv:py34-flake8] changedir = ./ deps = flake8 commands = {[testenv]commands} flake8 . # Windows testing [mssql-pyodbc-w32] commands = {envpython} -c "import pyodbc; print(pyodbc.drivers())" -sqlcmd -U sa -P "Password12!" -S .\SQL2014 -Q "DROP DATABASE sqlobject_test" sqlcmd -U sa -P "Password12!" 
-S .\SQL2014 -Q "CREATE DATABASE sqlobject_test" pytest --cov=sqlobject -D "mssql://sa:Password12!@localhost\SQL2014/sqlobject_test?driver=pyodbc&odbcdrv=SQL%20Server&debug=1" --timeout=30 sqlcmd -U sa -P "Password12!" -S .\SQL2014 -Q "DROP DATABASE sqlobject_test" [testenv:py27-mssql-pyodbc-w32] commands = {[mssql-pyodbc-w32]commands} [testenv:py34-mssql-pyodbc-w32] commands = {[mssql-pyodbc-w32]commands} [testenv:py35-mssql-pyodbc-w32] commands = {[mssql-pyodbc-w32]commands} [testenv:py36-mssql-pyodbc-w32] commands = {[mssql-pyodbc-w32]commands} [mysql-connector-w32] commands = {[testenv]commands} -mysql -u root "-pPassword12!" -e 'drop database sqlobject_test;' mysql -u root "-pPassword12!" -e 'create database sqlobject_test;' pytest --cov=sqlobject -D "mysql://root:Password12!@localhost/sqlobject_test?driver=connector&charset=utf8&debug=1" mysql -u root "-pPassword12!" -e 'drop database sqlobject_test;' [testenv:py27-mysql-connector-w32] commands = {[mysql-connector-w32]commands} [testenv:py34-mysql-connector-w32] commands = {[mysql-connector-w32]commands} [testenv:py35-mysql-connector-w32] commands = {[mysql-connector-w32]commands} [testenv:py36-mysql-connector-w32] commands = {[mysql-connector-w32]commands} [mysql-pyodbc-w32] commands = {[testenv]commands} {envpython} -c "import pyodbc; print(pyodbc.drivers())" -mysql -u root "-pPassword12!" -e 'drop database sqlobject_test;' mysql -u root "-pPassword12!" -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:Password12!@localhost/sqlobject_test?driver=pyodbc&odbcdrv=MySQL%20ODBC%205.3%20ANSI%20Driver&charset=utf8&debug=1 mysql -u root "-pPassword12!" -e 'drop database sqlobject_test;' [testenv:py27-mysql-pyodbc-w32] commands = {[mysql-pyodbc-w32]commands} [testenv:py34-mysql-pyodbc-w32] commands = {[mysql-pyodbc-w32]commands} [testenv:py35-mysql-pyodbc-w32] commands = {[mysql-pyodbc-w32]commands} [testenv:py36-mysql-pyodbc-w32] commands = {[mysql-pyodbc-w32]commands} [mysql-pypyodbc-w32] commands = {[testenv]commands} {envpython} -c "import pypyodbc; print(pypyodbc.drivers())" -mysql -u root "-pPassword12!" -e 'drop database sqlobject_test;' mysql -u root "-pPassword12!" -e 'create database sqlobject_test;' pytest --cov=sqlobject -D mysql://root:Password12!@localhost/sqlobject_test?driver=pypyodbc&odbcdrv=MySQL%20ODBC%205.3%20ANSI%20Driver&charset=utf8&debug=1 mysql -u root "-pPassword12!" 
-e 'drop database sqlobject_test;' [testenv:py27-mysql-pypyodbc-w32] commands = {[mysql-pypyodbc-w32]commands} [testenv:py34-mysql-pypyodbc-w32] commands = {[mysql-pypyodbc-w32]commands} [testenv:py35-mysql-pypyodbc-w32] commands = {[mysql-pypyodbc-w32]commands} [testenv:py36-mysql-pypyodbc-w32] commands = {[mysql-pypyodbc-w32]commands} [psycopg-w32] commands = {[testenv]commands} -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D "postgres://postgres:Password12!@localhost/sqlobject_test?driver=psycopg2&charset=utf-8&debug=1" tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-psycopg-w32] commands = {[psycopg-w32]commands} [testenv:py34-postgres-psycopg-w32] commands = {[psycopg-w32]commands} [testenv:py35-postgres-psycopg-w32] commands = {[psycopg-w32]commands} [testenv:py36-postgres-psycopg-w32] commands = {[psycopg-w32]commands} [postgres-pyodbc-w32] commands = {[testenv]commands} {envpython} -c "import pyodbc; print(pyodbc.drivers())" -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D "postgres://postgres:Password12!@localhost/sqlobject_test?driver=pyodbc&odbcdrv=PostgreSQL%20ANSI%28x64%29&charset=utf-8&debug=1" tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-pyodbc-w32] commands = {[postgres-pyodbc-w32]commands} [testenv:py34-postgres-pyodbc-w32] commands = {[postgres-pyodbc-w32]commands} [testenv:py35-postgres-pyodbc-w32] commands = {[postgres-pyodbc-w32]commands} [testenv:py36-postgres-pyodbc-w32] commands = {[postgres-pyodbc-w32]commands} [postgres-pypyodbc-w32] commands = {[testenv]commands} {envpython} -c "import pypyodbc; print(pypyodbc.drivers())" -dropdb -U postgres -w sqlobject_test createdb -U postgres -w sqlobject_test pytest --cov=sqlobject -D "postgres://postgres:Password12!@localhost/sqlobject_test?driver=odbc&odbcdrv=PostgreSQL%20ANSI%28x64%29&charset=utf-8&debug=1" tests include/tests inheritance/tests versioning/test dropdb -U postgres -w sqlobject_test [testenv:py27-postgres-pypyodbc-w32] commands = {[postgres-pypyodbc-w32]commands} [testenv:py34-postgres-pypyodbc-w32] commands = {[postgres-pypyodbc-w32]commands} [testenv:py35-postgres-pypyodbc-w32] commands = {[postgres-pypyodbc-w32]commands} [testenv:py36-postgres-pypyodbc-w32] commands = {[postgres-pypyodbc-w32]commands} [sqlite-w32] commands = {[testenv]commands} pytest --cov=sqlobject -D sqlite:/C:/projects/sqlobject/sqlobject_test.sqdb?debug=1 [testenv:py27-sqlite-w32] commands = {[sqlite-w32]commands} [testenv:py34-sqlite-w32] commands = {[sqlite-w32]commands} [testenv:py35-sqlite-w32] commands = {[sqlite-w32]commands} [testenv:py36-sqlite-w32] commands = {[sqlite-w32]commands} [sqlite-memory-w32] commands = {[testenv]commands} pytest --cov=sqlobject -D sqlite:/:memory:?debug=1 [testenv:py27-sqlite-memory-w32] commands = {[sqlite-memory-w32]commands} [testenv:py34-sqlite-memory-w32] commands = {[sqlite-memory-w32]commands} [testenv:py35-sqlite-memory-w32] commands = {[sqlite-memory-w32]commands} [testenv:py36-sqlite-memory-w32] commands = {[sqlite-memory-w32]commands} SQLObject-3.4.0/sqlobject/0000755000175000017500000000000013141371614014643 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/cache.py0000644000175000017500000002673512471371300016272 0ustar phdphd00000000000000""" This implements the instance caching in SQLObject. Caching is relatively aggressive. 
All objects are retained so long as they are in memory, by keeping weak references to objects. We also keep other objects in a cache that doesn't allow them to be garbage collected (unless caching is turned off). """ import threading from weakref import ref class CacheFactory(object): """ CacheFactory caches object creation. Each object should be referenced by a single hashable ID (note tuples of hashable values are also hashable). """ def __init__(self, cullFrequency=100, cullFraction=2, cache=True): """ Every cullFrequency times that an item is retrieved from this cache, the cull method is called. The cull method then expires an arbitrary fraction of the cached objects. The idea is at no time will the cache be entirely emptied, placing a potentially high load at that moment, but everything object will have its time to go eventually. The fraction is given as an integer, and one in that many objects are expired (i.e., the default is 1/2 of objects are expired). By setting cache to False, items won't be cached. However, in all cases a weak reference is kept to created objects, and if the object hasn't been garbage collected it will be returned. """ self.cullFrequency = cullFrequency self.cullCount = 0 self.cullOffset = 0 self.cullFraction = cullFraction self.doCache = cache if self.doCache: self.cache = {} self.expiredCache = {} self.lock = threading.Lock() def tryGet(self, id): """ This returns None, or the object in cache. """ value = self.expiredCache.get(id) if value: # it's actually a weakref: return value() if not self.doCache: return None return self.cache.get(id) def get(self, id): """ This method can cause deadlocks! tryGet is safer This returns the object found in cache, or None. If None, then the cache will remain locked! This is so that the calling function can create the object in a threadsafe manner before releasing the lock. You should use this like (note that ``cache`` is actually a CacheSet object in this example):: obj = cache.get(some_id, my_class) if obj is None: try: obj = create_object(some_id) cache.put(some_id, my_class, obj) finally: cache.finishPut(cls) This method checks both the main cache (which retains references) and the 'expired' cache, which retains only weak references. """ if self.doCache: if self.cullCount > self.cullFrequency: # Two threads could hit the cull in a row, but # that's not so bad. At least by setting cullCount # back to zero right away we avoid this. The cull # method has a lock, so it's threadsafe. self.cullCount = 0 self.cull() else: self.cullCount = self.cullCount + 1 try: return self.cache[id] except KeyError: pass self.lock.acquire() try: val = self.cache[id] except KeyError: pass else: self.lock.release() return val try: val = self.expiredCache[id]() except KeyError: return None else: del self.expiredCache[id] if val is None: return None self.cache[id] = val self.lock.release() return val else: try: val = self.expiredCache[id]() if val is not None: return val except KeyError: pass self.lock.acquire() try: val = self.expiredCache[id]() except KeyError: return None else: if val is None: del self.expiredCache[id] return None self.lock.release() return val def put(self, id, obj): """ Puts an object into the cache. Should only be called after .get(), so that duplicate objects don't end up in the cache. """ if self.doCache: self.cache[id] = obj else: self.expiredCache[id] = ref(obj) def finishPut(self): """ Releases the lock that is retained when .get() is called and returns None. 
""" self.lock.release() def created(self, id, obj): """ Inserts and object into the cache. Should be used when no one else knows about the object yet, so there cannot be any object already in the cache. After a database INSERT is an example of this situation. """ if self.doCache: if self.cullCount > self.cullFrequency: # Two threads could hit the cull in a row, but # that's not so bad. At least by setting cullCount # back to zero right away we avoid this. The cull # method has a lock, so it's threadsafe. self.cullCount = 0 self.cull() else: self.cullCount = self.cullCount + 1 self.cache[id] = obj else: self.expiredCache[id] = ref(obj) def cull(self): """Runs through the cache and expires objects E.g., if ``cullFraction`` is 3, then every third object is moved to the 'expired' (aka weakref) cache. """ self.lock.acquire() try: # remove dead references from the expired cache keys = list(self.expiredCache.keys()) for key in keys: if self.expiredCache[key]() is None: self.expiredCache.pop(key, None) keys = list(self.cache.keys()) for i in range(self.cullOffset, len(keys), self.cullFraction): id = keys[i] # create a weakref, then remove from the cache obj = ref(self.cache[id]) del self.cache[id] # the object may have been gc'd when removed from the cache # above, no need to place in expiredCache if obj() is not None: self.expiredCache[id] = obj # This offset tries to balance out which objects we # expire, so no object will just hang out in the cache # forever. self.cullOffset = (self.cullOffset + 1) % self.cullFraction finally: self.lock.release() def clear(self): """ Removes everything from the cache. Warning! This can cause duplicate objects in memory. """ if self.doCache: self.cache.clear() self.expiredCache.clear() def expire(self, id): """ Expires a single object. Typically called after a delete. Doesn't even keep a weakref. (@@: bad name?) """ if not self.doCache: return self.lock.acquire() try: if id in self.cache: del self.cache[id] if id in self.expiredCache: del self.expiredCache[id] finally: self.lock.release() def expireAll(self): """ Expires all objects, moving them all into the expired/weakref cache. """ if not self.doCache: return self.lock.acquire() try: for key, value in self.cache.items(): self.expiredCache[key] = ref(value) self.cache = {} finally: self.lock.release() def allIDs(self): """ Returns the IDs of all objects in the cache. """ if self.doCache: all = list(self.cache.keys()) else: all = [] for id, value in self.expiredCache.items(): if value(): all.append(id) return all def getAll(self): """ Return all the objects in the cache. """ if self.doCache: all = list(self.cache.values()) else: all = [] for value in self.expiredCache.values(): if value(): all.append(value()) return all class CacheSet(object): """ A CacheSet is used to collect and maintain a series of caches. In SQLObject, there is one CacheSet per connection, and one Cache in the CacheSet for each class, since IDs are not unique across classes. It contains methods similar to Cache, but that take a ``cls`` argument. 
""" def __init__(self, *args, **kw): self.caches = {} self.args = args self.kw = kw def get(self, id, cls): try: return self.caches[cls.__name__].get(id) except KeyError: self.caches[cls.__name__] = CacheFactory(*self.args, **self.kw) return self.caches[cls.__name__].get(id) def put(self, id, cls, obj): self.caches[cls.__name__].put(id, obj) def finishPut(self, cls): self.caches[cls.__name__].finishPut() def created(self, id, cls, obj): try: self.caches[cls.__name__].created(id, obj) except KeyError: self.caches[cls.__name__] = CacheFactory(*self.args, **self.kw) self.caches[cls.__name__].created(id, obj) def expire(self, id, cls): try: self.caches[cls.__name__].expire(id) except KeyError: pass def clear(self, cls=None): if cls is None: for cache in self.caches.values(): cache.clear() elif cls.__name__ in self.caches: self.caches[cls.__name__].clear() def tryGet(self, id, cls): return self.tryGetByName(id, cls.__name__) def tryGetByName(self, id, clsname): try: return self.caches[clsname].tryGet(id) except KeyError: return None def allIDs(self, cls): try: self.caches[cls.__name__].allIDs() except KeyError: return [] def allSubCaches(self): return self.caches.values() def allSubCachesByClassNames(self): return self.caches def weakrefAll(self, cls=None): """ Move all objects in the cls (or if not given, then in all classes) to the weakref dictionary, where they can be collected. """ if cls is None: for cache in self.caches.values(): cache.expireAll() elif cls.__name__ in self.caches: self.caches[cls.__name__].expireAll() def getAll(self, cls=None): """ Returns all instances in the cache for the given class or all classes. """ if cls is None: results = [] for cache in self.caches.values(): results.extend(cache.getAll()) return results elif cls.__name__ in self.caches: return self.caches[cls.__name__].getAll() else: return [] SQLObject-3.4.0/sqlobject/mysql/0000755000175000017500000000000013141371614016010 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/mysql/__init__.py0000644000175000017500000000027412464733165020136 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . 
import mysqlconnection return mysqlconnection.MySQLConnection registerConnection(['mysql'], builder) SQLObject-3.4.0/sqlobject/mysql/mysqlconnection.py0000644000175000017500000004673513134764127021635 0ustar phdphd00000000000000import os from sqlobject import col, dberrors from sqlobject.compat import PY2 from sqlobject.dbconnection import DBAPI class ErrorMessage(str): def __new__(cls, e, append_msg=''): obj = str.__new__(cls, e.args[1] + append_msg) try: obj.code = int(e.args[0]) except ValueError: obj.code = e.args[0] obj.module = e.__module__ obj.exception = e.__class__.__name__ return obj mysql_Bin = None class MySQLConnection(DBAPI): supportTransactions = False dbName = 'mysql' schemes = [dbName] odbc_keywords = ('Server', 'Port', 'UID', 'Password', 'Database') def __init__(self, db, user, password='', host='localhost', port=0, **kw): drivers = kw.pop('driver', None) or 'mysqldb' for driver in drivers.split(','): driver = driver.strip() if not driver: continue try: if driver.lower() in ('mysqldb', 'pymysql'): if driver.lower() == 'pymysql': import pymysql pymysql.install_as_MySQLdb() import MySQLdb if driver.lower() == 'mysqldb': if MySQLdb.version_info[:3] < (1, 2, 2): raise ValueError( 'SQLObject requires MySQLdb 1.2.2 or later') import MySQLdb.constants.CR import MySQLdb.constants.ER self.module = MySQLdb if driver.lower() == 'mysqldb': self.CR_SERVER_GONE_ERROR = \ MySQLdb.constants.CR.SERVER_GONE_ERROR self.CR_SERVER_LOST = \ MySQLdb.constants.CR.SERVER_LOST else: self.CR_SERVER_GONE_ERROR = \ MySQLdb.constants.CR.CR_SERVER_GONE_ERROR self.CR_SERVER_LOST = \ MySQLdb.constants.CR.CR_SERVER_LOST self.ER_DUP_ENTRY = MySQLdb.constants.ER.DUP_ENTRY elif driver == 'connector': import mysql.connector self.module = mysql.connector self.CR_SERVER_GONE_ERROR = \ mysql.connector.errorcode.CR_SERVER_GONE_ERROR self.CR_SERVER_LOST = \ mysql.connector.errorcode.CR_SERVER_LOST self.ER_DUP_ENTRY = mysql.connector.errorcode.ER_DUP_ENTRY elif driver == 'oursql': import oursql self.module = oursql self.CR_SERVER_GONE_ERROR = \ oursql.errnos['CR_SERVER_GONE_ERROR'] self.CR_SERVER_LOST = oursql.errnos['CR_SERVER_LOST'] self.ER_DUP_ENTRY = oursql.errnos['ER_DUP_ENTRY'] elif driver == 'pyodbc': import pyodbc self.module = pyodbc elif driver == 'pypyodbc': import pypyodbc self.module = pypyodbc elif driver == 'odbc': try: import pyodbc except ImportError: import pypyodbc as pyodbc self.module = pyodbc else: raise ValueError( 'Unknown MySQL driver "%s", ' 'expected mysqldb, connector, ' 'oursql, pymysql, ' 'odbc, pyodbc or pypyodbc' % driver) except ImportError: pass else: break else: raise ImportError( 'Cannot find a MySQL driver, tried %s' % drivers) self.host = host self.port = port or 3306 self.db = db self.user = user self.password = password self.kw = {} for key in ("unix_socket", "init_command", "read_default_file", "read_default_group", "conv"): if key in kw: self.kw[key] = kw.pop(key) for key in ("connect_timeout", "compress", "named_pipe", "use_unicode", "client_flag", "local_infile"): if key in kw: self.kw[key] = int(kw.pop(key)) for key in ("ssl_key", "ssl_cert", "ssl_ca", "ssl_capath"): if key in kw: if "ssl" not in self.kw: self.kw["ssl"] = {} self.kw["ssl"][key[4:]] = kw.pop(key) if "charset" in kw: self.dbEncoding = self.kw["charset"] = kw.pop("charset") else: self.dbEncoding = None self.driver = driver if driver in ('odbc', 'pyodbc', 'pypyodbc'): self.make_odbc_conn_str(kw.pop('odbcdrv', 'MySQL ODBC 5.3 ANSI Driver'), db, host, port, user, password ) self.CR_SERVER_GONE_ERROR = 2006 
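            # The ODBC drivers do not expose the MySQL client constants,
            # so well-known values are hard-coded here: 2006 is
            # CR_SERVER_GONE_ERROR, 2013 is CR_SERVER_LOST, and '23000'
            # is the standard SQLSTATE class for integrity-constraint
            # violations such as duplicate entries.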
self.CR_SERVER_LOST = 2013 self.ER_DUP_ENTRY = '23000' global mysql_Bin if not PY2 and mysql_Bin is None: mysql_Bin = self.module.Binary self.module.Binary = lambda x: mysql_Bin(x).decode( 'ascii', errors='surrogateescape') self._server_version = None self._can_use_microseconds = None DBAPI.__init__(self, **kw) @classmethod def _connectionFromParams(cls, user, password, host, port, path, args): return cls(db=path.strip('/'), user=user or '', password=password or '', host=host or 'localhost', port=port, **args) def makeConnection(self): dbEncoding = self.dbEncoding if dbEncoding: if self.driver.lower() in ('mysqldb', 'pymysql'): from MySQLdb.connections import Connection if not hasattr(Connection, 'set_character_set'): # monkeypatch pre MySQLdb 1.2.1 def character_set_name(self): return dbEncoding + '_' + dbEncoding Connection.character_set_name = character_set_name if self.driver == 'connector': self.kw['consume_results'] = True try: if self.driver in ('odbc', 'pyodbc', 'pypyodbc'): self.debugWriter.write( "ODBC connect string: " + self.odbc_conn_str) conn = self.module.connect(self.odbc_conn_str) else: conn = self.module.connect( host=self.host, port=self.port, db=self.db, user=self.user, passwd=self.password, **self.kw) if self.driver != 'oursql': # Attempt to reconnect. This setting is persistent. conn.ping(True) except self.module.OperationalError as e: conninfo = ("; used connection string: " "host=%(host)s, port=%(port)s, " "db=%(db)s, user=%(user)s" % self.__dict__) raise dberrors.OperationalError(ErrorMessage(e, conninfo)) self._setAutoCommit(conn, bool(self.autoCommit)) if dbEncoding: if self.driver in ('odbc', 'pyodbc'): conn.setdecoding(self.module.SQL_CHAR, encoding=dbEncoding) conn.setdecoding(self.module.SQL_WCHAR, encoding=dbEncoding) if PY2: conn.setencoding(str, encoding=dbEncoding) conn.setencoding(unicode, encoding=dbEncoding) # noqa else: conn.setencoding(encoding=dbEncoding) elif hasattr(conn, 'set_character_set'): conn.set_character_set(dbEncoding) elif self.driver == 'oursql': conn.charset = dbEncoding elif hasattr(conn, 'query'): # works along with monkeypatching code above conn.query("SET NAMES %s" % dbEncoding) return conn def _setAutoCommit(self, conn, auto): if hasattr(conn, 'autocommit'): try: conn.autocommit(auto) except TypeError: # mysql-connector has autocommit as a property conn.autocommit = auto def _executeRetry(self, conn, cursor, query): if self.debug: self.printDebug(conn, query, 'QueryR') dbEncoding = self.dbEncoding if dbEncoding and not isinstance(query, bytes) and ( self.driver == 'connector'): query = query.encode(dbEncoding, 'surrogateescape') # When a server connection is lost and a query is attempted, most of # the time the query will raise a SERVER_LOST exception, then at the # second attempt to execute it, the mysql lib will reconnect and # succeed. However is a few cases, the first attempt raises the # SERVER_GONE exception, the second attempt the SERVER_LOST exception # and only the third succeeds. Thus the 3 in the loop count. # If it doesn't reconnect even after 3 attempts, while the database is # up and running, it is because a 5.0.3 (or newer) server is used # which no longer permits autoreconnects by default. In that case a # reconnect flag must be set when making the connection to indicate # that autoreconnecting is desired. In MySQLdb 1.2.2 or newer this is # done by calling ping(True) on the connection. 
for count in range(3): try: return cursor.execute(query) except self.module.OperationalError as e: if e.args[0] in (self.CR_SERVER_GONE_ERROR, self.CR_SERVER_LOST): if count == 2: raise dberrors.OperationalError(ErrorMessage(e)) if self.debug: self.printDebug(conn, str(e), 'ERROR') else: raise dberrors.OperationalError(ErrorMessage(e)) except self.module.IntegrityError as e: msg = ErrorMessage(e) if e.args[0] == self.ER_DUP_ENTRY: raise dberrors.DuplicateEntryError(msg) else: raise dberrors.IntegrityError(msg) except self.module.InternalError as e: raise dberrors.InternalError(ErrorMessage(e)) except self.module.ProgrammingError as e: if e.args[0] is not None: raise dberrors.ProgrammingError(ErrorMessage(e)) except self.module.DataError as e: raise dberrors.DataError(ErrorMessage(e)) except self.module.NotSupportedError as e: raise dberrors.NotSupportedError(ErrorMessage(e)) except self.module.DatabaseError as e: raise dberrors.DatabaseError(ErrorMessage(e)) except self.module.InterfaceError as e: raise dberrors.InterfaceError(ErrorMessage(e)) except self.module.Warning as e: raise Warning(ErrorMessage(e)) except self.module.Error as e: raise dberrors.Error(ErrorMessage(e)) def _queryInsertID(self, conn, soInstance, id, names, values): table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName c = conn.cursor() if id is not None: names = [idName] + names values = [id] + values q = self._insertSQL(table, names, values) if self.debug: self.printDebug(conn, q, 'QueryIns') self._executeRetry(conn, c, q) if id is None: try: id = c.lastrowid except AttributeError: try: id = c.insert_id except AttributeError: self._executeRetry(conn, c, "SELECT LAST_INSERT_ID();") id = c.fetchone()[0] else: id = c.insert_id() if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id @classmethod def _queryAddLimitOffset(cls, query, start, end): if not start: return "%s LIMIT %i" % (query, end) if not end: return "%s LIMIT %i, -1" % (query, start) return "%s LIMIT %i, %i" % (query, start, end - start) def createReferenceConstraint(self, soClass, col): return col.mysqlCreateReferenceConstraint() def createColumn(self, soClass, col): return col.mysqlCreateSQL(self) def createIndexSQL(self, soClass, index): return index.mysqlCreateIndexSQL(soClass) def createIDColumn(self, soClass): if soClass.sqlmeta.idType == str: return '%s TEXT PRIMARY KEY' % soClass.sqlmeta.idName return '%s INT PRIMARY KEY AUTO_INCREMENT' % soClass.sqlmeta.idName def joinSQLType(self, join): return 'INT NOT NULL' def tableExists(self, tableName): try: # Use DESCRIBE instead of SHOW TABLES because SHOW TABLES # assumes there is a default database selected # which is not always True (for an embedded application, e.g.) 
self.query('DESCRIBE %s' % (tableName)) return True except dberrors.ProgrammingError as e: if e.args[0].code in (1146, '42S02'): # ER_NO_SUCH_TABLE return False raise def addColumn(self, tableName, column): self.query('ALTER TABLE %s ADD COLUMN %s' % (tableName, column.mysqlCreateSQL(self))) def delColumn(self, sqlmeta, column): self.query('ALTER TABLE %s DROP COLUMN %s' % (sqlmeta.table, column.dbName)) def columnsFromSchema(self, tableName, soClass): colData = self.queryAll("SHOW COLUMNS FROM %s" % tableName) results = [] for field, t, nullAllowed, key, default, extra in colData: if field == soClass.sqlmeta.idName: continue colClass, kw = self.guessClass(t) if self.kw.get('use_unicode') and colClass is col.StringCol: colClass = col.UnicodeCol if self.dbEncoding: kw['dbEncoding'] = self.dbEncoding kw['name'] = soClass.sqlmeta.style.dbColumnToPythonAttr(field) kw['dbName'] = field # Since MySQL 5.0, 'NO' is returned in the NULL column # (SQLObject expected '') kw['notNone'] = (nullAllowed.upper() != 'YES' and True or False) if default and t.startswith('int'): kw['default'] = int(default) elif default and t.startswith('float'): kw['default'] = float(default) elif default == 'CURRENT_TIMESTAMP' and t == 'timestamp': kw['default'] = None elif default and colClass is col.BoolCol: kw['default'] = int(default) and True or False else: kw['default'] = default # @@ skip key... # @@ skip extra... results.append(colClass(**kw)) return results def guessClass(self, t): if t.startswith('int'): return col.IntCol, {} elif t.startswith('enum'): values = [] for i in t[5:-1].split(','): # take the enum() off and split values.append(i[1:-1]) # remove the surrounding \' return col.EnumCol, {'enumValues': values} elif t.startswith('double'): return col.FloatCol, {} elif t.startswith('varchar'): colType = col.StringCol if self.kw.get('use_unicode', False): colType = col.UnicodeCol if t.endswith('binary'): return colType, {'length': int(t[8:-8]), 'char_binary': True} else: return colType, {'length': int(t[8:-1])} elif t.startswith('char'): if t.endswith('binary'): return col.StringCol, {'length': int(t[5:-8]), 'varchar': False, 'char_binary': True} else: return col.StringCol, {'length': int(t[5:-1]), 'varchar': False} elif t.startswith('datetime'): return col.DateTimeCol, {} elif t.startswith('date'): return col.DateCol, {} elif t.startswith('time'): return col.TimeCol, {} elif t.startswith('timestamp'): return col.TimestampCol, {} elif t.startswith('bool'): return col.BoolCol, {} elif t.startswith('tinyblob'): return col.BLOBCol, {"length": 2 ** 8 - 1} elif t.startswith('tinytext'): return col.StringCol, {"length": 2 ** 8 - 1, "varchar": True} elif t.startswith('blob'): return col.BLOBCol, {"length": 2 ** 16 - 1} elif t.startswith('text'): return col.StringCol, {"length": 2 ** 16 - 1, "varchar": True} elif t.startswith('mediumblob'): return col.BLOBCol, {"length": 2 ** 24 - 1} elif t.startswith('mediumtext'): return col.StringCol, {"length": 2 ** 24 - 1, "varchar": True} elif t.startswith('longblob'): return col.BLOBCol, {"length": 2 ** 32} elif t.startswith('longtext'): return col.StringCol, {"length": 2 ** 32, "varchar": True} else: return col.Col, {} def listTables(self): return [v[0] for v in self.queryAll("SHOW TABLES")] def listDatabases(self): return [v[0] for v in self.queryAll("SHOW DATABASES")] def _createOrDropDatabase(self, op="CREATE"): self.query('%s DATABASE %s' % (op, self.db)) def createEmptyDatabase(self): self._createOrDropDatabase() def dropDatabase(self): 
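        """Drop the database this connection points at (issues DROP DATABASE)."""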
self._createOrDropDatabase(op="DROP") def server_version(self): if self._server_version is not None: return self._server_version try: server_version = self.queryOne("SELECT VERSION()")[0] server_version = server_version.split('-', 1) db_tag = "MySQL" if len(server_version) == 2: if "MariaDB" in server_version[1]: db_tag = "MariaDB" server_version = server_version[0] server_version = tuple(int(v) for v in server_version.split('.')) server_version = (server_version, db_tag) except: server_version = None # unknown self._server_version = server_version return server_version def can_use_microseconds(self): if self._can_use_microseconds is not None: return self._can_use_microseconds if os.environ.get('APPVEYOR') or os.environ.get('TRAVIS'): self._can_use_microseconds = False return False server_version = self.server_version() if server_version is None: return None server_version, db_tag = server_version if db_tag == "MariaDB": can_use_microseconds = (server_version >= (5, 3, 0)) else: # MySQL can_use_microseconds = (server_version >= (5, 6, 4)) self._can_use_microseconds = can_use_microseconds return can_use_microseconds SQLObject-3.4.0/sqlobject/declarative.py0000644000175000017500000001650712471371300017506 0ustar phdphd00000000000000""" Declarative objects. Declarative objects have a simple protocol: you can use classes in lieu of instances and they are equivalent, and any keyword arguments you give to the constructor will override those instance variables. (So if a class is received, we'll simply instantiate an instance with no arguments). You can provide a variable __unpackargs__ (a list of strings), and if the constructor is called with non-keyword arguments they will be interpreted as the given keyword arguments. If __unpackargs__ is ('*', name), then all the arguments will be put in a variable by that name. You can define a __classinit__(cls, new_attrs) method, which will be called when the class is created (including subclasses). Note: you can't use super() in __classinit__ because the class isn't bound to a name. As an analog to __classinit__, Declarative adds __instanceinit__ which is called with the same argument (new_attrs). This is like __init__, but after __unpackargs__ and other factors have been taken into account. If __mutableattributes__ is defined as a sequence of strings, these attributes will not be shared between superclasses and their subclasses. E.g., if you have a class variable that contains a list and you append to that list, changes to subclasses will effect superclasses unless you add the attribute here. Also defines classinstancemethod, which acts as either a class method or an instance method depending on where it is called. """ import copy from . import events from sqlobject.compat import with_metaclass import itertools counter = itertools.count() __all__ = ('classinstancemethod', 'DeclarativeMeta', 'Declarative') class classinstancemethod(object): """ Acts like a class method when called from a class, like an instance method when called by an instance. The method should take two arguments, 'self' and 'cls'; one of these will be None depending on how the method was called. 
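    A minimal illustration (the ``Example`` class here is hypothetical)::

        class Example(object):
            @classinstancemethod
            def describe(self, cls):
                if self is None:
                    return 'called on the class %s' % cls.__name__
                return 'called on an instance of %s' % cls.__name__

        Example.describe()    # -> 'called on the class Example'
        Example().describe()  # -> 'called on an instance of Example'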
""" def __init__(self, func): self.func = func def __get__(self, obj, type=None): return _methodwrapper(self.func, obj=obj, type=type) class _methodwrapper(object): def __init__(self, func, obj, type): self.func = func self.obj = obj self.type = type def __call__(self, *args, **kw): assert 'self' not in kw and 'cls' not in kw, ( "You cannot use 'self' or 'cls' arguments to a " "classinstancemethod") return self.func(*((self.obj, self.type) + args), **kw) def __repr__(self): if self.obj is None: return ('' % (self.type.__name__, self.func.__name__)) else: return ('' % (self.type.__name__, self.func.__name__, self.obj)) class DeclarativeMeta(type): def __new__(meta, class_name, bases, new_attrs): post_funcs = [] early_funcs = [] events.send(events.ClassCreateSignal, bases[0], class_name, bases, new_attrs, post_funcs, early_funcs) cls = type.__new__(meta, class_name, bases, new_attrs) for func in early_funcs: func(cls) if '__classinit__' in new_attrs: if hasattr(cls.__classinit__, '__func__'): cls.__classinit__ = staticmethod(cls.__classinit__.__func__) else: cls.__classinit__ = staticmethod(cls.__classinit__) cls.__classinit__(cls, new_attrs) for func in post_funcs: func(cls) return cls class Declarative(with_metaclass(DeclarativeMeta, object)): __unpackargs__ = () __mutableattributes__ = () __restrict_attributes__ = None def __classinit__(cls, new_attrs): cls.declarative_count = next(counter) for name in cls.__mutableattributes__: if name not in new_attrs: setattr(cls, copy.copy(getattr(cls, name))) def __instanceinit__(self, new_attrs): if self.__restrict_attributes__ is not None: for name in new_attrs: if name not in self.__restrict_attributes__: raise TypeError( '%s() got an unexpected keyword argument %r' % (self.__class__.__name__, name)) for name, value in new_attrs.items(): setattr(self, name, value) if 'declarative_count' not in new_attrs: self.declarative_count = next(counter) def __init__(self, *args, **kw): if self.__unpackargs__ and self.__unpackargs__[0] == '*': assert len(self.__unpackargs__) == 2, \ "When using __unpackargs__ = ('*', varname), " \ "you must only provide a single variable name " \ "(you gave %r)" % self.__unpackargs__ name = self.__unpackargs__[1] if name in kw: raise TypeError( "keyword parameter '%s' was given by position and name" % name) kw[name] = args else: if len(args) > len(self.__unpackargs__): raise TypeError( '%s() takes at most %i arguments (%i given)' % (self.__class__.__name__, len(self.__unpackargs__), len(args))) for name, arg in zip(self.__unpackargs__, args): if name in kw: raise TypeError( "keyword parameter '%s' was given by position and name" % name) kw[name] = arg if '__alsocopy' in kw: for name, value in kw['__alsocopy'].items(): if name not in kw: if name in self.__mutableattributes__: value = copy.copy(value) kw[name] = value del kw['__alsocopy'] self.__instanceinit__(kw) def __call__(self, *args, **kw): kw['__alsocopy'] = self.__dict__ return self.__class__(*args, **kw) @classinstancemethod def singleton(self, cls): if self: return self name = '_%s__singleton' % cls.__name__ if not hasattr(cls, name): setattr(cls, name, cls(declarative_count=cls.declarative_count)) return getattr(cls, name) @classinstancemethod def __repr__(self, cls): if self: name = '%s object' % self.__class__.__name__ v = self.__dict__.copy() else: name = '%s class' % cls.__name__ v = cls.__dict__.copy() if 'declarative_count' in v: name = '%s %i' % (name, v['declarative_count']) del v['declarative_count'] # @@: simplifying repr: # v = {} names = v.keys() args = 
[] for n in self._repr_vars(names): args.append('%s=%r' % (n, v[n])) if not args: return '<%s>' % name else: return '<%s %s>' % (name, ' '.join(args)) @staticmethod def _repr_vars(dictNames): names = [n for n in dictNames if not n.startswith('_') and n != 'declarative_count'] names.sort() return names def setup_attributes(cls, new_attrs): for name, value in new_attrs.items(): if hasattr(value, '__addtoclass__'): value.__addtoclass__(cls, name) SQLObject-3.4.0/sqlobject/dbconnection.py0000644000175000017500000011411313103631236017660 0ustar phdphd00000000000000import atexit import inspect import sys import os import threading import types try: from urlparse import urlparse, parse_qsl from urllib import unquote, quote, urlencode except ImportError: from urllib.parse import urlparse, parse_qsl, unquote, quote, urlencode import warnings import weakref from .cache import CacheSet from . import classregistry from . import col from .converters import sqlrepr from . import sqlbuilder from .util.threadinglocal import local as threading_local from .compat import PY2, string_type, unicode_type warnings.filterwarnings("ignore", "DB-API extension cursor.lastrowid used") _connections = {} def _closeConnection(ref): conn = ref() if conn is not None: conn.close() class ConsoleWriter: def __init__(self, connection, loglevel): # loglevel: None or empty string for stdout; or 'stderr' self.loglevel = loglevel or "stdout" self.dbEncoding = getattr(connection, "dbEncoding", None) or "ascii" def write(self, text): logfile = getattr(sys, self.loglevel) if PY2 and isinstance(text, unicode_type): try: text = text.encode(self.dbEncoding) except UnicodeEncodeError: text = repr(text)[2:-1] # Remove u'...' from the repr logfile.write(text + '\n') class LogWriter: def __init__(self, connection, logger, loglevel): self.logger = logger self.loglevel = loglevel self.logmethod = getattr(logger, loglevel) def write(self, text): self.logmethod(text) def makeDebugWriter(connection, loggerName, loglevel): if not loggerName: return ConsoleWriter(connection, loglevel) import logging logger = logging.getLogger(loggerName) return LogWriter(connection, logger, loglevel) class Boolean(object): """A bool class that also understands some special string keywords Understands: yes/no, true/false, on/off, 1/0, case ignored. 
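    For example::

        Boolean('yes')   # -> True
        Boolean('OFF')   # -> False
        Boolean('0')     # -> False
        Boolean(2)       # -> True (unrecognised values fall back to bool())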
""" _keywords = {'1': True, 'yes': True, 'true': True, 'on': True, '0': False, 'no': False, 'false': False, 'off': False} def __new__(cls, value): try: return Boolean._keywords[value.lower()] except (AttributeError, KeyError): return bool(value) class DBConnection: def __init__(self, name=None, debug=False, debugOutput=False, cache=True, style=None, autoCommit=True, debugThreading=False, registry=None, logger=None, loglevel=None): self.name = name self.debug = Boolean(debug) self.debugOutput = Boolean(debugOutput) self.debugThreading = Boolean(debugThreading) self.debugWriter = makeDebugWriter(self, logger, loglevel) self.doCache = Boolean(cache) self.cache = CacheSet(cache=self.doCache) self.style = style self._connectionNumbers = {} self._connectionCount = 1 self.autoCommit = Boolean(autoCommit) self.registry = registry or None classregistry.registry(self.registry).addCallback(self.soClassAdded) registerConnectionInstance(self) atexit.register(_closeConnection, weakref.ref(self)) def oldUri(self): auth = getattr(self, 'user', '') or '' if auth: if self.password: auth = auth + ':' + self.password auth = auth + '@' else: assert not getattr(self, 'password', None), ( 'URIs cannot express passwords without usernames') uri = '%s://%s' % (self.dbName, auth) if self.host: uri += self.host if self.port: uri += ':%d' % self.port uri += '/' db = self.db if db.startswith('/'): db = db[1:] return uri + db def uri(self): auth = getattr(self, 'user', '') or '' if auth: auth = quote(auth) if self.password: auth = auth + ':' + quote(self.password) auth = auth + '@' else: assert not getattr(self, 'password', None), ( 'URIs cannot express passwords without usernames') uri = '%s://%s' % (self.dbName, auth) if self.host: uri += self.host if self.port: uri += ':%d' % self.port uri += '/' db = self.db if db.startswith('/'): db = db[1:] return uri + quote(db) @classmethod def connectionFromOldURI(cls, uri): return cls._connectionFromParams(*cls._parseOldURI(uri)) @classmethod def connectionFromURI(cls, uri): return cls._connectionFromParams(*cls._parseURI(uri)) @staticmethod def _parseOldURI(uri): schema, rest = uri.split(':', 1) assert rest.startswith('/'), \ "URIs must start with scheme:/ -- " \ "you did not include a / (in %r)" % rest if rest.startswith('/') and not rest.startswith('//'): host = None rest = rest[1:] elif rest.startswith('///'): host = None rest = rest[3:] else: rest = rest[2:] if rest.find('/') == -1: host = rest rest = '' else: host, rest = rest.split('/', 1) if host and host.find('@') != -1: user, host = host.rsplit('@', 1) if user.find(':') != -1: user, password = user.split(':', 1) else: password = None else: user = password = None if host and host.find(':') != -1: _host, port = host.split(':') try: port = int(port) except ValueError: raise ValueError("port must be integer, " "got '%s' instead" % port) if not (1 <= port <= 65535): raise ValueError("port must be integer in the range 1-65535, " "got '%d' instead" % port) host = _host else: port = None path = '/' + rest if os.name == 'nt': if (len(rest) > 1) and (rest[1] == '|'): path = "%s:%s" % (rest[0], rest[2:]) args = {} if path.find('?') != -1: path, arglist = path.split('?', 1) arglist = arglist.split('&') for single in arglist: argname, argvalue = single.split('=', 1) argvalue = unquote(argvalue) args[argname] = argvalue return user, password, host, port, path, args @staticmethod def _parseURI(uri): parsed = urlparse(uri) host, path = parsed.hostname, parsed.path user, password, port = None, None, None if parsed.username: user = 
unquote(parsed.username) if parsed.password: password = unquote(parsed.password) if parsed.port: port = int(parsed.port) path = unquote(path) if (os.name == 'nt') and (len(path) > 2): # Preserve backward compatibility with URIs like /C|/path; # replace '|' by ':' if path[2] == '|': path = "%s:%s" % (path[0:2], path[3:]) # Remove leading slash if (path[0] == '/') and (path[2] == ':'): path = path[1:] query = parsed.query # hash-tag / fragment is ignored args = {} if query: for name, value in parse_qsl(query): args[name] = value return user, password, host, port, path, args def soClassAdded(self, soClass): """ This is called for each new class; we use this opportunity to create an instance method that is bound to the class and this connection. """ name = soClass.__name__ assert not hasattr(self, name), ( "Connection %r already has an attribute with the name " "%r (and you just created the conflicting class %r)" % (self, name, soClass)) setattr(self, name, ConnWrapper(soClass, self)) def expireAll(self): """ Expire all instances of objects for this connection. """ cache_set = self.cache cache_set.weakrefAll() for item in cache_set.getAll(): item.expire() class ConnWrapper(object): """ This represents a SQLObject class that is bound to a specific connection (instances have a connection instance variable, but classes are global, so this is binds the connection variable lazily when a class method is accessed) """ # @@: methods that take connection arguments should be explicitly # marked up instead of the implicit use of a connection argument # and inspect.getargspec() def __init__(self, soClass, connection): self._soClass = soClass self._connection = connection def __call__(self, *args, **kw): kw['connection'] = self._connection return self._soClass(*args, **kw) def __getattr__(self, attr): meth = getattr(self._soClass, attr) if not isinstance(meth, types.MethodType): # We don't need to wrap non-methods return meth try: takes_conn = meth.takes_connection except AttributeError: args, varargs, varkw, defaults = inspect.getargspec(meth) assert not varkw and not varargs, ( "I cannot tell whether I must wrap this method, " "because it takes **kw: %r" % meth) takes_conn = 'connection' in args meth.__func__.takes_connection = takes_conn if not takes_conn: return meth return ConnMethodWrapper(meth, self._connection) class ConnMethodWrapper(object): def __init__(self, method, connection): self._method = method self._connection = connection def __getattr__(self, attr): return getattr(self._method, attr) def __call__(self, *args, **kw): kw['connection'] = self._connection return self._method(*args, **kw) def __repr__(self): return '' % ( self._method, self._connection) class DBAPI(DBConnection): """ Subclass must define a `makeConnection()` method, which returns a newly-created connection object. ``queryInsertID`` must also be defined. 
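    A purely illustrative sketch of such a subclass; ``somedb`` is a
    made-up DB-API module, not a real driver::

        class SomeDBConnection(DBAPI):
            dbName = 'somedb'
            schemes = [dbName]

            def __init__(self, db, **kw):
                import somedb
                self.module = somedb   # must be set before DBAPI.__init__
                self.db = db
                DBAPI.__init__(self, **kw)

            def makeConnection(self):
                return self.module.connect(database=self.db)

            def _queryInsertID(self, conn, soInstance, id, names, values):
                # run the INSERT and return the new primary key value
                ...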
""" dbName = None def __init__(self, **kw): self._pool = [] self._poolLock = threading.Lock() DBConnection.__init__(self, **kw) self._binaryType = type(self.module.Binary(b'')) def _runWithConnection(self, meth, *args): conn = self.getConnection() try: val = meth(conn, *args) finally: self.releaseConnection(conn) return val def getConnection(self): self._poolLock.acquire() try: if not self._pool: conn = self.makeConnection() self._connectionNumbers[id(conn)] = self._connectionCount self._connectionCount += 1 else: conn = self._pool.pop() if self.debug: s = 'ACQUIRE' if self._pool is not None: s += ' pool=[%s]' % ', '.join( [str(self._connectionNumbers[id(v)]) for v in self._pool]) self.printDebug(conn, s, 'Pool') return conn finally: self._poolLock.release() def releaseConnection(self, conn, explicit=False): if self.debug: if explicit: s = 'RELEASE (explicit)' else: s = 'RELEASE (implicit, autocommit=%s)' % self.autoCommit if self._pool is None: s += ' no pooling' else: s += ' pool=[%s]' % ', '.join( [str(self._connectionNumbers[id(v)]) for v in self._pool]) self.printDebug(conn, s, 'Pool') if self.supportTransactions and not explicit: if self.autoCommit == 'exception': if self.debug: self.printDebug(conn, 'auto/exception', 'ROLLBACK') conn.rollback() raise Exception('Object used outside of a transaction; ' 'implicit COMMIT or ROLLBACK not allowed') elif self.autoCommit: if self.debug: self.printDebug(conn, 'auto', 'COMMIT') if not getattr(conn, 'autocommit', False): conn.commit() else: if self.debug: self.printDebug(conn, 'auto', 'ROLLBACK') conn.rollback() if self._pool is not None: if conn not in self._pool: # @@: We can get duplicate releasing of connections with # the __del__ in Iteration (unfortunately, not sure why # it happens) self._pool.insert(0, conn) else: conn.close() def printDebug(self, conn, s, name, type='query'): if name == 'Pool' and self.debug != 'Pool': return if type == 'query': sep = ': ' else: sep = '->' s = repr(s) n = self._connectionNumbers[id(conn)] spaces = ' ' * (8 - len(name)) if self.debugThreading: threadName = threading.currentThread().getName() threadName = (':' + threadName + ' ' * (8 - len(threadName))) else: threadName = '' msg = '%(n)2i%(threadName)s/%(name)s%(spaces)s%(sep)s %(s)s' % locals() self.debugWriter.write(msg) def _executeRetry(self, conn, cursor, query): if self.debug: self.printDebug(conn, query, 'QueryR') return cursor.execute(query) def _query(self, conn, s): if self.debug: self.printDebug(conn, s, 'Query') self._executeRetry(conn, conn.cursor(), s) def query(self, s): return self._runWithConnection(self._query, s) def _queryAll(self, conn, s): if self.debug: self.printDebug(conn, s, 'QueryAll') c = conn.cursor() self._executeRetry(conn, c, s) value = c.fetchall() if self.debugOutput: self.printDebug(conn, value, 'QueryAll', 'result') return value def queryAll(self, s): return self._runWithConnection(self._queryAll, s) def _queryAllDescription(self, conn, s): """ Like queryAll, but returns (description, rows), where the description is cursor.description (which gives row types) """ if self.debug: self.printDebug(conn, s, 'QueryAllDesc') c = conn.cursor() self._executeRetry(conn, c, s) value = c.fetchall() if self.debugOutput: self.printDebug(conn, value, 'QueryAll', 'result') return c.description, value def queryAllDescription(self, s): return self._runWithConnection(self._queryAllDescription, s) def _queryOne(self, conn, s): if self.debug: self.printDebug(conn, s, 'QueryOne') c = conn.cursor() self._executeRetry(conn, c, s) value = 
c.fetchone() if self.debugOutput: self.printDebug(conn, value, 'QueryOne', 'result') return value def queryOne(self, s): return self._runWithConnection(self._queryOne, s) def _insertSQL(self, table, names, values): return ("INSERT INTO %s (%s) VALUES (%s)" % (table, ', '.join(names), ', '.join([self.sqlrepr(v) for v in values]))) def transaction(self): return Transaction(self) def queryInsertID(self, soInstance, id, names, values): return self._runWithConnection(self._queryInsertID, soInstance, id, names, values) def iterSelect(self, select): return select.IterationClass(self, self.getConnection(), select, keepConnection=False) def accumulateSelect(self, select, *expressions): """ Apply an accumulate function(s) (SUM, COUNT, MIN, AVG, MAX, etc...) to the select object. """ q = select.queryForSelect().newItems(expressions).\ unlimited().orderBy(None) q = self.sqlrepr(q) val = self.queryOne(q) if len(expressions) == 1: val = val[0] return val def queryForSelect(self, select): return self.sqlrepr(select.queryForSelect()) def _SO_createJoinTable(self, join): self.query(self._SO_createJoinTableSQL(join)) def _SO_createJoinTableSQL(self, join): return ('CREATE TABLE %s (\n%s %s,\n%s %s\n)' % (join.intermediateTable, join.joinColumn, self.joinSQLType(join), join.otherColumn, self.joinSQLType(join))) def _SO_dropJoinTable(self, join): self.query("DROP TABLE %s" % join.intermediateTable) def _SO_createIndex(self, soClass, index): self.query(self.createIndexSQL(soClass, index)) def createIndexSQL(self, soClass, index): assert 0, 'Implement in subclasses' def createTable(self, soClass): createSql, constraints = self.createTableSQL(soClass) self.query(createSql) return constraints def createReferenceConstraints(self, soClass): refConstraints = [self.createReferenceConstraint(soClass, column) for column in soClass.sqlmeta.columnList if isinstance(column, col.SOForeignKey)] refConstraintDefs = [constraint for constraint in refConstraints if constraint] return refConstraintDefs def createSQL(self, soClass): tableCreateSQLs = getattr(soClass.sqlmeta, 'createSQL', None) if tableCreateSQLs: assert isinstance(tableCreateSQLs, (str, list, dict, tuple)), ( '%s.sqlmeta.createSQL must be a str, list, dict or tuple.' % (soClass.__name__)) if isinstance(tableCreateSQLs, dict): tableCreateSQLs = tableCreateSQLs.get( soClass._connection.dbName, []) if isinstance(tableCreateSQLs, str): tableCreateSQLs = [tableCreateSQLs] if isinstance(tableCreateSQLs, tuple): tableCreateSQLs = list(tableCreateSQLs) assert isinstance(tableCreateSQLs, list), ( 'Unable to create a list from %s.sqlmeta.createSQL' % (soClass.__name__)) return tableCreateSQLs or [] def createTableSQL(self, soClass): constraints = self.createReferenceConstraints(soClass) extraSQL = self.createSQL(soClass) createSql = ('CREATE TABLE %s (\n%s\n)' % (soClass.sqlmeta.table, self.createColumns(soClass))) return createSql, constraints + extraSQL def createColumns(self, soClass): columnDefs = [self.createIDColumn(soClass)] \ + [self.createColumn(soClass, col) for col in soClass.sqlmeta.columnList] return ",\n".join([" %s" % c for c in columnDefs]) def createReferenceConstraint(self, soClass, col): assert 0, "Implement in subclasses" def createColumn(self, soClass, col): assert 0, "Implement in subclasses" def dropTable(self, tableName, cascade=False): self.query("DROP TABLE %s" % tableName) def clearTable(self, tableName): # 3-03 @@: Should this have a WHERE 1 = 1 or similar # clause? 
In some configurations without the WHERE clause # the query won't go through, but maybe we shouldn't override # that. self.query("DELETE FROM %s" % tableName) def createBinary(self, value): """ Create a binary object wrapper for the given database. """ # Default is Binary() function from the connection driver. return self.module.Binary(value) # The _SO_* series of methods are sorts of "friend" methods # with SQLObject. They grab values from the SQLObject instances # or classes freely, but keep the SQLObject class from accessing # the database directly. This way no SQL is actually created # in the SQLObject class. def _SO_update(self, so, values): self.query("UPDATE %s SET %s WHERE %s = (%s)" % (so.sqlmeta.table, ", ".join(["%s = (%s)" % (dbName, self.sqlrepr(value)) for dbName, value in values]), so.sqlmeta.idName, self.sqlrepr(so.id))) def _SO_selectOne(self, so, columnNames): return self._SO_selectOneAlt(so, columnNames, so.q.id == so.id) def _SO_selectOneAlt(self, so, columnNames, condition): if columnNames: columns = [isinstance(x, string_type) and sqlbuilder.SQLConstant(x) or x for x in columnNames] else: columns = None return self.queryOne(self.sqlrepr(sqlbuilder.Select( columns, staticTables=[so.sqlmeta.table], clause=condition))) def _SO_delete(self, so): self.query("DELETE FROM %s WHERE %s = (%s)" % (so.sqlmeta.table, so.sqlmeta.idName, self.sqlrepr(so.id))) def _SO_selectJoin(self, soClass, column, value): return self.queryAll("SELECT %s FROM %s WHERE %s = (%s)" % (soClass.sqlmeta.idName, soClass.sqlmeta.table, column, self.sqlrepr(value))) def _SO_intermediateJoin(self, table, getColumn, joinColumn, value): return self.queryAll("SELECT %s FROM %s WHERE %s = (%s)" % (getColumn, table, joinColumn, self.sqlrepr(value))) def _SO_intermediateDelete(self, table, firstColumn, firstValue, secondColumn, secondValue): self.query("DELETE FROM %s WHERE %s = (%s) AND %s = (%s)" % (table, firstColumn, self.sqlrepr(firstValue), secondColumn, self.sqlrepr(secondValue))) def _SO_intermediateInsert(self, table, firstColumn, firstValue, secondColumn, secondValue): self.query("INSERT INTO %s (%s, %s) VALUES (%s, %s)" % (table, firstColumn, secondColumn, self.sqlrepr(firstValue), self.sqlrepr(secondValue))) def _SO_columnClause(self, soClass, kw): from . import main ops = {None: "IS"} data = [] if 'id' in kw: data.append((soClass.sqlmeta.idName, kw.pop('id'))) for soColumn in soClass.sqlmeta.columnList: key = soColumn.name if key in kw: val = kw.pop(key) if soColumn.from_python: val = soColumn.from_python( val, sqlbuilder.SQLObjectState(soClass, connection=self)) data.append((soColumn.dbName, val)) elif soColumn.foreignName in kw: obj = kw.pop(soColumn.foreignName) if isinstance(obj, main.SQLObject): data.append((soColumn.dbName, obj.id)) else: data.append((soColumn.dbName, obj)) if kw: # pick the first key from kw to use to raise the error, raise TypeError("got an unexpected keyword argument(s): " "%r" % kw.keys()) if not data: return None return ' AND '.join( ['%s %s %s' % (dbName, ops.get(value, "="), self.sqlrepr(value)) for dbName, value in data]) def sqlrepr(self, v): return sqlrepr(v, self.dbName) def __del__(self): self.close() def close(self): if not hasattr(self, '_pool'): # Probably there was an exception while creating this # instance, so it is incomplete. 
return if not self._pool: return self._poolLock.acquire() try: if not self._pool: # _pool could be filled in a different thread return conns = self._pool[:] self._pool[:] = [] for conn in conns: try: conn.close() except self.module.Error: pass del conn del conns finally: self._poolLock.release() def createEmptyDatabase(self): """ Create an empty database. """ raise NotImplementedError def make_odbc_conn_str(self, odb_source, db, host=None, port=None, user=None, password=None): odbc_conn_parts = ['Driver={%s}' % odb_source] for odbc_keyword, value in \ zip(self.odbc_keywords, (host, port, user, password, db)): if value is not None: odbc_conn_parts.append('%s=%s' % (odbc_keyword, value)) self.odbc_conn_str = ';'.join(odbc_conn_parts) class Iteration(object): def __init__(self, dbconn, rawconn, select, keepConnection=False): self.dbconn = dbconn self.rawconn = rawconn self.select = select self.keepConnection = keepConnection self.cursor = rawconn.cursor() self.query = self.dbconn.queryForSelect(select) if dbconn.debug: dbconn.printDebug(rawconn, self.query, 'Select') self.dbconn._executeRetry(self.rawconn, self.cursor, self.query) def __iter__(self): return self def __next__(self): return self.next() def next(self): result = self.cursor.fetchone() if result is None: self._cleanup() raise StopIteration if result[0] is None: return None if self.select.ops.get('lazyColumns', 0): obj = self.select.sourceClass.get(result[0], connection=self.dbconn) return obj else: obj = self.select.sourceClass.get(result[0], selectResults=result[1:], connection=self.dbconn) return obj def _cleanup(self): if getattr(self, 'query', None) is None: # already cleaned up return self.query = None if not self.keepConnection: self.dbconn.releaseConnection(self.rawconn) self.dbconn = self.rawconn = self.select = self.cursor = None def __del__(self): self._cleanup() class Transaction(object): def __init__(self, dbConnection): # this is to skip __del__ in case of an exception in this __init__ self._obsolete = True self._dbConnection = dbConnection self._connection = dbConnection.getConnection() self._dbConnection._setAutoCommit(self._connection, 0) self.cache = CacheSet(cache=dbConnection.doCache) self._deletedCache = {} self._obsolete = False def assertActive(self): assert not self._obsolete, \ "This transaction has already gone through ROLLBACK; " \ "begin another transaction" def query(self, s): self.assertActive() return self._dbConnection._query(self._connection, s) def queryAll(self, s): self.assertActive() return self._dbConnection._queryAll(self._connection, s) def queryOne(self, s): self.assertActive() return self._dbConnection._queryOne(self._connection, s) def queryInsertID(self, soInstance, id, names, values): self.assertActive() return self._dbConnection._queryInsertID( self._connection, soInstance, id, names, values) def iterSelect(self, select): self.assertActive() # We can't keep the cursor open with results in a transaction, # because we might want to use the connection while we're # still iterating through the results. # @@: But would it be okay for psycopg, with threadsafety # level 2? 
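        # Exhausting the cursor up front with list() lets the
        # transaction's connection be reused for other queries while
        # the caller is still iterating over the results.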
return iter(list(select.IterationClass(self, self._connection, select, keepConnection=True))) def _SO_delete(self, inst): cls = inst.__class__.__name__ if cls not in self._deletedCache: self._deletedCache[cls] = [] self._deletedCache[cls].append(inst.id) if PY2: meth = types.MethodType(self._dbConnection._SO_delete.__func__, self, self.__class__) else: meth = types.MethodType(self._dbConnection._SO_delete.__func__, self) return meth(inst) def commit(self, close=False): if self._obsolete: # @@: is it okay to get extraneous commits? return if self._dbConnection.debug: self._dbConnection.printDebug(self._connection, '', 'COMMIT') self._connection.commit() subCaches = [(sub[0], sub[1].allIDs()) for sub in self.cache.allSubCachesByClassNames().items()] subCaches.extend([(x[0], x[1]) for x in self._deletedCache.items()]) for cls, ids in subCaches: for id in list(ids): inst = self._dbConnection.cache.tryGetByName(id, cls) if inst is not None: inst.expire() if close: self._makeObsolete() def rollback(self): if self._obsolete: # @@: is it okay to get extraneous rollbacks? return if self._dbConnection.debug: self._dbConnection.printDebug(self._connection, '', 'ROLLBACK') subCaches = [(sub, sub.allIDs()) for sub in self.cache.allSubCaches()] self._connection.rollback() for subCache, ids in subCaches: for id in list(ids): inst = subCache.tryGet(id) if inst is not None: inst.expire() self._makeObsolete() def __getattr__(self, attr): """ If nothing else works, let the parent connection handle it. Except with this transaction as 'self'. Poor man's acquisition? Bad programming? Okay, maybe. """ self.assertActive() attr = getattr(self._dbConnection, attr) try: func = attr.__func__ except AttributeError: if isinstance(attr, ConnWrapper): return ConnWrapper(attr._soClass, self) else: return attr else: if PY2: meth = types.MethodType(func, self, self.__class__) else: meth = types.MethodType(func, self) return meth def _makeObsolete(self): self._obsolete = True if self._dbConnection.autoCommit: self._dbConnection._setAutoCommit(self._connection, 1) self._dbConnection.releaseConnection(self._connection, explicit=True) self._connection = None self._deletedCache = {} def begin(self): # @@: Should we do this, or should begin() be a no-op when we're # not already obsolete? assert self._obsolete, \ "You cannot begin a new transaction session " \ "without rolling back this one" self._obsolete = False self._connection = self._dbConnection.getConnection() self._dbConnection._setAutoCommit(self._connection, 0) def __del__(self): if self._obsolete: return self.rollback() def close(self): raise TypeError('You cannot just close transaction - ' 'you should either call rollback(), commit() ' 'or commit(close=True) ' 'to close the underlying connection.') class ConnectionHub(object): """ This object serves as a hub for connections, so that you can pass in a ConnectionHub to a SQLObject subclass as though it was a connection, but actually bind a real database connection later. You can also bind connections on a per-thread basis. You must hang onto the original ConnectionHub instance, as you cannot retrieve it again from the class or instance. 
To use the hub, do something like:: hub = ConnectionHub() class MyClass(SQLObject): _connection = hub hub.threadConnection = connectionFromURI('...') """ def __init__(self): self.threadingLocal = threading_local() def __get__(self, obj, type=None): # I'm a little surprised we have to do this, but apparently # the object's private dictionary of attributes doesn't # override this descriptor. if (obj is not None) and '_connection' in obj.__dict__: return obj.__dict__['_connection'] return self.getConnection() def __set__(self, obj, value): obj.__dict__['_connection'] = value def getConnection(self): try: return self.threadingLocal.connection except AttributeError: try: return self.processConnection except AttributeError: raise AttributeError( "No connection has been defined for this thread " "or process") def doInTransaction(self, func, *args, **kw): """ This routine can be used to run a function in a transaction, rolling the transaction back if any exception is raised from that function, and committing otherwise. Use like:: sqlhub.doInTransaction(process_request, os.environ) This will run ``process_request(os.environ)``. The return value will be preserved. """ # @@: In Python 2.5, something usable with with: should also # be added. try: old_conn = self.threadingLocal.connection old_conn_is_threading = True except AttributeError: old_conn = self.processConnection old_conn_is_threading = False conn = old_conn.transaction() if old_conn_is_threading: self.threadConnection = conn else: self.processConnection = conn try: try: value = func(*args, **kw) except: conn.rollback() raise else: conn.commit(close=True) return value finally: if old_conn_is_threading: self.threadConnection = old_conn else: self.processConnection = old_conn def _set_threadConnection(self, value): self.threadingLocal.connection = value def _get_threadConnection(self): return self.threadingLocal.connection def _del_threadConnection(self): del self.threadingLocal.connection threadConnection = property(_get_threadConnection, _set_threadConnection, _del_threadConnection) class ConnectionURIOpener(object): def __init__(self): self.schemeBuilders = {} self.instanceNames = {} self.cachedURIs = {} def registerConnection(self, schemes, builder): for uriScheme in schemes: assert uriScheme not in self.schemeBuilders \ or self.schemeBuilders[uriScheme] is builder, \ "A driver has already been registered " \ "for the URI scheme %s" % uriScheme self.schemeBuilders[uriScheme] = builder def registerConnectionInstance(self, inst): if inst.name: assert (inst.name not in self.instanceNames or self.instanceNames[inst.name] is cls # noqa ), ("A instance has already been registered " "with the name %s" % inst.name) assert inst.name.find(':') == -1, \ "You cannot include ':' " \ "in your class names (%r)" % cls.name # noqa self.instanceNames[inst.name] = inst def connectionForURI(self, uri, oldUri=False, **args): if args: if '?' not in uri: uri += '?' + urlencode(args) else: uri += '&' + urlencode(args) if uri in self.cachedURIs: return self.cachedURIs[uri] if uri.find(':') != -1: scheme, rest = uri.split(':', 1) connCls = self.dbConnectionForScheme(scheme) if oldUri: conn = connCls.connectionFromOldURI(uri) else: conn = connCls.connectionFromURI(uri) else: # We just have a name, not a URI assert uri in self.instanceNames, \ "No SQLObject driver exists under the name %s" % uri conn = self.instanceNames[uri] # @@: Do we care if we clobber another connection? 
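        # Cache the connection under the full URI (including any extra
        # arguments appended above) so later calls with the same URI
        # get the same connection object back.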
self.cachedURIs[uri] = conn return conn def dbConnectionForScheme(self, scheme): assert scheme in self.schemeBuilders, ( "No SQLObject driver exists for %s (only %s)" % ( scheme, ', '.join(self.schemeBuilders.keys()))) return self.schemeBuilders[scheme]() TheURIOpener = ConnectionURIOpener() registerConnection = TheURIOpener.registerConnection registerConnectionInstance = TheURIOpener.registerConnectionInstance connectionForURI = TheURIOpener.connectionForURI dbConnectionForScheme = TheURIOpener.dbConnectionForScheme # Register DB URI schemas -- do import for side effects # noqa is a directive for flake8 to ignore seemingly unused imports from . import firebird # noqa from . import maxdb # noqa from . import mssql # noqa from . import mysql # noqa from . import postgres # noqa from . import rdbhost # noqa from . import sqlite # noqa from . import sybase # noqa SQLObject-3.4.0/sqlobject/postgres/0000755000175000017500000000000013141371614016511 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/postgres/__init__.py0000644000175000017500000000032512464733165020634 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . import pgconnection return pgconnection.PostgresConnection registerConnection(['postgres', 'postgresql', 'psycopg'], builder) SQLObject-3.4.0/sqlobject/postgres/pgconnection.py0000644000175000017500000005246113140202164021551 0ustar phdphd00000000000000import re from sqlobject import col from sqlobject import dberrors from sqlobject import sqlbuilder from sqlobject.compat import PY2 from sqlobject.converters import registerConverter, sqlrepr from sqlobject.dbconnection import DBAPI class ErrorMessage(str): def __new__(cls, e, append_msg=''): obj = str.__new__(cls, e.args[0] + append_msg) if e.__module__ == 'psycopg2': obj.code = getattr(e, 'pgcode', None) obj.error = getattr(e, 'pgerror', None) else: obj.code = getattr(e, 'code', None) obj.error = getattr(e, 'error', None) obj.module = e.__module__ obj.exception = e.__class__.__name__ return obj class PostgresConnection(DBAPI): supportTransactions = True dbName = 'postgres' schemes = [dbName, 'postgresql'] odbc_keywords = ('Server', 'Port', 'UID', 'Password', 'Database') def __init__(self, dsn=None, host=None, port=None, db=None, user=None, password=None, **kw): drivers = kw.pop('driver', None) or 'psycopg' for driver in drivers.split(','): driver = driver.strip() if not driver: continue try: if driver == 'psycopg2': import psycopg2 as psycopg self.module = psycopg elif driver == 'psycopg1': import psycopg self.module = psycopg elif driver == 'psycopg': try: import psycopg2 as psycopg except ImportError: import psycopg self.module = psycopg elif driver == 'pygresql': import pgdb self.module = pgdb elif driver in ('py-postgresql', 'pypostgresql'): from postgresql.driver import dbapi20 self.module = dbapi20 elif driver == 'pyodbc': import pyodbc self.module = pyodbc elif driver == 'pypyodbc': import pypyodbc self.module = pypyodbc elif driver == 'odbc': try: import pyodbc except ImportError: import pypyodbc as pyodbc self.module = pyodbc else: raise ValueError( 'Unknown PostgreSQL driver "%s", ' 'expected psycopg, psycopg2, psycopg1, ' 'pygresql, pypostgresql, ' 'odbc, pyodbc or pypyodbc' % driver) except ImportError: pass else: break else: raise ImportError( 'Cannot find a PostgreSQL driver, tried %s' % drivers) if driver.startswith('psycopg'): # Register a converter for psycopg Binary type. 
registerConverter(type(self.module.Binary('')), PsycoBinaryConverter) elif driver in ('pygresql', 'py-postgresql', 'pypostgresql'): registerConverter(type(self.module.Binary(b'')), PostgresBinaryConverter) elif driver in ('odbc', 'pyodbc', 'pypyodbc'): registerConverter(bytearray, OdbcBinaryConverter) self.db = db self.user = user self.password = password self.host = host self.port = port if driver in ('odbc', 'pyodbc', 'pypyodbc'): self.make_odbc_conn_str(kw.pop('odbcdrv', 'PostgreSQL ANSI'), db, host, port, user, password ) sslmode = kw.pop("sslmode", None) if sslmode: self.odbc_conn_str += ';sslmode=require' else: self.dsn_dict = dsn_dict = {} if host: dsn_dict["host"] = host if port: if driver == 'pygresql': dsn_dict["host"] = "%s:%d" % (host, port) elif driver.startswith('psycopg') and \ psycopg.__version__.split('.')[0] == '1': dsn_dict["port"] = str(port) else: dsn_dict["port"] = port if db: dsn_dict["database"] = db if user: dsn_dict["user"] = user if password: dsn_dict["password"] = password sslmode = kw.pop("sslmode", None) if sslmode: dsn_dict["sslmode"] = sslmode self.use_dsn = dsn is not None if dsn is None: if driver == 'pygresql': dsn = '' if host: dsn += host dsn += ':' if db: dsn += db dsn += ':' if user: dsn += user dsn += ':' if password: dsn += password else: dsn = [] if db: dsn.append('dbname=%s' % db) if user: dsn.append('user=%s' % user) if password: dsn.append('password=%s' % password) if host: dsn.append('host=%s' % host) if port: dsn.append('port=%d' % port) if sslmode: dsn.append('sslmode=%s' % sslmode) dsn = ' '.join(dsn) if driver in ('py-postgresql', 'pypostgresql'): if host and host.startswith('/'): dsn_dict["host"] = dsn_dict["port"] = None dsn_dict["unix"] = host else: if "unix" in dsn_dict: del dsn_dict["unix"] self.dsn = dsn self.driver = driver self.unicodeCols = kw.pop('unicodeCols', False) self.schema = kw.pop('schema', None) self.dbEncoding = kw.pop("charset", None) DBAPI.__init__(self, **kw) @classmethod def _connectionFromParams(cls, user, password, host, port, path, args): path = path.strip('/') if (host is None) and path.count('/'): # Non-default unix socket path_parts = path.split('/') host = '/' + '/'.join(path_parts[:-1]) path = path_parts[-1] return cls(host=host, port=port, db=path, user=user, password=password, **args) def _setAutoCommit(self, conn, auto): # psycopg2 does not have an autocommit method. 
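        # Try the older callable interface first; if autocommit is a
        # plain attribute, the call raises TypeError and we fall back
        # to assignment.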
if hasattr(conn, 'autocommit'): try: conn.autocommit(auto) except TypeError: conn.autocommit = auto def makeConnection(self): try: if self.driver in ('odbc', 'pyodbc', 'pypyodbc'): self.debugWriter.write( "ODBC connect string: " + self.odbc_conn_str) conn = self.module.connect(self.odbc_conn_str) elif self.use_dsn: conn = self.module.connect(self.dsn) else: conn = self.module.connect(**self.dsn_dict) except self.module.OperationalError as e: raise dberrors.OperationalError( ErrorMessage(e, "used connection string %r" % self.dsn)) # For printDebug in _executeRetry self._connectionNumbers[id(conn)] = self._connectionCount if self.autoCommit: self._setAutoCommit(conn, 1) c = conn.cursor() if self.schema: self._executeRetry(conn, c, "SET search_path TO " + self.schema) dbEncoding = self.dbEncoding if dbEncoding: if self.driver in ('odbc', 'pyodbc'): conn.setdecoding(self.module.SQL_CHAR, encoding=dbEncoding) conn.setdecoding(self.module.SQL_WCHAR, encoding=dbEncoding) if PY2: conn.setencoding(str, encoding=dbEncoding) conn.setencoding(unicode, encoding=dbEncoding) # noqa else: conn.setencoding(encoding=dbEncoding) self._executeRetry(conn, c, "SET client_encoding TO '%s'" % dbEncoding) return conn def _executeRetry(self, conn, cursor, query): if self.debug: self.printDebug(conn, query, 'QueryR') try: return cursor.execute(query) except self.module.OperationalError as e: raise dberrors.OperationalError(ErrorMessage(e)) except self.module.IntegrityError as e: msg = ErrorMessage(e) if getattr(e, 'code', -1) == '23505' or \ getattr(e, 'pgcode', -1) == '23505' or \ getattr(e, 'sqlstate', -1) == '23505' or \ e.args[0] == '23505': raise dberrors.DuplicateEntryError(msg) else: raise dberrors.IntegrityError(msg) except self.module.InternalError as e: raise dberrors.InternalError(ErrorMessage(e)) except self.module.ProgrammingError as e: msg = ErrorMessage(e) if (len(e.args) >= 2) and e.args[1] == '23505': raise dberrors.DuplicateEntryError(msg) else: raise dberrors.ProgrammingError(msg) except self.module.DataError as e: raise dberrors.DataError(ErrorMessage(e)) except self.module.NotSupportedError as e: raise dberrors.NotSupportedError(ErrorMessage(e)) except self.module.DatabaseError as e: msg = ErrorMessage(e) if 'duplicate key value violates unique constraint' in msg: raise dberrors.DuplicateEntryError(msg) else: raise dberrors.DatabaseError(msg) except self.module.InterfaceError as e: raise dberrors.InterfaceError(ErrorMessage(e)) except self.module.Warning as e: raise Warning(ErrorMessage(e)) except self.module.Error as e: raise dberrors.Error(ErrorMessage(e)) def _queryInsertID(self, conn, soInstance, id, names, values): table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName c = conn.cursor() if id is None and self.driver in ('py-postgresql', 'pypostgresql'): sequenceName = soInstance.sqlmeta.idSequence or \ '%s_%s_seq' % (table, idName) self._executeRetry(conn, c, "SELECT NEXTVAL('%s')" % sequenceName) id = c.fetchone()[0] if id is not None: names = [idName] + names values = [id] + values if names and values: q = self._insertSQL(table, names, values) else: q = "INSERT INTO %s DEFAULT VALUES" % table if id is None: q += " RETURNING " + idName if self.debug: self.printDebug(conn, q, 'QueryIns') self._executeRetry(conn, c, q) if id is None: id = c.fetchone()[0] if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id @classmethod def _queryAddLimitOffset(cls, query, start, end): if not start: return "%s LIMIT %i" % (query, end) if not end: return "%s OFFSET %i" % 
(query, start) return "%s LIMIT %i OFFSET %i" % (query, end - start, start) def createColumn(self, soClass, col): return col.postgresCreateSQL() def createReferenceConstraint(self, soClass, col): return col.postgresCreateReferenceConstraint() def createIndexSQL(self, soClass, index): return index.postgresCreateIndexSQL(soClass) def createIDColumn(self, soClass): key_type = {int: "SERIAL", str: "TEXT"}[soClass.sqlmeta.idType] return '%s %s PRIMARY KEY' % (soClass.sqlmeta.idName, key_type) def dropTable(self, tableName, cascade=False): self.query("DROP TABLE %s %s" % (tableName, cascade and 'CASCADE' or '')) def joinSQLType(self, join): return 'INT NOT NULL' def tableExists(self, tableName): result = self.queryOne( "SELECT COUNT(relname) FROM pg_class WHERE relname = %s" % self.sqlrepr(tableName)) return result[0] def addColumn(self, tableName, column): self.query('ALTER TABLE %s ADD COLUMN %s' % (tableName, column.postgresCreateSQL())) def delColumn(self, sqlmeta, column): self.query('ALTER TABLE %s DROP COLUMN %s' % (sqlmeta.table, column.dbName)) def columnsFromSchema(self, tableName, soClass): keyQuery = """ SELECT pg_catalog.pg_get_constraintdef(oid) as condef FROM pg_catalog.pg_constraint r WHERE r.conrelid = %s::regclass AND r.contype = 'f'""" colQuery = """ SELECT a.attname, pg_catalog.format_type(a.atttypid, a.atttypmod), a.attnotnull, (SELECT substring(d.adsrc for 128) FROM pg_catalog.pg_attrdef d WHERE d.adrelid=a.attrelid AND d.adnum = a.attnum) FROM pg_catalog.pg_attribute a WHERE a.attrelid =%s::regclass AND a.attnum > 0 AND NOT a.attisdropped ORDER BY a.attnum""" primaryKeyQuery = """ SELECT pg_index.indisprimary, pg_catalog.pg_get_indexdef(pg_index.indexrelid) FROM pg_catalog.pg_class c, pg_catalog.pg_class c2, pg_catalog.pg_index AS pg_index WHERE c.relname = %s AND c.oid = pg_index.indrelid AND pg_index.indexrelid = c2.oid AND pg_index.indisprimary """ otherKeyQuery = """ SELECT pg_index.indisprimary, pg_catalog.pg_get_indexdef(pg_index.indexrelid) FROM pg_catalog.pg_class c, pg_catalog.pg_class c2, pg_catalog.pg_index AS pg_index WHERE c.relname = %s AND c.oid = pg_index.indrelid AND pg_index.indexrelid = c2.oid AND NOT pg_index.indisprimary """ keyData = self.queryAll(keyQuery % self.sqlrepr(tableName)) keyRE = re.compile(r"\((.+)\) REFERENCES (.+)\(") keymap = {} for (condef,) in keyData: match = keyRE.search(condef) if match: field, reftable = match.groups() keymap[field] = reftable.capitalize() primaryData = self.queryAll(primaryKeyQuery % self.sqlrepr(tableName)) primaryRE = re.compile(r'CREATE .*? 
USING .* \((.+?)\)') primaryKey = None for isPrimary, indexDef in primaryData: match = primaryRE.search(indexDef) assert match, "Unparseable contraint definition: %r" % indexDef assert primaryKey is None, \ "Already found primary key (%r), " \ "then found: %r" % (primaryKey, indexDef) primaryKey = match.group(1) if primaryKey is None: # VIEWs don't have PRIMARY KEYs - accept help from user primaryKey = soClass.sqlmeta.idName assert primaryKey, "No primary key found in table %r" % tableName if primaryKey.startswith('"'): assert primaryKey.endswith('"') primaryKey = primaryKey[1:-1] otherData = self.queryAll(otherKeyQuery % self.sqlrepr(tableName)) otherRE = primaryRE otherKeys = [] for isPrimary, indexDef in otherData: match = otherRE.search(indexDef) assert match, "Unparseable constraint definition: %r" % indexDef otherKey = match.group(1) if otherKey.startswith('"'): assert otherKey.endswith('"') otherKey = otherKey[1:-1] otherKeys.append(otherKey) colData = self.queryAll(colQuery % self.sqlrepr(tableName)) results = [] if self.unicodeCols: client_encoding = self.queryOne("SHOW client_encoding")[0] for field, t, notnull, defaultstr in colData: if field == primaryKey: continue if field in keymap: colClass = col.ForeignKey kw = {'foreignKey': soClass.sqlmeta.style. dbTableToPythonClass(keymap[field])} name = soClass.sqlmeta.style.dbColumnToPythonAttr(field) if name.endswith('ID'): name = name[:-2] kw['name'] = name else: colClass, kw = self.guessClass(t) if self.unicodeCols and colClass is col.StringCol: colClass = col.UnicodeCol kw['dbEncoding'] = client_encoding kw['name'] = soClass.sqlmeta.style.dbColumnToPythonAttr(field) kw['dbName'] = field kw['notNone'] = notnull if defaultstr is not None: kw['default'] = self.defaultFromSchema(colClass, defaultstr) elif not notnull: kw['default'] = None if field in otherKeys: kw['alternateID'] = True results.append(colClass(**kw)) return results def guessClass(self, t): if t.count('point'): # poINT before INT return col.StringCol, {} elif t.count('int'): return col.IntCol, {} elif t.count('varying') or t.count('varchar'): if '(' in t: return col.StringCol, {'length': int(t[t.index('(') + 1:-1])} else: # varchar without length in Postgres means any length return col.StringCol, {} elif t.startswith('character('): return col.StringCol, {'length': int(t[t.index('(') + 1:-1]), 'varchar': False} elif t.count('float') or t.count('real') or t.count('double'): return col.FloatCol, {} elif t == 'text': return col.StringCol, {} elif t.startswith('timestamp'): return col.DateTimeCol, {} elif t.startswith('datetime'): return col.DateTimeCol, {} elif t.startswith('date'): return col.DateCol, {} elif t.startswith('bool'): return col.BoolCol, {} elif t.startswith('bytea'): return col.BLOBCol, {} else: return col.Col, {} def defaultFromSchema(self, colClass, defaultstr): """ If the default can be converted to a python constant, convert it. Otherwise return is as a sqlbuilder constant. """ if colClass == col.BoolCol: if defaultstr == 'false': return False elif defaultstr == 'true': return True return getattr(sqlbuilder.const, defaultstr) def _createOrDropDatabase(self, op="CREATE"): # We have to connect to *some* database, so we'll connect to # template1, which is a common open database. 
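        # The maintenance DSN is built by hand from the stored
        # connection parameters: pygresql uses the colon-separated
        # form, the other drivers use key=value pairs.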
# @@: This doesn't use self.use_dsn or self.dsn_dict if self.driver == 'pygresql': dsn = '%s:template1:%s:%s' % ( self.host or '', self.user or '', self.password or '') else: dsn = 'dbname=template1' if self.user: dsn += ' user=%s' % self.user if self.password: dsn += ' password=%s' % self.password if self.host: dsn += ' host=%s' % self.host conn = self.module.connect(dsn) cur = conn.cursor() # We must close the transaction with a commit so that # the CREATE DATABASE can work (which can't be in a transaction): try: self._executeRetry(conn, cur, 'COMMIT') self._executeRetry(conn, cur, '%s DATABASE %s' % (op, self.db)) finally: cur.close() conn.close() def listTables(self): return [v[0] for v in self.queryAll( """SELECT c.relname FROM pg_catalog.pg_class c LEFT JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace WHERE c.relkind IN ('r','') AND n.nspname NOT IN ('pg_catalog', 'pg_toast') AND pg_catalog.pg_table_is_visible(c.oid)""")] def listDatabases(self): return [v[0] for v in self.queryAll("SELECT datname FROM pg_database")] def createEmptyDatabase(self): self._createOrDropDatabase() def dropDatabase(self): self._createOrDropDatabase(op="DROP") # Converter for Binary types def PsycoBinaryConverter(value, db): assert db == 'postgres' return str(value) if PY2: def escape_bytea(value): return ''.join( ['\\' + (x[1:].rjust(3, '0')) for x in (oct(ord(c)) for c in value)] ) else: def escape_bytea(value): return ''.join( ['\\' + (x[2:].rjust(3, '0')) for x in (oct(ord(c)) for c in value.decode('latin1'))] ) def PostgresBinaryConverter(value, db): assert db == 'postgres' return sqlrepr(escape_bytea(value), db) def OdbcBinaryConverter(value, db): assert db == 'postgres' value = bytes(value) if not PY2: value = value.decode('latin1') return value SQLObject-3.4.0/sqlobject/converters.py0000644000175000017500000001454013014624070017407 0ustar phdphd00000000000000from array import array import datetime from decimal import Decimal import time import sys from .compat import PY2, buffer_type if PY2: from types import ClassType, InstanceType, NoneType else: # Use suitable aliases for now ClassType = type NoneType = type(None) # This is may not be what we want in all cases, but will do for now InstanceType = object try: from mx.DateTime import DateTimeType, DateTimeDeltaType except ImportError: try: from DateTime import DateTimeType, DateTimeDeltaType except ImportError: DateTimeType = None DateTimeDeltaType = None try: import Sybase NumericType = Sybase.NumericType except ImportError: NumericType = None ######################################## # Quoting ######################################## sqlStringReplace = [ ("'", "''"), ('\\', '\\\\'), ('\000', '\\0'), ('\b', '\\b'), ('\n', '\\n'), ('\r', '\\r'), ('\t', '\\t'), ] class ConverterRegistry: def __init__(self): self.basic = {} self.klass = {} def registerConverter(self, typ, func): if type(typ) is ClassType: self.klass[typ] = func else: self.basic[typ] = func if PY2: def lookupConverter(self, value, default=None): if type(value) is InstanceType: # lookup on klasses dict return self.klass.get(value.__class__, default) return self.basic.get(type(value), default) else: def lookupConverter(self, value, default=None): # python 3 doesn't have classic classes, so everything's # in self.klass due to comparison order in registerConvertor return self.klass.get(value.__class__, default) converters = ConverterRegistry() registerConverter = converters.registerConverter lookupConverter = converters.lookupConverter def StringLikeConverter(value, db): if 
isinstance(value, array): try: value = value.tounicode() except ValueError: value = value.tostring() elif isinstance(value, buffer_type): value = str(value) if db in ('mysql', 'postgres', 'rdbhost'): for orig, repl in sqlStringReplace: value = value.replace(orig, repl) elif db in ('sqlite', 'firebird', 'sybase', 'maxdb', 'mssql'): value = value.replace("'", "''") else: assert 0, "Database %s unknown" % db if db in ('postgres', 'rdbhost') and ('\\' in value): return "E'%s'" % value return "'%s'" % value registerConverter(str, StringLikeConverter) if PY2: # noqa for flake8 & python3 registerConverter(unicode, StringLikeConverter) # noqa registerConverter(array, StringLikeConverter) if PY2: registerConverter(bytearray, StringLikeConverter) registerConverter(buffer_type, StringLikeConverter) else: registerConverter(memoryview, StringLikeConverter) def IntConverter(value, db): return repr(int(value)) registerConverter(int, IntConverter) def LongConverter(value, db): return str(value) if sys.version_info[0] < 3: # noqa for flake8 & python3 registerConverter(long, LongConverter) # noqa if NumericType: registerConverter(NumericType, IntConverter) def BoolConverter(value, db): if db in ('postgres', 'rdbhost'): if value: return "'t'" else: return "'f'" else: if value: return '1' else: return '0' registerConverter(bool, BoolConverter) def FloatConverter(value, db): return repr(value) registerConverter(float, FloatConverter) if DateTimeType: def mxDateTimeConverter(value, db): return "'%s'" % value.strftime("%Y-%m-%d %H:%M:%S") registerConverter(DateTimeType, mxDateTimeConverter) def mxTimeConverter(value, db): return "'%s'" % value.strftime("%H:%M:%S") registerConverter(DateTimeDeltaType, mxTimeConverter) def NoneConverter(value, db): return "NULL" registerConverter(NoneType, NoneConverter) def SequenceConverter(value, db): return "(%s)" % ", ".join([sqlrepr(v, db) for v in value]) registerConverter(tuple, SequenceConverter) registerConverter(list, SequenceConverter) registerConverter(dict, SequenceConverter) registerConverter(set, SequenceConverter) registerConverter(frozenset, SequenceConverter) if hasattr(time, 'struct_time'): def StructTimeConverter(value, db): return time.strftime("'%Y-%m-%d %H:%M:%S'", value) registerConverter(time.struct_time, StructTimeConverter) def DateTimeConverter(value, db): return "'%04d-%02d-%02d %02d:%02d:%02d'" % ( value.year, value.month, value.day, value.hour, value.minute, value.second) def DateTimeConverterMS(value, db): return "'%04d-%02d-%02d %02d:%02d:%02d.%06d'" % ( value.year, value.month, value.day, value.hour, value.minute, value.second, value.microsecond) registerConverter(datetime.datetime, DateTimeConverterMS) def DateConverter(value, db): return "'%04d-%02d-%02d'" % (value.year, value.month, value.day) registerConverter(datetime.date, DateConverter) def TimeConverter(value, db): return "'%02d:%02d:%02d'" % (value.hour, value.minute, value.second) def TimeConverterMS(value, db): return "'%02d:%02d:%02d.%06d'" % (value.hour, value.minute, value.second, value.microsecond) registerConverter(datetime.time, TimeConverterMS) def DecimalConverter(value, db): return value.to_eng_string() registerConverter(Decimal, DecimalConverter) def TimedeltaConverter(value, db): return """INTERVAL '%d days %d seconds'""" % \ (value.days, value.seconds) registerConverter(datetime.timedelta, TimedeltaConverter) def sqlrepr(obj, db=None): try: reprFunc = obj.__sqlrepr__ except AttributeError: converter = lookupConverter(obj) if converter is None: raise ValueError("Unknown SQL 
builtin type: %s for %s" % (type(obj), repr(obj))) return converter(obj, db) else: return reprFunc(db) def quote_str(s, db): if db in ('postgres', 'rdbhost') and ('\\' in s): return "E'%s'" % s return "'%s'" % s def unquote_str(s): if s[:2].upper().startswith("E'") and s.endswith("'"): return s[2:-1] elif s.startswith("'") and s.endswith("'"): return s[1:-1] else: return s SQLObject-3.4.0/sqlobject/manager/0000755000175000017500000000000013141371614016255 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/manager/__init__.py0000644000175000017500000000000210223355362020356 0ustar phdphd00000000000000# SQLObject-3.4.0/sqlobject/manager/command.py0000755000175000017500000014145412503207731020260 0ustar phdphd00000000000000#!/usr/bin/env python from __future__ import print_function import fnmatch import optparse import os import re import sys import textwrap import time import warnings try: from paste.deploy import appconfig except ImportError: appconfig = None import sqlobject from sqlobject import col from sqlobject.classregistry import findClass from sqlobject.declarative import DeclarativeMeta from sqlobject.util import moduleloader from sqlobject.compat import PY2, with_metaclass, string_type # It's not very unsafe to use tempnam like we are doing: warnings.filterwarnings( 'ignore', 'tempnam is a potential security risk.*', RuntimeWarning, '.*command', 28) if PY2: # noqa for flake8 and python 3 input = raw_input # noqa def nowarning_tempnam(*args, **kw): return os.tempnam(*args, **kw) class SQLObjectVersionTable(sqlobject.SQLObject): """ This table is used to store information about the database and its version (used with record and update commands). """ class sqlmeta: table = 'sqlobject_db_version' version = col.StringCol() updated = col.DateTimeCol(default=col.DateTimeCol.now) def db_differences(soClass, conn): """ Returns the differences between a class and the table in a connection. Returns [] if no differences are found. This function does the best it can; it can miss many differences. """ # @@: Repeats a lot from CommandStatus.command, but it's hard # to actually factor out the display logic. Or I'm too lazy # to do so. 
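    # Illustrative use (the class name and URI below are hypothetical):
    #
    #     conn = sqlobject.connectionForURI('sqlite:/:memory:')
    #     for diff in db_differences(MyModelClass, conn):
    #         print(diff)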
diffs = [] if not conn.tableExists(soClass.sqlmeta.table): if soClass.sqlmeta.columns: diffs.append('Does not exist in database') else: try: columns = conn.columnsFromSchema(soClass.sqlmeta.table, soClass) except AttributeError: # Database does not support reading columns pass else: existing = {} for _col in columns: _col = _col.withClass(soClass) existing[_col.dbName] = _col missing = {} for _col in soClass.sqlmeta.columnList: if _col.dbName in existing: del existing[_col.dbName] else: missing[_col.dbName] = _col for _col in existing.values(): diffs.append('Database has extra column: %s' % _col.dbName) for _col in missing.values(): diffs.append('Database missing column: %s' % _col.dbName) return diffs class CommandRunner(object): def __init__(self): self.commands = {} self.command_aliases = {} def run(self, argv): invoked_as = argv[0] args = argv[1:] for i in range(len(args)): if not args[i].startswith('-'): # this must be a command command = args[i].lower() del args[i] break else: # no command found self.invalid('No COMMAND given (try "%s help")' % os.path.basename(invoked_as)) real_command = self.command_aliases.get(command, command) if real_command not in self.commands.keys(): self.invalid('COMMAND %s unknown' % command) runner = self.commands[real_command]( invoked_as, command, args, self) runner.run() def register(self, command): name = command.name self.commands[name] = command for alias in command.aliases: self.command_aliases[alias] = name def invalid(self, msg, code=2): print(msg) sys.exit(code) the_runner = CommandRunner() register = the_runner.register def standard_parser(connection=True, simulate=True, interactive=False, find_modules=True): parser = optparse.OptionParser() parser.add_option('-v', '--verbose', help='Be verbose (multiple times for more verbosity)', action='count', dest='verbose', default=0) if simulate: parser.add_option('-n', '--simulate', help="Don't actually do anything (implies -v)", action='store_true', dest='simulate') if connection: parser.add_option('-c', '--connection', help="The database connection URI", metavar='URI', dest='connection_uri') parser.add_option('-f', '--config-file', help="The Paste config file " "that contains the database URI (in the database key)", metavar="FILE", dest="config_file") if find_modules: parser.add_option('-m', '--module', help="Module in which to find SQLObject classes", action='append', metavar='MODULE', dest='modules', default=[]) parser.add_option('-p', '--package', help="Package to search for SQLObject classes", action="append", metavar="PACKAGE", dest="packages", default=[]) parser.add_option('--class', help="Select only named classes (wildcards allowed)", action="append", metavar="NAME", dest="class_matchers", default=[]) if interactive: parser.add_option('-i', '--interactive', help="Ask before doing anything " "(use twice to be more careful)", action="count", dest="interactive", default=0) parser.add_option('--egg', help="Select modules from the given Egg, " "using sqlobject.txt", action="append", metavar="EGG_SPEC", dest="eggs", default=[]) return parser class Command(with_metaclass(DeclarativeMeta, object)): min_args = 0 min_args_error = 'You must provide at least %(min_args)s arguments' max_args = 0 max_args_error = 'You must provide no more than %(max_args)s arguments' aliases = () required_args = [] description = None help = '' def orderClassesByDependencyLevel(self, classes): """ Return classes ordered by their depth in the class dependency tree (this is *not* the inheritance tree), from the top level 
(independant) classes to the deepest level. The dependency tree is defined by the foreign key relations. """ # @@: written as a self-contained function for now, to prevent # having to modify any core SQLObject component and namespace # contamination. # yemartin - 2006-08-08 class SQLObjectCircularReferenceError(Exception): pass def findReverseDependencies(cls): """ Return a list of classes that cls depends on. Note that "depends on" here mean "has a foreign key pointing to". """ depended = [] for _col in cls.sqlmeta.columnList: if _col.foreignKey: other = findClass(_col.foreignKey, _col.soClass.sqlmeta.registry) if (other is not cls) and (other not in depended): depended.append(other) return depended # Cache to save already calculated dependency levels. dependency_levels = {} def calculateDependencyLevel(cls, dependency_stack=[]): """ Recursively calculate the dependency level of cls, while using the dependency_stack to detect any circular reference. """ # Return value from the cache if already calculated if cls in dependency_levels: return dependency_levels[cls] # Check for circular references if cls in dependency_stack: dependency_stack.append(cls) raise SQLObjectCircularReferenceError( "Found a circular reference: %s " % (' --> '.join([x.__name__ for x in dependency_stack]))) dependency_stack.append(cls) # Recursively inspect dependent classes. depended = findReverseDependencies(cls) if depended: level = max([calculateDependencyLevel(x, dependency_stack) for x in depended]) + 1 else: level = 0 dependency_levels[cls] = level return level # Now simply calculate and sort by dependency levels: try: sorter = [] for cls in classes: level = calculateDependencyLevel(cls) sorter.append((level, cls)) sorter.sort(key=lambda x: x[0]) ordered_classes = [cls for _, cls in sorter] except SQLObjectCircularReferenceError as msg: # Failsafe: return the classes as-is if a circular reference # prevented the dependency levels to be calculated. print("Warning: a circular reference was detected in the " "model. Unable to sort the classes by dependency: they " "will be treated in alphabetic order. This may or may " "not work depending on your database backend. 
" "The error was:\n%s" % msg) return classes return ordered_classes def __classinit__(cls, new_args): if cls.__bases__ == (object,): # This abstract base class return register(cls) def __init__(self, invoked_as, command_name, args, runner): self.invoked_as = invoked_as self.command_name = command_name self.raw_args = args self.runner = runner def run(self): self.parser.usage = "%%prog [options]\n%s" % self.summary if self.help: help = textwrap.fill( self.help, int(os.environ.get('COLUMNS', 80)) - 4) self.parser.usage += '\n' + help self.parser.prog = '%s %s' % ( os.path.basename(self.invoked_as), self.command_name) if self.description: self.parser.description = self.description self.options, self.args = self.parser.parse_args(self.raw_args) if (getattr(self.options, 'simulate', False) and not self.options.verbose): self.options.verbose = 1 if self.min_args is not None and len(self.args) < self.min_args: self.runner.invalid( self.min_args_error % {'min_args': self.min_args, 'actual_args': len(self.args)}) if self.max_args is not None and len(self.args) > self.max_args: self.runner.invalid( self.max_args_error % {'max_args': self.max_args, 'actual_args': len(self.args)}) for var_name, option_name in self.required_args: if not getattr(self.options, var_name, None): self.runner.invalid( 'You must provide the option %s' % option_name) conf = self.config() if conf and conf.get('sys_path'): update_sys_path(conf['sys_path'], self.options.verbose) if conf and conf.get('database'): conn = sqlobject.connectionForURI(conf['database']) sqlobject.sqlhub.processConnection = conn for egg_spec in getattr(self.options, 'eggs', []): self.load_options_from_egg(egg_spec) self.command() def classes(self, require_connection=True, require_some=False): all = [] for module_name in self.options.modules: all.extend(self.classes_from_module( moduleloader.load_module(module_name))) for package_name in self.options.packages: all.extend(self.classes_from_package(package_name)) for egg_spec in self.options.eggs: all.extend(self.classes_from_egg(egg_spec)) if self.options.class_matchers: filtered = [] for soClass in all: name = soClass.__name__ for matcher in self.options.class_matchers: if fnmatch.fnmatch(name, matcher): filtered.append(soClass) break all = filtered conn = self.connection() if conn: for soClass in all: soClass._connection = conn else: missing = [] for soClass in all: try: if not soClass._connection: missing.append(soClass) except AttributeError: missing.append(soClass) if missing and require_connection: self.runner.invalid( 'These classes do not have connections set:\n * %s\n' 'You must indicate --connection=URI' % '\n * '.join([soClass.__name__ for soClass in missing])) if require_some and not all: print('No classes found!') if self.options.modules: print('Looked in modules: %s' % ', '.join(self.options.modules)) else: print('No modules specified') if self.options.packages: print('Looked in packages: %s' % ', '.join(self.options.packages)) else: print('No packages specified') if self.options.class_matchers: print('Matching class pattern: %s' % self.options.class_matches) if self.options.eggs: print('Looked in eggs: %s' % ', '.join(self.options.eggs)) else: print('No eggs specified') sys.exit(1) return self.orderClassesByDependencyLevel(all) def classes_from_module(self, module): all = [] if hasattr(module, 'soClasses'): for name_or_class in module.soClasses: if isinstance(name_or_class, str): name_or_class = getattr(module, name_or_class) all.append(name_or_class) else: for name in dir(module): value = 
getattr(module, name) if (isinstance(value, type) and issubclass(value, sqlobject.SQLObject) and value.__module__ == module.__name__): all.append(value) return all def connection(self): config = self.config() if config is not None: assert config.get('database'), ( "No database variable found in config file %s" % self.options.config_file) return sqlobject.connectionForURI(config['database']) elif getattr(self.options, 'connection_uri', None): return sqlobject.connectionForURI(self.options.connection_uri) else: return None def config(self): if not getattr(self.options, 'config_file', None): return None config_file = self.options.config_file if appconfig: if (not config_file.startswith('egg:') and not config_file.startswith('config:')): config_file = 'config:' + config_file return appconfig(config_file, relative_to=os.getcwd()) else: return self.ini_config(config_file) def ini_config(self, conf_fn): conf_section = 'main' if '#' in conf_fn: conf_fn, conf_section = conf_fn.split('#', 1) try: from ConfigParser import ConfigParser except ImportError: from configparser import ConfigParser p = ConfigParser() # Case-sensitive: p.optionxform = str if not os.path.exists(conf_fn): # Stupid RawConfigParser doesn't give an error for # non-existant files: raise OSError( "Config file %s does not exist" % self.options.config_file) p.read([conf_fn]) p._defaults.setdefault( 'here', os.path.dirname(os.path.abspath(conf_fn))) possible_sections = [] for section in p.sections(): name = section.strip().lower() if (conf_section == name or (conf_section == name.split(':')[-1] and name.split(':')[0] in ('app', 'application'))): possible_sections.append(section) if not possible_sections: raise OSError( "Config file %s does not have a section [%s] or [*:%s]" % (conf_fn, conf_section, conf_section)) if len(possible_sections) > 1: raise OSError( "Config file %s has multiple sections matching %s: %s" % (conf_fn, conf_section, ', '.join(possible_sections))) config = {} for op in p.options(possible_sections[0]): config[op] = p.get(possible_sections[0], op) return config def classes_from_package(self, package_name): all = [] package = moduleloader.load_module(package_name) package_dir = os.path.dirname(package.__file__) def find_classes_in_file(arg, dir_name, filenames): if dir_name.startswith('.svn'): return filenames = filter( lambda fname: fname.endswith('.py') and fname != '__init__.py', filenames) for fname in filenames: module_name = os.path.join(dir_name, fname) module_name = module_name[module_name.find(package_name):] module_name = module_name.replace(os.path.sep, '.')[:-3] try: module = moduleloader.load_module(module_name) except ImportError as err: if self.options.verbose: print('Could not import module "%s". 
' 'Error was : "%s"' % (module_name, err)) continue except Exception as exc: if self.options.verbose: print('Unknown exception while processing module ' '"%s" : "%s"' % (module_name, exc)) continue classes = self.classes_from_module(module) all.extend(classes) for dirpath, dirnames, filenames in os.walk(package_dir): find_classes_in_file(None, dirpath, dirnames + filenames) return all def classes_from_egg(self, egg_spec): modules = [] dist, conf = self.config_from_egg(egg_spec, warn_no_sqlobject=True) for mod in conf.get('db_module', '').split(','): mod = mod.strip() if not mod: continue if self.options.verbose: print('Looking in module %s' % mod) modules.extend(self.classes_from_module( moduleloader.load_module(mod))) return modules def load_options_from_egg(self, egg_spec): dist, conf = self.config_from_egg(egg_spec) if (hasattr(self.options, 'output_dir') and not self.options.output_dir and conf.get('history_dir')): dir = conf['history_dir'] dir = dir.replace('$base', dist.location) self.options.output_dir = dir def config_from_egg(self, egg_spec, warn_no_sqlobject=True): import pkg_resources dist = pkg_resources.get_distribution(egg_spec) if not dist.has_metadata('sqlobject.txt'): if warn_no_sqlobject: print('No sqlobject.txt in %s egg info' % egg_spec) return None, {} result = {} for line in dist.get_metadata_lines('sqlobject.txt'): line = line.strip() if not line or line.startswith('#'): continue name, value = line.split('=', 1) name = name.strip().lower() if name in result: print('Warning: %s appears more than once ' 'in sqlobject.txt' % name) result[name.strip().lower()] = value.strip() return dist, result def command(self): raise NotImplementedError def _get_prog_name(self): return os.path.basename(self.invoked_as) prog_name = property(_get_prog_name) def ask(self, prompt, safe=False, default=True): if self.options.interactive >= 2: default = safe if default: prompt += ' [Y/n]? ' else: prompt += ' [y/N]? ' while 1: response = input(prompt).strip() if not response.strip(): return default if response and response[0].lower() in ('y', 'n'): return response[0].lower() == 'y' print('Y or N please') def shorten_filename(self, fn): """ Shortens a filename to make it relative to the current directory (if it can). For display purposes. """ if fn.startswith(os.getcwd() + '/'): fn = fn[len(os.getcwd()) + 1:] return fn def open_editor(self, pretext, breaker=None, extension='.txt'): """ Open an editor with the given text. Return the new text, or None if no edits were made. If given, everything after `breaker` will be ignored. 
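        A hypothetical call (the values are illustrative only)::

            text = self.open_editor('notes\n', breaker='-- ignored --')
            if text is None:
                print('No edits were made')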
""" fn = nowarning_tempnam() + extension f = open(fn, 'w') f.write(pretext) f.close() print('$EDITOR %s' % fn) os.system('$EDITOR %s' % fn) f = open(fn, 'r') content = f.read() f.close() if breaker: content = content.split(breaker)[0] pretext = pretext.split(breaker)[0] if content == pretext or not content.strip(): return None return content class CommandSQL(Command): name = 'sql' summary = 'Show SQL CREATE statements' parser = standard_parser(simulate=False) def command(self): classes = self.classes() allConstraints = [] for cls in classes: if self.options.verbose >= 1: print('-- %s from %s' % ( cls.__name__, cls.__module__)) createSql, constraints = cls.createTableSQL() print(createSql.strip() + ';\n') allConstraints.append(constraints) for constraints in allConstraints: if constraints: for constraint in constraints: if constraint: print(constraint.strip() + ';\n') class CommandList(Command): name = 'list' summary = 'Show all SQLObject classes found' parser = standard_parser(simulate=False, connection=False) def command(self): if self.options.verbose >= 1: print('Classes found:') classes = self.classes(require_connection=False) for soClass in classes: print('%s.%s' % (soClass.__module__, soClass.__name__)) if self.options.verbose >= 1: print(' Table: %s' % soClass.sqlmeta.table) class CommandCreate(Command): name = 'create' summary = 'Create tables' parser = standard_parser(interactive=True) parser.add_option('--create-db', action='store_true', dest='create_db', help="Create the database") def command(self): v = self.options.verbose created = 0 existing = 0 dbs_created = [] constraints = {} for soClass in self.classes(require_some=True): if (self.options.create_db and soClass._connection not in dbs_created): if not self.options.simulate: try: soClass._connection.createEmptyDatabase() except soClass._connection.module.ProgrammingError as e: if str(e).find('already exists') != -1: print('Database already exists') else: raise else: print('(simulating; cannot create database)') dbs_created.append(soClass._connection) if soClass._connection not in constraints.keys(): constraints[soClass._connection] = [] exists = soClass._connection.tableExists(soClass.sqlmeta.table) if v >= 1: if exists: existing += 1 print('%s already exists.' % soClass.__name__) else: print('Creating %s' % soClass.__name__) if v >= 2: sql, extra = soClass.createTableSQL() print(sql) if (not self.options.simulate and not exists): if self.options.interactive: if self.ask('Create %s' % soClass.__name__): created += 1 tableConstraints = soClass.createTable( applyConstraints=False) if tableConstraints: constraints[soClass._connection].append( tableConstraints) else: print('Cancelled') else: created += 1 tableConstraints = soClass.createTable( applyConstraints=False) if tableConstraints: constraints[soClass._connection].append( tableConstraints) for connection in constraints.keys(): if v >= 2: print('Creating constraints') for constraintList in constraints[connection]: for constraint in constraintList: if constraint: connection.query(constraint) if v >= 1: print('%i tables created (%i already exist)' % ( created, existing)) class CommandDrop(Command): name = 'drop' summary = 'Drop tables' parser = standard_parser(interactive=True) def command(self): v = self.options.verbose dropped = 0 not_existing = 0 for soClass in reversed(self.classes()): exists = soClass._connection.tableExists(soClass.sqlmeta.table) if v >= 1: if exists: print('Dropping %s' % soClass.__name__) else: not_existing += 1 print('%s does not exist.' 
% soClass.__name__) if (not self.options.simulate and exists): if self.options.interactive: if self.ask('Drop %s' % soClass.__name__): dropped += 1 soClass.dropTable() else: print('Cancelled') else: dropped += 1 soClass.dropTable() if v >= 1: print('%i tables dropped (%i didn\'t exist)' % ( dropped, not_existing)) class CommandStatus(Command): name = 'status' summary = 'Show status of classes vs. database' help = ('This command checks the SQLObject definition and checks if ' 'the tables in the database match. It can always test for ' 'missing tables, and on some databases can test for the ' 'existance of other tables. Column types are not currently ' 'checked.') parser = standard_parser(simulate=False) def print_class(self, soClass): if self.printed: return self.printed = True print('Checking %s...' % soClass.__name__) def command(self): good = 0 bad = 0 missing_tables = 0 columnsFromSchema_warning = False for soClass in self.classes(require_some=True): conn = soClass._connection self.printed = False if self.options.verbose: self.print_class(soClass) if not conn.tableExists(soClass.sqlmeta.table): self.print_class(soClass) print(' Does not exist in database') missing_tables += 1 continue try: columns = conn.columnsFromSchema(soClass.sqlmeta.table, soClass) except AttributeError: if not columnsFromSchema_warning: print('Database does not support reading columns') columnsFromSchema_warning = True good += 1 continue except AssertionError as e: print('Cannot read db table %s: %s' % ( soClass.sqlmeta.table, e)) continue existing = {} for _col in columns: _col = _col.withClass(soClass) existing[_col.dbName] = _col missing = {} for _col in soClass.sqlmeta.columnList: if _col.dbName in existing: del existing[_col.dbName] else: missing[_col.dbName] = _col if existing: self.print_class(soClass) for _col in existing.values(): print(' Database has extra column: %s' % _col.dbName) if missing: self.print_class(soClass) for _col in missing.values(): print(' Database missing column: %s' % _col.dbName) if existing or missing: bad += 1 else: good += 1 if self.options.verbose: print('%i in sync; %i out of sync; %i not in database' % ( good, bad, missing_tables)) class CommandHelp(Command): name = 'help' summary = 'Show help' parser = optparse.OptionParser() max_args = 1 def command(self): if self.args: the_runner.run([self.invoked_as, self.args[0], '-h']) else: print('Available commands:') print(' (use "%s help COMMAND" or "%s COMMAND -h" ' % ( self.prog_name, self.prog_name)) print(' for more information)') items = sorted(the_runner.commands.items()) max_len = max([len(cn) for cn, c in items]) for command_name, command in items: print('%s:%s %s' % (command_name, ' ' * (max_len - len(command_name)), command.summary)) if command.aliases: print('%s (Aliases: %s)' % ( ' ' * max_len, ', '.join(command.aliases))) class CommandExecute(Command): name = 'execute' summary = 'Execute SQL statements' help = ('Runs SQL statements directly in the database, with no ' 'intervention. Useful when used with a configuration file. 
' 'Each argument is executed as an individual statement.') parser = standard_parser(find_modules=False) parser.add_option('--stdin', help="Read SQL from stdin " "(normally takes SQL from the command line)", dest="use_stdin", action="store_true") max_args = None def command(self): args = self.args if self.options.use_stdin: if self.options.verbose: print("Reading additional SQL from stdin " "(Ctrl-D or Ctrl-Z to finish)...") args.append(sys.stdin.read()) self.conn = self.connection().getConnection() self.cursor = self.conn.cursor() for sql in args: self.execute_sql(sql) def execute_sql(self, sql): if self.options.verbose: print(sql) try: self.cursor.execute(sql) except Exception as e: if not self.options.verbose: print(sql) print("****Error:") print(' ', e) return desc = self.cursor.description rows = self.cursor.fetchall() if self.options.verbose: if not self.cursor.rowcount: print("No rows accessed") else: print("%i rows accessed" % self.cursor.rowcount) if desc: for (name, type_code, display_size, internal_size, precision, scale, null_ok) in desc: sys.stdout.write("%s\t" % name) sys.stdout.write("\n") for row in rows: for _col in row: sys.stdout.write("%r\t" % _col) sys.stdout.write("\n") print() class CommandRecord(Command): name = 'record' summary = 'Record historical information about the database status' help = ('Record state of table definitions. The state of each ' 'table is written out to a separate file in a directory, ' 'and that directory forms a "version". A table is also ' 'added to your database (%s) that reflects the version the ' 'database is currently at. Use the upgrade command to ' 'sync databases with code.' % SQLObjectVersionTable.sqlmeta.table) parser = standard_parser() parser.add_option('--output-dir', help="Base directory for recorded definitions", dest="output_dir", metavar="DIR", default=None) parser.add_option('--no-db-record', help="Don't record version to database", dest="db_record", action="store_false", default=True) parser.add_option('--force-create', help="Create a new version even if appears to be " "identical to the last version", action="store_true", dest="force_create") parser.add_option('--name', help="The name to append to the version. The " "version should sort after previous versions (so " "any versions from the same day should come " "alphabetically before this version).", dest="version_name", metavar="NAME") parser.add_option('--force-db-version', help="Update the database version, and include no " "database information. 
This is for databases that " "were developed without any interaction with " "this tool, to create a 'beginning' revision.", metavar="VERSION_NAME", dest="force_db_version") parser.add_option('--edit', help="Open an editor for the upgrader in the last " "version (using $EDITOR).", action="store_true", dest="open_editor") version_regex = re.compile(r'^\d\d\d\d-\d\d-\d\d') def command(self): if self.options.force_db_version: self.command_force_db_version() return v = self.options.verbose sim = self.options.simulate classes = self.classes() if not classes: print("No classes found!") return output_dir = self.find_output_dir() version = os.path.basename(output_dir) print("Creating version %s" % version) conns = [] files = {} for cls in self.classes(): dbName = cls._connection.dbName if cls._connection not in conns: conns.append(cls._connection) fn = os.path.join(cls.__name__ + '_' + dbName + '.sql') if sim: continue create, constraints = cls.createTableSQL() if constraints: constraints = '\n-- Constraints:\n%s\n' % ( '\n'.join(constraints)) else: constraints = '' files[fn] = ''.join([ '-- Exported definition from %s\n' % time.strftime('%Y-%m-%dT%H:%M:%S'), '-- Class %s.%s\n' % (cls.__module__, cls.__name__), '-- Database: %s\n' % dbName, create.strip(), '\n', constraints]) last_version_dir = self.find_last_version() if last_version_dir and not self.options.force_create: if v > 1: print("Checking %s to see if it is current" % last_version_dir) files_copy = files.copy() for fn in os.listdir(last_version_dir): if not fn.endswith('.sql'): continue if fn not in files_copy: if v > 1: print("Missing file %s" % fn) break f = open(os.path.join(last_version_dir, fn), 'r') content = f.read() f.close() if (self.strip_comments(files_copy[fn]) != self.strip_comments(content)): if v > 1: print("Content does not match: %s" % fn) break del files_copy[fn] else: # No differences so far if not files_copy: # Used up all files print("Current status matches version %s" % os.path.basename(last_version_dir)) return if v > 1: print("Extra files: %s" % ', '.join(files_copy.keys())) if v: print("Current state does not match %s" % os.path.basename(last_version_dir)) if v > 1 and not last_version_dir: print("No last version to check") if not sim: os.mkdir(output_dir) if v: print('Making directory %s' % self.shorten_filename(output_dir)) files = sorted(files.items()) for fn, content in files: if v: print(' Writing %s' % self.shorten_filename(fn)) if not sim: f = open(os.path.join(output_dir, fn), 'w') f.write(content) f.close() if self.options.db_record: all_diffs = [] for cls in self.classes(): for conn in conns: diffs = db_differences(cls, conn) for diff in diffs: if len(conns) > 1: diff = ' (%s).%s: %s' % ( conn.uri(), cls.sqlmeta.table, diff) else: diff = ' %s: %s' % (cls.sqlmeta.table, diff) all_diffs.append(diff) if all_diffs: print('Database does not match schema:') print('\n'.join(all_diffs)) for conn in conns: self.update_db(version, conn) else: all_diffs = [] if self.options.open_editor: if not last_version_dir: print("Cannot edit upgrader because there is no " "previous version") else: breaker = ('-' * 20 + ' lines below this will be ignored ' + '-' * 20) pre_text = breaker + '\n' + '\n'.join(all_diffs) text = self.open_editor('\n\n' + pre_text, breaker=breaker, extension='.sql') if text is not None: fn = os.path.join(last_version_dir, 'upgrade_%s_%s.sql' % (dbName, version)) f = open(fn, 'w') f.write(text) f.close() print('Wrote to %s' % fn) def update_db(self, version, conn): v = self.options.verbose if not 
conn.tableExists(SQLObjectVersionTable.sqlmeta.table): if v: print('Creating table %s' % SQLObjectVersionTable.sqlmeta.table) sql = SQLObjectVersionTable.createTableSQL(connection=conn) if v > 1: print(sql) if not self.options.simulate: SQLObjectVersionTable.createTable(connection=conn) if not self.options.simulate: SQLObjectVersionTable.clearTable(connection=conn) SQLObjectVersionTable( version=version, connection=conn) def strip_comments(self, sql): lines = [l for l in sql.splitlines() if not l.strip().startswith('--')] return '\n'.join(lines) def base_dir(self): base = self.options.output_dir if base is None: config = self.config() if config is not None: base = config.get('sqlobject_history_dir', '.') else: base = '.' if not os.path.exists(base): print('Creating history directory %s' % self.shorten_filename(base)) if not self.options.simulate: os.makedirs(base) return base def find_output_dir(self): today = time.strftime('%Y-%m-%d', time.localtime()) if self.options.version_name: dir = os.path.join(self.base_dir(), today + '-' + self.options.version_name) if os.path.exists(dir): print("Error, directory already exists: %s" % dir) sys.exit(1) return dir extra = '' while 1: dir = os.path.join(self.base_dir(), today + extra) if not os.path.exists(dir): return dir if not extra: extra = 'a' else: extra = chr(ord(extra) + 1) def find_last_version(self): names = [] for fn in os.listdir(self.base_dir()): if not self.version_regex.search(fn): continue names.append(fn) if not names: return None names.sort() return os.path.join(self.base_dir(), names[-1]) def command_force_db_version(self): v = self.options.verbose sim = self.options.simulate version = self.options.force_db_version if not self.version_regex.search(version): print("Versions must be in the format YYYY-MM-DD...") print("You version %s does not fit this" % version) return version_dir = os.path.join(self.base_dir(), version) if not os.path.exists(version_dir): if v: print('Creating %s' % self.shorten_filename(version_dir)) if not sim: os.mkdir(version_dir) elif v: print('Directory %s exists' % self.shorten_filename(version_dir)) if self.options.db_record: self.update_db(version, self.connection()) class CommandUpgrade(CommandRecord): name = 'upgrade' summary = 'Update the database to a new version (as created by record)' help = ('This command runs scripts (that you write by hand) to ' 'upgrade a database. 
The database\'s current version is in ' 'the sqlobject_version table (use record --force-db-version ' 'if a database does not have a sqlobject_version table), ' 'and upgrade scripts are in the version directory you are ' 'upgrading FROM, named upgrade_DBNAME_VERSION.sql, like ' '"upgrade_mysql_2004-12-01b.sql".') parser = standard_parser(find_modules=False) parser.add_option('--upgrade-to', help="Upgrade to the given version " "(default: newest version)", dest="upgrade_to", metavar="VERSION") parser.add_option('--output-dir', help="Base directory for recorded definitions", dest="output_dir", metavar="DIR", default=None) upgrade_regex = re.compile(r'^upgrade_([a-z]*)_([^.]*)\.sql$', re.I) def command(self): v = self.options.verbose sim = self.options.simulate if self.options.upgrade_to: version_to = self.options.upgrade_to else: fname = self.find_last_version() if fname is None: print("No version exists, use 'record' command to create one") return version_to = os.path.basename(fname) current = self.current_version() if v: print('Current version: %s' % current) version_list = self.make_plan(current, version_to) if not version_list: print('Database up to date') return if v: print('Plan:') for next_version, upgrader in version_list: print(' Use %s to upgrade to %s' % ( self.shorten_filename(upgrader), next_version)) conn = self.connection() for next_version, upgrader in version_list: f = open(upgrader) sql = f.read() f.close() if v: print("Running:") print(sql) print('-' * 60) if not sim: try: conn.query(sql) except: print("Error in script: %s" % upgrader) raise self.update_db(next_version, conn) print('Done.') def current_version(self): conn = self.connection() if not conn.tableExists(SQLObjectVersionTable.sqlmeta.table): print('No sqlobject_version table!') sys.exit(1) versions = list(SQLObjectVersionTable.select(connection=conn)) if not versions: print('No rows in sqlobject_version!') sys.exit(1) if len(versions) > 1: print('Ambiguous sqlobject_version_table') sys.exit(1) return versions[0].version def make_plan(self, current, dest): if current == dest: return [] dbname = self.connection().dbName next_version, upgrader = self.best_upgrade(current, dest, dbname) if not upgrader: print('No way to upgrade from %s to %s' % (current, dest)) print('(you need a %s/upgrade_%s_%s.sql script)' % (current, dbname, dest)) sys.exit(1) plan = [(next_version, upgrader)] if next_version == dest: return plan else: return plan + self.make_plan(next_version, dest) def best_upgrade(self, current, dest, target_dbname): current_dir = os.path.join(self.base_dir(), current) if self.options.verbose > 1: print('Looking in %s for upgraders' % self.shorten_filename(current_dir)) upgraders = [] for fn in os.listdir(current_dir): match = self.upgrade_regex.search(fn) if not match: if self.options.verbose > 1: print('Not an upgrade script: %s' % fn) continue dbname = match.group(1) version = match.group(2) if dbname != target_dbname: if self.options.verbose > 1: print('Not for this database: %s (want %s)' % ( dbname, target_dbname)) continue if version > dest: if self.options.verbose > 1: print('Version too new: %s (only want %s)' % ( version, dest)) upgraders.append((version, os.path.join(current_dir, fn))) if not upgraders: if self.options.verbose > 1: print('No upgraders found in %s' % current_dir) return None, None upgraders.sort() return upgraders[-1] def update_sys_path(paths, verbose): if isinstance(paths, string_type): paths = [paths] for path in paths: path = os.path.abspath(path) if path not in sys.path: if 
verbose > 1: print('Adding %s to path' % path) sys.path.insert(0, path) if __name__ == '__main__': the_runner.run(sys.argv) SQLObject-3.4.0/sqlobject/__init__.py0000644000175000017500000000075212464226042016761 0ustar phdphd00000000000000"""SQLObject""" # Do import for namespace # noqa is a directive for flake8 to ignore seemingly unused imports from .__version__ import version, version_info # noqa from .col import * # noqa from .index import * # noqa from .joins import * # noqa from .main import * # noqa from .sqlbuilder import AND, OR, NOT, IN, LIKE, RLIKE, DESC, CONTAINSSTRING, const, func # noqa from .styles import * # noqa from .dbconnection import connectionForURI # noqa from . import dberrors # noqa SQLObject-3.4.0/sqlobject/__version__.py0000644000175000017500000000022013141371273017471 0ustar phdphd00000000000000 version = '3.4.0' major = 3 minor = 4 micro = 0 release_level = 'final' serial = 0 version_info = (major, minor, micro, release_level, serial) SQLObject-3.4.0/sqlobject/inheritance/0000755000175000017500000000000013141371614017134 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/inheritance/__init__.py0000644000175000017500000005677712471401471021273 0ustar phdphd00000000000000from functools import reduce from sqlobject import dbconnection from sqlobject import classregistry from sqlobject import events from sqlobject import sqlbuilder from sqlobject.col import StringCol, ForeignKey from sqlobject.main import sqlmeta, SQLObject, SelectResults, \ makeProperties, unmakeProperties, getterName, setterName from sqlobject.compat import string_type from . import iteration def tablesUsedSet(obj, db): if hasattr(obj, "tablesUsedSet"): return obj.tablesUsedSet(db) elif isinstance(obj, (tuple, list, set, frozenset)): s = set() for component in obj: s.update(tablesUsedSet(component, db)) return s else: return set() class InheritableSelectResults(SelectResults): IterationClass = iteration.InheritableIteration def __init__(self, sourceClass, clause, clauseTables=None, inheritedTables=None, **ops): if clause is None or isinstance(clause, str) and clause == 'all': clause = sqlbuilder.SQLTrueClause dbName = (ops.get('connection', None) or sourceClass._connection).dbName tablesSet = tablesUsedSet(clause, dbName) tablesSet.add(str(sourceClass.sqlmeta.table)) orderBy = ops.get('orderBy') if inheritedTables: for tableName in inheritedTables: tablesSet.add(str(tableName)) if orderBy and not isinstance(orderBy, string_type): tablesSet.update(tablesUsedSet(orderBy, dbName)) # DSM: if this class has a parent, we need to link it # DSM: and be sure the parent is in the table list. # DSM: The following code is before clauseTables # DSM: because if the user uses clauseTables # DSM: (and normal string SELECT), he must know what he wants # DSM: and will do himself the relationship between classes. 
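        # For example (illustrative only; Parent/Child are hypothetical
        # classes): a SQLBuilder clause such as Child.q.childCol == 42 lets
        # the block below add the needed childTable.id == parentTable.id
        # join(s) for the inherited tables it finds, while a plain-string
        # clause skips that block entirely, so any parent/child join has to
        # be spelled out in the SQL string by the caller.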
if not isinstance(clause, str): tableRegistry = {} allClasses = classregistry.registry( sourceClass.sqlmeta.registry).allClasses() for registryClass in allClasses: if str(registryClass.sqlmeta.table) in tablesSet: # DSM: By default, no parents are needed for the clauses tableRegistry[registryClass] = registryClass tableRegistryCopy = tableRegistry.copy() for childClass in tableRegistryCopy: if childClass not in tableRegistry: continue currentClass = childClass while currentClass: if currentClass in tableRegistryCopy: if currentClass in tableRegistry: # DSM: Remove this class as it is a parent one # DSM: of a needed children del tableRegistry[currentClass] # DSM: Must keep the last parent needed # DSM: (to limit the number of join needed) tableRegistry[childClass] = currentClass currentClass = currentClass.sqlmeta.parentClass # DSM: Table registry contains only the last children # DSM: or standalone classes parentClause = [] for (currentClass, minParentClass) in tableRegistry.items(): while (currentClass != minParentClass) \ and currentClass.sqlmeta.parentClass: parentClass = currentClass.sqlmeta.parentClass parentClause.append(currentClass.q.id == parentClass.q.id) currentClass = parentClass tablesSet.add(str(currentClass.sqlmeta.table)) clause = reduce(sqlbuilder.AND, parentClause, clause) super(InheritableSelectResults, self).__init__( sourceClass, clause, clauseTables, **ops) def accumulateMany(self, *attributes, **kw): if kw.get("skipInherited"): return super(InheritableSelectResults, self).\ accumulateMany(*attributes) tables = [] for func_name, attribute in attributes: if not isinstance(attribute, string_type): tables.append(attribute.tableName) clone = self.__class__(self.sourceClass, self.clause, self.clauseTables, inheritedTables=tables, **self.ops) return clone.accumulateMany(skipInherited=True, *attributes) class InheritableSQLMeta(sqlmeta): @classmethod def addColumn(sqlmeta, columnDef, changeSchema=False, connection=None, childUpdate=False): soClass = sqlmeta.soClass # DSM: Try to add parent properties to the current class # DSM: Only do this once if possible at object creation and once for # DSM: each new dynamic column to refresh the current class if sqlmeta.parentClass: for col in sqlmeta.parentClass.sqlmeta.columnList: cname = col.name if cname == 'childName': continue if cname.endswith("ID"): cname = cname[:-2] setattr(soClass, getterName(cname), eval( 'lambda self: self._parent.%s' % cname)) if not col.immutable: def make_setfunc(cname): def setfunc(self, val): if not self.sqlmeta._creating and \ not getattr(self.sqlmeta, "row_update_sig_suppress", False): self.sqlmeta.send(events.RowUpdateSignal, self, {cname: val}) setattr(self._parent, cname, val) return setfunc setfunc = make_setfunc(cname) setattr(soClass, setterName(cname), setfunc) if childUpdate: makeProperties(soClass) return if columnDef: super(InheritableSQLMeta, sqlmeta).addColumn(columnDef, changeSchema, connection) # DSM: Update each child class if needed and existing (only for new # DSM: dynamic column as no child classes exists at object creation) if columnDef and hasattr(soClass, "q"): q = getattr(soClass.q, columnDef.name, None) else: q = None for c in sqlmeta.childClasses.values(): c.sqlmeta.addColumn(columnDef, connection=connection, childUpdate=True) if q: setattr(c.q, columnDef.name, q) @classmethod def delColumn(sqlmeta, column, changeSchema=False, connection=None, childUpdate=False): if childUpdate: soClass = sqlmeta.soClass unmakeProperties(soClass) makeProperties(soClass) if isinstance(column, 
str): name = column else: name = column.name delattr(soClass, name) delattr(soClass.q, name) return super(InheritableSQLMeta, sqlmeta).delColumn(column, changeSchema, connection) # DSM: Update each child class if needed # DSM: and delete properties for this column for c in sqlmeta.childClasses.values(): c.sqlmeta.delColumn(column, changeSchema=changeSchema, connection=connection, childUpdate=True) @classmethod def addJoin(sqlmeta, joinDef, childUpdate=False): soClass = sqlmeta.soClass # DSM: Try to add parent properties to the current class # DSM: Only do this once if possible at object creation and once for # DSM: each new dynamic join to refresh the current class if sqlmeta.parentClass: for join in sqlmeta.parentClass.sqlmeta.joins: jname = join.joinMethodName jarn = join.addRemoveName setattr( soClass, getterName(jname), eval('lambda self: self._parent.%s' % jname)) if hasattr(join, 'remove'): setattr( soClass, 'remove' + jarn, eval('lambda self,o: self._parent.remove%s(o)' % jarn)) if hasattr(join, 'add'): setattr( soClass, 'add' + jarn, eval('lambda self,o: self._parent.add%s(o)' % jarn)) if childUpdate: makeProperties(soClass) return if joinDef: super(InheritableSQLMeta, sqlmeta).addJoin(joinDef) # DSM: Update each child class if needed and existing (only for new # DSM: dynamic join as no child classes exists at object creation) for c in sqlmeta.childClasses.values(): c.sqlmeta.addJoin(joinDef, childUpdate=True) @classmethod def delJoin(sqlmeta, joinDef, childUpdate=False): if childUpdate: soClass = sqlmeta.soClass unmakeProperties(soClass) makeProperties(soClass) return super(InheritableSQLMeta, sqlmeta).delJoin(joinDef) # DSM: Update each child class if needed # DSM: and delete properties for this join for c in sqlmeta.childClasses.values(): c.sqlmeta.delJoin(joinDef, childUpdate=True) @classmethod def getAllColumns(sqlmeta): columns = sqlmeta.columns.copy() sm = sqlmeta while sm.parentClass: columns.update(sm.parentClass.sqlmeta.columns) sm = sm.parentClass.sqlmeta return columns @classmethod def getColumns(sqlmeta): columns = sqlmeta.getAllColumns() if 'childName' in columns: del columns['childName'] return columns class InheritableSQLObject(SQLObject): sqlmeta = InheritableSQLMeta _inheritable = True SelectResultsClass = InheritableSelectResults def set(self, **kw): if self._parent: SQLObject.set(self, _suppress_set_sig=True, **kw) else: SQLObject.set(self, **kw) def __classinit__(cls, new_attrs): SQLObject.__classinit__(cls, new_attrs) # if we are a child class, add sqlbuilder fields from parents currentClass = cls.sqlmeta.parentClass while currentClass: for column in currentClass.sqlmeta.columnDefinitions.values(): if column.name == 'childName': continue if isinstance(column, ForeignKey): continue setattr(cls.q, column.name, getattr(currentClass.q, column.name)) currentClass = currentClass.sqlmeta.parentClass @classmethod def _SO_setupSqlmeta(cls, new_attrs, is_base): # Note: cannot use super(InheritableSQLObject, cls)._SO_setupSqlmeta - # InheritableSQLObject is not defined when it's __classinit__ # is run. Cannot use SQLObject._SO_setupSqlmeta, either: # the method would be bound to wrong class. 
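        # In other words: while InheritableSQLObject itself is still being
        # built, super(cls, cls) is the only spelling available; once the
        # class exists in module globals, subclasses take the usual
        # super(InheritableSQLObject, cls) branch below.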
if cls.__name__ == "InheritableSQLObject": call_super = super(cls, cls) else: # InheritableSQLObject must be in globals yet call_super = super(InheritableSQLObject, cls) call_super._SO_setupSqlmeta(new_attrs, is_base) sqlmeta = cls.sqlmeta sqlmeta.childClasses = {} # locate parent class and register this class in it's children sqlmeta.parentClass = None for superclass in cls.__bases__: if getattr(superclass, '_inheritable', False) \ and (superclass.__name__ != 'InheritableSQLObject'): if sqlmeta.parentClass: # already have a parent class; # cannot inherit from more than one raise NotImplementedError( "Multiple inheritance is not implemented") sqlmeta.parentClass = superclass superclass.sqlmeta.childClasses[cls.__name__] = cls if sqlmeta.parentClass: # remove inherited column definitions cls.sqlmeta.columns = {} cls.sqlmeta.columnList = [] cls.sqlmeta.columnDefinitions = {} # default inheritance child name if not sqlmeta.childName: sqlmeta.childName = cls.__name__ @classmethod def get(cls, id, connection=None, selectResults=None, childResults=None, childUpdate=False): val = super(InheritableSQLObject, cls).get(id, connection, selectResults) # DSM: If we are updating a child, we should never return a child... if childUpdate: return val # DSM: If this class has a child, return the child if 'childName' in cls.sqlmeta.columns: childName = val.childName if childName is not None: childClass = cls.sqlmeta.childClasses[childName] # If the class has no columns (which sometimes makes sense # and may be true for non-inheritable (leaf) classes only), # shunt the query to avoid almost meaningless SQL # like "SELECT NULL FROM child WHERE id=1". # This is based on assumption that child object exists # if parent object exists. (If it doesn't your database # is broken and that is a job for database maintenance.) if not (childResults or childClass.sqlmeta.columns): childResults = (None,) return childClass.get(id, connection=connection, selectResults=childResults) # DSM: Now, we know we are alone or the last child in a family... # DSM: It's time to find our parents inst = val while inst.sqlmeta.parentClass and not inst._parent: inst._parent = inst.sqlmeta.parentClass.get( id, connection=connection, childUpdate=True) inst = inst._parent # DSM: We can now return ourself return val @classmethod def _notifyFinishClassCreation(cls): sqlmeta = cls.sqlmeta # verify names of added columns if sqlmeta.parentClass: # FIXME: this does not check for grandparent column overrides parentCols = sqlmeta.parentClass.sqlmeta.columns.keys() for column in sqlmeta.columnList: if column.name == 'childName': raise AttributeError( "The column name 'childName' is reserved") if column.name in parentCols: raise AttributeError( "The column '%s' is already defined " "in an inheritable parent" % column.name) # if this class is inheritable, add column for children distinction if cls._inheritable and (cls.__name__ != 'InheritableSQLObject'): sqlmeta.addColumn( StringCol(name='childName', # limit string length to get VARCHAR and not CLOB length=255, default=None)) if not sqlmeta.columnList: # There are no columns - call addColumn to propagate columns # from parent classes to children sqlmeta.addColumn(None) if not sqlmeta.joins: # There are no joins - call addJoin to propagate joins # from parent classes to children sqlmeta.addJoin(None) def _create(self, id, **kw): # DSM: If we were called by a children class, # DSM: we must retreive the properties dictionary. 
# DSM: Note: we can't use the ** call paremeter directly # DSM: as we must be able to delete items from the dictionary # DSM: (and our children must know that the items were removed!) if 'kw' in kw: kw = kw['kw'] # DSM: If we are the children of an inheritable class, # DSM: we must first create our parent if self.sqlmeta.parentClass: parentClass = self.sqlmeta.parentClass new_kw = {} parent_kw = {} for (name, value) in kw.items(): if (name != 'childName') and hasattr(parentClass, name): parent_kw[name] = value else: new_kw[name] = value kw = new_kw # Need to check that we have enough data to sucesfully # create the current subclass otherwise we will leave # the database in an inconsistent state. for col in self.sqlmeta.columnList: if (col._default == sqlbuilder.NoDefault) and \ (col.name not in kw) and (col.foreignName not in kw): raise TypeError( "%s() did not get expected keyword argument " "%s" % (self.__class__.__name__, col.name)) parent_kw['childName'] = self.sqlmeta.childName self._parent = parentClass(kw=parent_kw, connection=self._connection) id = self._parent.id # TC: Create this record and catch all exceptions in order to destroy # TC: the parent if the child can not be created. try: super(InheritableSQLObject, self)._create(id, **kw) except: # If we are outside a transaction and this is a child, # destroy the parent connection = self._connection if (not isinstance(connection, dbconnection.Transaction) and connection.autoCommit) and self.sqlmeta.parentClass: self._parent.destroySelf() # TC: Do we need to do this?? self._parent = None # TC: Reraise the original exception raise @classmethod def _findAlternateID(cls, name, dbName, value, connection=None): result = list(cls.selectBy(connection, **{name: value})) if not result: return result, None obj = result[0] return [obj.id], obj @classmethod def select(cls, clause=None, *args, **kwargs): parentClass = cls.sqlmeta.parentClass childUpdate = kwargs.pop('childUpdate', None) # childUpdate may have one of three values: # True: # select was issued by parent class to create child objects. # Execute select without modifications. # None (default): # select is run by application. If this class is inheritance # child, delegate query to the parent class to utilize # InheritableIteration optimizations. Selected records # are restricted to this (child) class by adding childName # filter to the where clause. # False: # select is delegated from inheritance child which is parent # of another class. Delegate the query to parent if possible, # but don't add childName restriction: selected records # will be filtered by join to the table filtered by childName. 
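        # Roughly, with hypothetical classes Person(InheritableSQLObject)
        # and Employee(Person) where firstName lives on Person, a call like
        #   Employee.select(Employee.q.firstName == 'x')
        # takes the childUpdate=None path below and is delegated as
        #   Person.select(AND(Person.q.firstName == 'x',
        #                     Person.q.childName == 'Employee'),
        #                 childUpdate=False)
        # so the parent class runs the query (using InheritableIteration)
        # and only rows whose childName is 'Employee' come back, already
        # materialized as Employee instances.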
if (not childUpdate) and parentClass: if childUpdate is None: # this is the first parent in deep hierarchy addClause = parentClass.q.childName == cls.sqlmeta.childName # if the clause was one of TRUE varians, replace it if (clause is None) or (clause is sqlbuilder.SQLTrueClause) \ or ( isinstance(clause, string_type) and (clause == 'all')): clause = addClause else: # patch WHERE condition: # change ID field of this class to ID of parent class # XXX the clause is patched in place; it would be better # to build a new one if we have to replace field clsID = cls.q.id parentID = parentClass.q.id def _get_patched(clause): if isinstance(clause, sqlbuilder.SQLOp): _patch_id_clause(clause) return None elif not isinstance(clause, sqlbuilder.Field): return None elif (clause.tableName == clsID.tableName) \ and (clause.fieldName == clsID.fieldName): return parentID else: return None def _patch_id_clause(clause): if not isinstance(clause, sqlbuilder.SQLOp): return expr = _get_patched(clause.expr1) if expr: clause.expr1 = expr expr = _get_patched(clause.expr2) if expr: clause.expr2 = expr _patch_id_clause(clause) # add childName filter clause = sqlbuilder.AND(clause, addClause) return parentClass.select(clause, childUpdate=False, *args, **kwargs) else: return super(InheritableSQLObject, cls).select( clause, *args, **kwargs) @classmethod def selectBy(cls, connection=None, **kw): clause = [] foreignColumns = {} currentClass = cls while currentClass: foreignColumns.update(dict( [(column.foreignName, name) for (name, column) in currentClass.sqlmeta.columns.items() if column.foreignKey ])) currentClass = currentClass.sqlmeta.parentClass for name, value in kw.items(): if name in foreignColumns: name = foreignColumns[name] # translate "key" to "keyID" if isinstance(value, SQLObject): value = value.id currentClass = cls while currentClass: try: clause.append(getattr(currentClass.q, name) == value) break except AttributeError: pass currentClass = currentClass.sqlmeta.parentClass else: raise AttributeError( "'%s' instance has no attribute '%s'" % ( cls.__name__, name)) if clause: clause = reduce(sqlbuilder.AND, clause) else: clause = None # select all conn = connection or cls._connection return cls.SelectResultsClass(cls, clause, connection=conn) def destroySelf(self): # DSM: If this object has parents, recursivly kill them if hasattr(self, '_parent') and self._parent: self._parent.destroySelf() super(InheritableSQLObject, self).destroySelf() def _reprItems(self): items = super(InheritableSQLObject, self)._reprItems() # add parent attributes (if any) if self.sqlmeta.parentClass: items.extend(self._parent._reprItems()) # filter out our special column return [item for item in items if item[0] != 'childName'] __all__ = ['InheritableSQLObject'] SQLObject-3.4.0/sqlobject/inheritance/tests/0000755000175000017500000000000013141371614020276 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/inheritance/tests/test_aggregates.py0000644000175000017500000000124412770603634024030 0ustar phdphd00000000000000from sqlobject import IntCol from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import raises, setupClass class SOTestAggregate1(InheritableSQLObject): value1 = IntCol() class SOTestAggregate2(SOTestAggregate1): value2 = IntCol() def test_aggregates(): setupClass([SOTestAggregate1, SOTestAggregate2]) SOTestAggregate1(value1=1) SOTestAggregate2(value1=2, value2=12) assert SOTestAggregate1.select().max("value1") == 2 assert SOTestAggregate2.select().max("value1") == 2 raises(Exception, 
SOTestAggregate2.select().max, "value2") assert SOTestAggregate2.select().max(SOTestAggregate2.q.value2) == 12 SQLObject-3.4.0/sqlobject/inheritance/tests/__init__.py0000644000175000017500000000000210202735006022371 0ustar phdphd00000000000000# SQLObject-3.4.0/sqlobject/inheritance/tests/test_inheritance.py0000644000175000017500000001117013036106435024200 0ustar phdphd00000000000000from pytest import raises from sqlobject import IntCol, StringCol from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import setupClass ######################################## # Inheritance ######################################## class InheritablePerson(InheritableSQLObject): firstName = StringCol() lastName = StringCol(alternateID=True, length=255) class Employee(InheritablePerson): _inheritable = False so_position = StringCol() def setup(): setupClass(InheritablePerson) setupClass(Employee) Employee(firstName='Project', lastName='Leader', so_position='Project leader') InheritablePerson(firstName='Oneof', lastName='Authors') def test_creation_fail(): setup() kwargs = {'firstName': 'John', 'lastname': 'Doe'} raises(TypeError, Employee, **kwargs) persons = InheritablePerson.select(InheritablePerson.q.firstName == 'John') assert persons.count() == 0 def test_inheritance(): setup() persons = InheritablePerson.select() # all for person in persons: assert isinstance(person, InheritablePerson) if isinstance(person, Employee): assert not hasattr(person, "childName") else: assert hasattr(person, "childName") assert not person.childName def test_inheritance_select(): setup() # comparison to None needed to build the right SQL expression persons = InheritablePerson.select( InheritablePerson.q.firstName != None) # noqa assert persons.count() == 2 persons = InheritablePerson.select(InheritablePerson.q.firstName == "phd") assert persons.count() == 0 # comparison to None needed to build the right SQL expression employees = Employee.select(Employee.q.firstName != None) # noqa assert employees.count() == 1 employees = Employee.select(Employee.q.firstName == "phd") assert employees.count() == 0 # comparison to None needed to build the right SQL expression employees = Employee.select(Employee.q.so_position != None) # noqa assert employees.count() == 1 persons = InheritablePerson.selectBy(firstName="Project") assert persons.count() == 1 assert isinstance(persons[0], Employee) persons = Employee.selectBy(firstName="Project") assert persons.count() == 1 try: person = InheritablePerson.byLastName("Oneof") except: pass else: raise RuntimeError("unknown person %s" % person) person = InheritablePerson.byLastName("Leader") assert person.firstName == "Project" person = Employee.byLastName("Leader") assert person.firstName == "Project" persons = list(InheritablePerson.select( orderBy=InheritablePerson.q.lastName)) assert len(persons) == 2 persons = list(InheritablePerson.select(orderBy=( InheritablePerson.q.lastName, InheritablePerson.q.firstName))) assert len(persons) == 2 persons = list(Employee.select(orderBy=Employee.q.lastName)) assert len(persons) == 1 persons = list(Employee.select(orderBy=(Employee.q.lastName, Employee.q.firstName))) assert len(persons) == 1 persons = list(Employee.select(orderBy=Employee.q.so_position)) assert len(persons) == 1 persons = list(Employee.select(orderBy=(Employee.q.so_position, Employee.q.lastName))) assert len(persons) == 1 def test_addDelColumn(): setup() assert hasattr(InheritablePerson, "firstName") assert hasattr(Employee, "firstName") assert 
hasattr(InheritablePerson.q, "firstName") assert hasattr(Employee.q, "firstName") Employee.sqlmeta.addColumn(IntCol('runtime', default=None)) assert not hasattr(InheritablePerson, 'runtime') assert hasattr(Employee, 'runtime') assert not hasattr(InheritablePerson.q, 'runtime') assert hasattr(Employee.q, 'runtime') InheritablePerson.sqlmeta.addColumn(IntCol('runtime2', default=None)) assert hasattr(InheritablePerson, 'runtime2') assert hasattr(Employee, 'runtime2') assert hasattr(InheritablePerson.q, 'runtime2') assert hasattr(Employee.q, 'runtime2') Employee.sqlmeta.delColumn('runtime') assert not hasattr(InheritablePerson, 'runtime') assert not hasattr(Employee, 'runtime') assert not hasattr(InheritablePerson.q, 'runtime') assert not hasattr(Employee.q, 'runtime') InheritablePerson.sqlmeta.delColumn('runtime2') assert not hasattr(InheritablePerson, 'runtime2') assert not hasattr(Employee, 'runtime2') assert not hasattr(InheritablePerson.q, 'runtime2') assert not hasattr(Employee.q, 'runtime2') SQLObject-3.4.0/sqlobject/inheritance/tests/test_deep_inheritance.py0000644000175000017500000000653013036106435025201 0ustar phdphd00000000000000from pytest import raises, skip from sqlobject import ForeignKey, MultipleJoin, StringCol from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import getConnection, setupClass, supports ######################################## # Deep Inheritance ######################################## class DIPerson(InheritableSQLObject): firstName = StringCol(length=100) lastName = StringCol(alternateID=True, length=255) manager = ForeignKey("DIManager", default=None) class DIEmployee(DIPerson): so_position = StringCol(unique=True, length=100) class DIManager(DIEmployee): subdudes = MultipleJoin("DIPerson", joinColumn="manager_id") def test_creation_fail(): """ Try to create a Manager without specifying a position. This should fail without leaving any partial records in the database. """ setupClass([DIManager, DIEmployee, DIPerson]) kwargs = {'firstName': 'John', 'lastname': 'Doe'} raises(TypeError, DIManager, **kwargs) persons = DIEmployee.select(DIPerson.q.firstName == 'John') assert persons.count() == 0 def test_creation_fail2(): """ Try to create two Managers with the same position. This should fail without leaving any partial records in the database. 
""" setupClass([DIManager, DIEmployee, DIPerson]) kwargs = {'firstName': 'John', 'lastName': 'Doe', 'so_position': 'Project Manager'} DIManager(**kwargs) persons = DIEmployee.select(DIPerson.q.firstName == 'John') assert persons.count() == 1 kwargs = {'firstName': 'John', 'lastName': 'Doe II', 'so_position': 'Project Manager'} raises(Exception, DIManager, **kwargs) persons = DIPerson.select(DIPerson.q.firstName == 'John') assert persons.count() == 1 if not supports('transactions'): skip("Transactions aren't supported") transaction = DIPerson._connection.transaction() kwargs = {'firstName': 'John', 'lastName': 'Doe III', 'so_position': 'Project Manager'} raises(Exception, DIManager, connection=transaction, **kwargs) transaction.rollback() transaction.begin() persons = DIPerson.select(DIPerson.q.firstName == 'John', connection=transaction) assert persons.count() == 1 def test_deep_inheritance(): setupClass([DIManager, DIEmployee, DIPerson]) manager = DIManager(firstName='Project', lastName='Manager', so_position='Project Manager') manager_id = manager.id employee_id = DIEmployee(firstName='Project', lastName='Leader', so_position='Project leader', manager=manager).id DIPerson(firstName='Oneof', lastName='Authors', manager=manager) conn = getConnection() cache = conn.cache cache.clear() managers = list(DIManager.select()) assert len(managers) == 1 cache.clear() employees = list(DIEmployee.select()) assert len(employees) == 2 cache.clear() persons = list(DIPerson.select()) assert len(persons) == 3 cache.clear() person = DIPerson.get(employee_id) assert isinstance(person, DIEmployee) person = DIPerson.get(manager_id) assert isinstance(person, DIEmployee) assert isinstance(person, DIManager) cache.clear() person = DIEmployee.get(manager_id) assert isinstance(person, DIManager) conn.close() SQLObject-3.4.0/sqlobject/inheritance/tests/test_foreignKey.py0000644000175000017500000000530713036405001024005 0ustar phdphd00000000000000from sqlobject import ForeignKey, SQLObject, StringCol from sqlobject.tests.dbtest import setupClass from sqlobject.inheritance import InheritableSQLObject class Note(SQLObject): text = StringCol() class PersonWithNotes(InheritableSQLObject): firstName = StringCol() lastName = StringCol() note = ForeignKey("Note", default=None) class Paper(SQLObject): content = StringCol() class EmployeeWithNotes(PersonWithNotes): _inheritable = False paper = ForeignKey("Paper", default=None) def test_foreignKey(): setupClass([Note, PersonWithNotes, Paper, EmployeeWithNotes], force=True) note = Note(text="person") PersonWithNotes(firstName='Oneof', lastName='Authors', note=note) note = Note(text="employee") EmployeeWithNotes(firstName='Project', lastName='Leader', note=note) paper = Paper(content="secret") EmployeeWithNotes(firstName='Senior', lastName='Clerk', paper=paper) PersonWithNotes(firstName='Some', lastName='Person') person = PersonWithNotes.get(1) assert isinstance(person, PersonWithNotes) and \ not isinstance(person, EmployeeWithNotes) assert person.note.text == "person" employee = EmployeeWithNotes.get(2) assert isinstance(employee, EmployeeWithNotes) assert employee.note.text == "employee" save_employee = employee # comparison to None needed to build the right SQL expression persons = PersonWithNotes.select(PersonWithNotes.q.noteID != None) # noqa assert persons.count() == 2 persons = PersonWithNotes.selectBy(noteID=person.note.id) assert persons.count() == 1 # comparison to None needed to build the right SQL expression employee = EmployeeWithNotes.select( 
PersonWithNotes.q.noteID != None) # noqa assert employee.count() == 1 persons = PersonWithNotes.selectBy(noteID=person.note.id) assert persons.count() == 1 persons = PersonWithNotes.selectBy(note=person.note) assert persons.count() == 1 persons = PersonWithNotes.selectBy(note=None) assert persons.count() == 2 employee = EmployeeWithNotes.selectBy(paperID=None) assert employee.count() == 1 employee = EmployeeWithNotes.selectBy(paper=None) assert employee.count() == 1 employee = EmployeeWithNotes.selectBy(note=save_employee.note, paper=save_employee.paper) assert employee.count() == 1 employee = EmployeeWithNotes.selectBy() assert employee.count() == 2 class SOTestInhBase(InheritableSQLObject): pass class SOTestInhFKey(SOTestInhBase): base = ForeignKey("SOTestInhBase") def test_foreignKey2(): setupClass([SOTestInhBase, SOTestInhFKey]) test = SOTestInhBase() SOTestInhFKey(base=test) SQLObject-3.4.0/sqlobject/inheritance/tests/test_inheritance_tree.py0000644000175000017500000000210512747416060025223 0ustar phdphd00000000000000from sqlobject import StringCol from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import setupClass ######################################## # Inheritance Tree ######################################## class Tree1(InheritableSQLObject): aprop = StringCol(length=10) class Tree2(Tree1): bprop = StringCol(length=10) class Tree3(Tree1): cprop = StringCol(length=10) class Tree4(Tree2): dprop = StringCol(length=10) class Tree5(Tree2): eprop = StringCol(length=10) def test_tree(): setupClass([Tree1, Tree2, Tree3, Tree4, Tree5]) Tree1(aprop='t1') # t1 t2 = Tree2(aprop='t2', bprop='t2') Tree3(aprop='t3', cprop='t3') # t3 t4 = Tree4(aprop='t4', bprop='t4', dprop='t4') t5 = Tree5(aprop='t5', bprop='t5', eprop='t5') # find just the t5 out of childs from Tree2 assert t5 == Tree1.select(Tree2.q.childName == 'Tree5')[0] # t2,t4,t5 are all subclasses of Tree1 with t1 childName of 'Tree2' assert list(Tree1.select( Tree1.q.childName == 'Tree2', orderBy="aprop")) == [t2, t4, t5] SQLObject-3.4.0/sqlobject/inheritance/tests/test_indexes.py0000644000175000017500000000301313036460624023346 0ustar phdphd00000000000000from sqlobject import DatabaseIndex, IntCol, StringCol from sqlobject.tests.dbtest import setupClass from sqlobject.inheritance import InheritableSQLObject class InhPersonIdxGet(InheritableSQLObject): first_name = StringCol(notNone=True, length=100) last_name = StringCol(notNone=True, length=100) age = IntCol() pk = DatabaseIndex(first_name, last_name, unique=True) class InhEmployeeIdxGet(InhPersonIdxGet): security_number = IntCol() experience = IntCol() sec_index = DatabaseIndex(security_number, unique=True) class InhSalesManIdxGet(InhEmployeeIdxGet): _inheritable = False skill = IntCol() def test_index_get_1(): setupClass([InhPersonIdxGet, InhEmployeeIdxGet, InhSalesManIdxGet]) InhSalesManIdxGet(first_name='Michael', last_name='Pallin', age=65, security_number=2304, experience=2, skill=10) InhEmployeeIdxGet(first_name='Eric', last_name='Idle', age=63, security_number=3402, experience=9) InhPersonIdxGet(first_name='Terry', last_name='Guilliam', age=64) InhPersonIdxGet.pk.get('Michael', 'Pallin') InhEmployeeIdxGet.pk.get('Michael', 'Pallin') InhSalesManIdxGet.pk.get('Michael', 'Pallin') InhPersonIdxGet.pk.get('Eric', 'Idle') InhEmployeeIdxGet.pk.get('Eric', 'Idle') InhPersonIdxGet.pk.get(first_name='Terry', last_name='Guilliam') InhEmployeeIdxGet.sec_index.get(2304) InhEmployeeIdxGet.sec_index.get(3402) InhSalesManIdxGet.sec_index.get(2304) 
InhSalesManIdxGet.sec_index.get(3402) SQLObject-3.4.0/sqlobject/inheritance/tests/test_destroy_cascade.py0000644000175000017500000000104712771114253025047 0ustar phdphd00000000000000from sqlobject import ForeignKey, IntCol, SQLObject from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import setupClass class SOTestCascade1(InheritableSQLObject): dummy = IntCol() class SOTestCascade2(SOTestCascade1): c = ForeignKey('SOTestCascade3', cascade='null') class SOTestCascade3(SQLObject): dummy = IntCol() def test_destroySelf(): setupClass([SOTestCascade1, SOTestCascade3, SOTestCascade2]) c = SOTestCascade3(dummy=1) SOTestCascade2(cID=c.id, dummy=1) c.destroySelf() SQLObject-3.4.0/sqlobject/inheritance/tests/test_asdict.py0000644000175000017500000000322513036106435023160 0ustar phdphd00000000000000from sqlobject import StringCol from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import setupClass ######################################## # sqlmeta.asDict ######################################## class InheritablePersonAD(InheritableSQLObject): firstName = StringCol() lastName = StringCol(alternateID=True, length=255) class ManagerAD(InheritablePersonAD): department = StringCol() class EmployeeAD(InheritablePersonAD): _inheritable = False so_position = StringCol() def test_getColumns(): setupClass([InheritablePersonAD, ManagerAD, EmployeeAD]) for klass, columns in ( (InheritablePersonAD, ['firstName', 'lastName']), (ManagerAD, ['department', 'firstName', 'lastName']), (EmployeeAD, ['firstName', 'lastName', 'so_position'])): _columns = sorted(klass.sqlmeta.getColumns().keys()) assert _columns == columns def test_asDict(): setupClass([InheritablePersonAD, ManagerAD, EmployeeAD], force=True) InheritablePersonAD(firstName='Oneof', lastName='Authors') ManagerAD(firstName='ManagerAD', lastName='The', department='Dep') EmployeeAD(firstName='Project', lastName='Leader', so_position='Project leader') assert InheritablePersonAD.get(1).sqlmeta.asDict() == \ dict(firstName='Oneof', lastName='Authors', id=1) assert InheritablePersonAD.get(2).sqlmeta.asDict() == \ dict(firstName='ManagerAD', lastName='The', department='Dep', id=2) assert InheritablePersonAD.get(3).sqlmeta.asDict() == \ dict(firstName='Project', lastName='Leader', so_position='Project leader', id=3) SQLObject-3.4.0/sqlobject/inheritance/iteration.py0000644000175000017500000001015312467733425021520 0ustar phdphd00000000000000from sqlobject import sqlbuilder from sqlobject.classregistry import findClass from sqlobject.dbconnection import Iteration class InheritableIteration(Iteration): # Default array size for cursor.fetchmany() defaultArraySize = 10000 def __init__(self, dbconn, rawconn, select, keepConnection=False): super(InheritableIteration, self).__init__(dbconn, rawconn, select, keepConnection) self.lazyColumns = select.ops.get('lazyColumns', False) self.cursor.arraysize = self.defaultArraySize self._results = [] # Find the index of the childName column childNameIdx = None columns = select.sourceClass.sqlmeta.columnList for i, column in enumerate(columns): if column.name == "childName": childNameIdx = i break self._childNameIdx = childNameIdx def next(self): if not self._results: self._results = list(self.cursor.fetchmany()) if not self.lazyColumns: self.fetchChildren() if not self._results: self._cleanup() raise StopIteration result = self._results[0] del self._results[0] if self.lazyColumns: obj = self.select.sourceClass.get(result[0], connection=self.dbconn) return obj else: id = 
result[0] if id in self._childrenResults: childResults = self._childrenResults[id] del self._childrenResults[id] else: childResults = None obj = self.select.sourceClass.get( id, selectResults=result[1:], childResults=childResults, connection=self.dbconn) return obj def fetchChildren(self): """Prefetch childrens' data Fetch childrens' data for every subclass in one big .select() to avoid .get() fetching it one by one. """ self._childrenResults = {} if self._childNameIdx is None: return childIdsNames = {} childNameIdx = self._childNameIdx for result in self._results: childName = result[childNameIdx + 1] if childName: ids = childIdsNames.get(childName) if ids is None: ids = childIdsNames[childName] = [] ids.append(result[0]) dbconn = self.dbconn rawconn = self.rawconn cursor = rawconn.cursor() registry = self.select.sourceClass.sqlmeta.registry for childName, ids in childIdsNames.items(): klass = findClass(childName, registry) if len(ids) == 1: select = klass.select(klass.q.id == ids[0], childUpdate=True, connection=dbconn) else: select = klass.select(sqlbuilder.IN(klass.q.id, ids), childUpdate=True, connection=dbconn) query = dbconn.queryForSelect(select) if dbconn.debug: dbconn.printDebug(rawconn, query, 'Select children of the class %s' % childName) self.dbconn._executeRetry(rawconn, cursor, query) for result in cursor.fetchall(): # Inheritance child classes may have no own columns # (that makes sense when child class has a join # that does not apply to parent class objects). # In such cases result[1:] gives an empty tuple # which is interpreted as "no results fetched" in .get(). # So .get() issues another query which is absolutely # meaningless (like "SELECT NULL FROM child WHERE id=1"). # In order to avoid this, we replace empty results # with non-empty tuple. Extra values in selectResults # are Ok - they will be ignored by ._SO_selectInit(). self._childrenResults[result[0]] = result[1:] or (None,) SQLObject-3.4.0/sqlobject/main.py0000644000175000017500000021044713103631236016146 0ustar phdphd00000000000000""" SQLObject --------- :author: Ian Bicking SQLObject is a object-relational mapper. See SQLObject.html or SQLObject.rst for more. With the help by Oleg Broytman and many other contributors. See Authors.rst. This program is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. """ import sys import threading import weakref import types import warnings from . import sqlbuilder from . import dbconnection from . import col from . import styles from . import joins from . import index from . import classregistry from . import declarative from . 
import events from .sresults import SelectResults from .util.threadinglocal import local from sqlobject.compat import PY2, with_metaclass, string_type, unicode_type if ((sys.version_info[0] == 2) and (sys.version_info[:2] < (2, 7))) or \ ((sys.version_info[0] == 3) and (sys.version_info[:2] < (3, 4))): raise ImportError("SQLObject requires Python 2.7 or 3.4+") if not PY2: # alias for python 3 compatability long = int """ This thread-local storage is needed for RowCreatedSignals. It gathers code-blocks to execute _after_ the whole hierachy of inherited SQLObjects is created. See SQLObject._create """ NoDefault = sqlbuilder.NoDefault class SQLObjectNotFound(LookupError): pass class SQLObjectIntegrityError(Exception): pass def makeProperties(obj): """ This function takes a dictionary of methods and finds methods named like: * _get_attr * _set_attr * _del_attr * _doc_attr Except for _doc_attr, these should be methods. It then creates properties from these methods, like property(_get_attr, _set_attr, _del_attr, _doc_attr). Missing methods are okay. """ if isinstance(obj, dict): def setFunc(var, value): obj[var] = value d = obj else: def setFunc(var, value): setattr(obj, var, value) d = obj.__dict__ props = {} for var, value in d.items(): if var.startswith('_set_'): props.setdefault(var[5:], {})['set'] = value elif var.startswith('_get_'): props.setdefault(var[5:], {})['get'] = value elif var.startswith('_del_'): props.setdefault(var[5:], {})['del'] = value elif var.startswith('_doc_'): props.setdefault(var[5:], {})['doc'] = value for var, setters in props.items(): if len(setters) == 1 and 'doc' in setters: continue if var in d: if isinstance(d[var], (types.MethodType, types.FunctionType)): warnings.warn( "I tried to set the property %r, but it was " "already set, as a method (%r). Methods have " "significantly different semantics than properties, " "and this may be a sign of a bug in your code." % (var, d[var])) continue setFunc(var, property(setters.get('get'), setters.get('set'), setters.get('del'), setters.get('doc'))) def unmakeProperties(obj): if isinstance(obj, dict): def delFunc(obj, var): del obj[var] d = obj else: delFunc = delattr d = obj.__dict__ for var, value in list(d.items()): if isinstance(value, property): for prop in [value.fget, value.fset, value.fdel]: if prop and prop.__name__ not in d: delFunc(obj, var) break def findDependencies(name, registry=None): depends = [] for klass in classregistry.registry(registry).allClasses(): if findDependantColumns(name, klass): depends.append(klass) else: for join in klass.sqlmeta.joins: if isinstance(join, joins.SORelatedJoin) and \ join.otherClassName == name: depends.append(klass) break return depends def findDependantColumns(name, klass): depends = [] for _col in klass.sqlmeta.columnList: if _col.foreignKey == name and _col.cascade is not None: depends.append(_col) return depends def _collectAttributes(cls, new_attrs, look_for_class): """Finds all attributes in `new_attrs` that are instances of `look_for_class`. The ``.name`` attribute is set for any matching objects. Returns them as a list. """ result = [] for attr, value in new_attrs.items(): if isinstance(value, look_for_class): value.name = attr delattr(cls, attr) result.append(value) return result class CreateNewSQLObject: """ Dummy singleton to use in place of an ID, to signal we want a new object. """ pass class sqlmeta(with_metaclass(declarative.DeclarativeMeta, object)): """ This object is the object we use to keep track of all sorts of information. 
Subclasses are made for each SQLObject subclass (dynamically if necessary), and instances are created to go alongside every SQLObject instance. """ table = None idName = None idSequence = None # This function is used to coerce IDs into the proper format, # so you should replace it with str, or another function, if you # aren't using integer IDs idType = int style = None lazyUpdate = False defaultOrder = None cacheValues = True registry = None fromDatabase = False # Default is false, but we set it to true for the *instance* # when necessary: (bad clever? maybe) expired = False # This is a mapping from column names to SOCol (or subclass) # instances: columns = {} columnList = [] # This is a mapping from column names to Col (or subclass) # instances; these objects don't have the logic that the SOCol # objects do, and are not attached to this class closely. columnDefinitions = {} # These are lists of the join and index objects: indexes = [] indexDefinitions = [] joins = [] joinDefinitions = [] # These attributes shouldn't be shared with superclasses: _unshared_attributes = ['table', 'columns', 'childName'] # These are internal bookkeeping attributes; the class-level # definition is a default for the instances, instances will # reset these values. # When an object is being created, it has an instance # variable _creating, which is true. This way all the # setters can be captured until the object is complete, # and then the row is inserted into the database. Once # that happens, _creating is deleted from the instance, # and only the class variable (which is always false) is # left. _creating = False _obsolete = False # Sometimes an intance is attached to a connection, not # globally available. In that case, self.sqlmeta._perConnection # will be true. It's false by default: _perConnection = False # Inheritance definitions: parentClass = None # A reference to the parent class childClasses = {} # References to child classes, keyed by childName childName = None # Class name for inheritance child object creation # Does the row require syncing? dirty = False # Default encoding for UnicodeCol's dbEncoding = None def __classinit__(cls, new_attrs): for attr in cls._unshared_attributes: if attr not in new_attrs: setattr(cls, attr, None) declarative.setup_attributes(cls, new_attrs) def __init__(self, instance): self.instance = weakref.proxy(instance) @classmethod def send(cls, signal, *args, **kw): events.send(signal, cls.soClass, *args, **kw) @classmethod def setClass(cls, soClass): cls.soClass = soClass if not cls.style: cls.style = styles.defaultStyle try: if cls.soClass._connection and cls.soClass._connection.style: cls.style = cls.soClass._connection.style except AttributeError: pass if cls.table is None: cls.table = cls.style.pythonClassToDBTable(cls.soClass.__name__) if cls.idName is None: cls.idName = cls.style.idForTable(cls.table) # plainSetters are columns that haven't been overridden by the # user, so we can contact the database directly to set them. # Note that these can't set these in the SQLObject class # itself, because they specific to this subclass of SQLObject, # and cannot be shared among classes. 
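# For illustration: a minimal sketch of how the class-level sqlmeta options
# above are usually set from user code, via a nested ``class sqlmeta`` in an
# SQLObject subclass (``Person`` and its columns are hypothetical names):
#
#     class Person(SQLObject):
#         class sqlmeta:
#             table = 'person_table'    # override the name derived from the class
#             defaultOrder = 'lastName'
#             cacheValues = False       # always re-read column values from the DB
#         firstName = StringCol()
#         lastName = StringCol()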
cls._plainSetters = {} cls._plainGetters = {} cls._plainForeignSetters = {} cls._plainForeignGetters = {} cls._plainJoinGetters = {} cls._plainJoinAdders = {} cls._plainJoinRemovers = {} # This is a dictionary of columnName: columnObject # None of these objects can be shared with superclasses cls.columns = {} cls.columnList = [] # These, however, can be shared: cls.columnDefinitions = cls.columnDefinitions.copy() cls.indexes = [] cls.indexDefinitions = cls.indexDefinitions[:] cls.joins = [] cls.joinDefinitions = cls.joinDefinitions[:] ############################################################ # Adding special values, like columns and indexes ############################################################ ######################################## # Column handling ######################################## @classmethod def addColumn(cls, columnDef, changeSchema=False, connection=None): post_funcs = [] cls.send(events.AddColumnSignal, cls.soClass, connection, columnDef.name, columnDef, changeSchema, post_funcs) sqlmeta = cls soClass = cls.soClass del cls column = columnDef.withClass(soClass) name = column.name assert name != 'id', ( "The 'id' column is implicit, and should not be defined as " "a column") assert name not in sqlmeta.columns, ( "The class %s.%s already has a column %r (%r), you cannot " "add the column %r" % (soClass.__module__, soClass.__name__, name, sqlmeta.columnDefinitions[name], columnDef)) # Collect columns from the parent classes to test # if the column is not in a parent class parent_columns = [] for base in soClass.__bases__: if hasattr(base, "sqlmeta"): parent_columns.extend(base.sqlmeta.columns.keys()) if hasattr(soClass, name): assert (name in parent_columns) or (name == "childName"), ( "The class %s.%s already has a variable or method %r, " "you cannot add the column %r" % ( soClass.__module__, soClass.__name__, name, name)) sqlmeta.columnDefinitions[name] = columnDef sqlmeta.columns[name] = column # A stable-ordered version of the list... sqlmeta.columnList.append(column) ################################################### # Create the getter function(s). We'll start by # creating functions like _SO_get_columnName, # then if there's no function named _get_columnName # we'll alias that to _SO_get_columnName. This # allows a sort of super call, even though there's # no superclass that defines the database access. if sqlmeta.cacheValues: # We create a method here, which is just a function # that takes "self" as the first argument. getter = eval( 'lambda self: self._SO_loadValue(%s)' % repr(instanceName(name))) else: # If we aren't caching values, we just call the # function _SO_getValue, which fetches from the # database. getter = eval('lambda self: self._SO_getValue(%s)' % repr(name)) setattr(soClass, rawGetterName(name), getter) # Here if the _get_columnName method isn't in the # definition, we add it with the default # _SO_get_columnName definition. if not hasattr(soClass, getterName(name)) or (name == 'childName'): setattr(soClass, getterName(name), getter) sqlmeta._plainGetters[name] = 1 ################################################# # Create the setter function(s) # Much like creating the getters, we will create # _SO_set_columnName methods, and then alias them # to _set_columnName if the user hasn't defined # those methods themself. 
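# For illustration: the user-facing side of the aliasing described above.
# Defining ``_set_<column>`` in a class makes that setter "non-plain"; the
# generated raw ``_SO_set_<column>`` stays available to delegate to
# (``Person`` and ``lastName`` are hypothetical names):
#
#     class Person(SQLObject):
#         lastName = StringCol()
#
#         def _set_lastName(self, value):
#             # normalize, then hand off to the generated raw setter
#             self._SO_set_lastName(value.strip().title())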
# @@: This is lame; immutable right now makes it unsettable, # making the table read-only if not column.immutable: # We start by just using the _SO_setValue method setter = eval( 'lambda self, val: self._SO_setValue' '(%s, val, self.%s, self.%s)' % ( repr(name), '_SO_from_python_%s' % name, '_SO_to_python_%s' % name)) setattr(soClass, '_SO_from_python_%s' % name, column.from_python) setattr(soClass, '_SO_to_python_%s' % name, column.to_python) setattr(soClass, rawSetterName(name), setter) # Then do the aliasing if not hasattr(soClass, setterName(name)) or (name == 'childName'): setattr(soClass, setterName(name), setter) # We keep track of setters that haven't been # overridden, because we can combine these # set columns into one SQL UPDATE query. sqlmeta._plainSetters[name] = 1 ################################################## # Here we check if the column is a foreign key, in # which case we need to make another method that # fetches the key and constructs the sister # SQLObject instance. if column.foreignKey: # We go through the standard _SO_get_columnName deal # we're giving the object, not the ID of the # object this time: origName = column.origName if sqlmeta.cacheValues: # self._SO_class_className is a reference # to the class in question. getter = eval( 'lambda self: self._SO_foreignKey' '(self._SO_loadValue(%r), self._SO_class_%s, %s)' % (instanceName(name), column.foreignKey, column.refColumn and repr(column.refColumn))) else: # Same non-caching version as above. getter = eval( 'lambda self: self._SO_foreignKey' '(self._SO_getValue(%s), self._SO_class_%s, %s)' % (repr(name), column.foreignKey, column.refColumn and repr(column.refColumn))) setattr(soClass, rawGetterName(origName), getter) # And we set the _get_columnName version if not hasattr(soClass, getterName(origName)): setattr(soClass, getterName(origName), getter) sqlmeta._plainForeignGetters[origName] = 1 if not column.immutable: # The setter just gets the ID of the object, # and then sets the real column. 
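# For illustration: what the foreign-key handling above looks like from user
# code.  A ForeignKey column yields both an ``...ID`` integer attribute and an
# object attribute (``Address`` and ``Person`` are hypothetical classes):
#
#     class Address(SQLObject):
#         person = ForeignKey('Person')
#
#     addr.personID          # the raw integer id stored in the row
#     addr.person            # the related Person instance, fetched on access
#     addr.person = someone  # stores someone.id in the personID column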
setter = eval( 'lambda self, val: ' 'setattr(self, %s, self._SO_getID(val, %s))' % (repr(name), column.refColumn and repr(column.refColumn))) setattr(soClass, rawSetterName(origName), setter) if not hasattr(soClass, setterName(origName)): setattr(soClass, setterName(origName), setter) sqlmeta._plainForeignSetters[origName] = 1 classregistry.registry(sqlmeta.registry).addClassCallback( column.foreignKey, lambda foreign, me, attr: setattr(me, attr, foreign), soClass, '_SO_class_%s' % column.foreignKey) if column.alternateMethodName: func = eval( 'lambda cls, val, connection=None: ' 'cls._SO_fetchAlternateID(%s, %s, val, connection=connection)' % (repr(column.name), repr(column.dbName))) setattr(soClass, column.alternateMethodName, classmethod(func)) if changeSchema: conn = connection or soClass._connection conn.addColumn(sqlmeta.table, column) if soClass._SO_finishedClassCreation: makeProperties(soClass) for func in post_funcs: func(soClass, column) @classmethod def addColumnsFromDatabase(sqlmeta, connection=None): soClass = sqlmeta.soClass conn = connection or soClass._connection for columnDef in conn.columnsFromSchema(sqlmeta.table, soClass): if columnDef.name not in sqlmeta.columnDefinitions: if isinstance(columnDef.name, unicode_type) and PY2: columnDef.name = columnDef.name.encode('ascii') sqlmeta.addColumn(columnDef) @classmethod def delColumn(cls, column, changeSchema=False, connection=None): sqlmeta = cls soClass = sqlmeta.soClass if isinstance(column, str): if column in sqlmeta.columns: column = sqlmeta.columns[column] elif column + 'ID' in sqlmeta.columns: column = sqlmeta.columns[column + 'ID'] else: raise ValueError('Unknown column ' + column) if isinstance(column, col.Col): for c in sqlmeta.columns.values(): if column is c.columnDef: column = c break else: raise IndexError( "Column with definition %r not found" % column) post_funcs = [] cls.send(events.DeleteColumnSignal, cls.soClass, connection, column.name, column, post_funcs) name = column.name del sqlmeta.columns[name] del sqlmeta.columnDefinitions[name] sqlmeta.columnList.remove(column) delattr(soClass, rawGetterName(name)) if name in sqlmeta._plainGetters: delattr(soClass, getterName(name)) delattr(soClass, rawSetterName(name)) if name in sqlmeta._plainSetters: delattr(soClass, setterName(name)) if column.foreignKey: delattr(soClass, rawGetterName(soClass.sqlmeta.style. instanceIDAttrToAttr(name))) if name in sqlmeta._plainForeignGetters: delattr(soClass, getterName(name)) delattr(soClass, rawSetterName(soClass.sqlmeta.style. instanceIDAttrToAttr(name))) if name in sqlmeta._plainForeignSetters: delattr(soClass, setterName(name)) if column.alternateMethodName: delattr(soClass, column.alternateMethodName) if changeSchema: conn = connection or soClass._connection conn.delColumn(sqlmeta, column) if soClass._SO_finishedClassCreation: unmakeProperties(soClass) makeProperties(soClass) for func in post_funcs: func(soClass, column) ######################################## # Join handling ######################################## @classmethod def addJoin(cls, joinDef): sqlmeta = cls soClass = cls.soClass # The name of the method we'll create. If it's # automatically generated, it's generated by the # join class. 
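# For illustration: the user-facing side of the join machinery wired up here,
# a sketch assuming the MultipleJoin/RelatedJoin declarations from
# sqlobject.joins (``Person``, ``Address`` and ``Group`` are hypothetical):
#
#     class Person(SQLObject):
#         addresses = MultipleJoin('Address')  # person.addresses -> list of Address rows
#         groups = RelatedJoin('Group')        # person.groups, plus the generated
#                                              # person.addGroup(g) / person.removeGroup(g)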
join = joinDef.withClass(soClass) meth = join.joinMethodName sqlmeta.joins.append(join) index = len(sqlmeta.joins) - 1 if joinDef not in sqlmeta.joinDefinitions: sqlmeta.joinDefinitions.append(joinDef) # The function fetches the join by index, and # then lets the join object do the rest of the # work: func = eval( 'lambda self: self.sqlmeta.joins[%i].performJoin(self)' % index) # And we do the standard _SO_get_... _get_... deal setattr(soClass, rawGetterName(meth), func) if not hasattr(soClass, getterName(meth)): setattr(soClass, getterName(meth), func) sqlmeta._plainJoinGetters[meth] = 1 # Some joins allow you to remove objects from the # join. if hasattr(join, 'remove'): # Again, we let it do the remove, and we do the # standard naming trick. func = eval( 'lambda self, obj: self.sqlmeta.joins[%i].remove(self, obj)' % index) setattr(soClass, '_SO_remove' + join.addRemoveName, func) if not hasattr(soClass, 'remove' + join.addRemoveName): setattr(soClass, 'remove' + join.addRemoveName, func) sqlmeta._plainJoinRemovers[meth] = 1 # Some joins allow you to add objects. if hasattr(join, 'add'): # And again... func = eval( 'lambda self, obj: self.sqlmeta.joins[%i].add(self, obj)' % index) setattr(soClass, '_SO_add' + join.addRemoveName, func) if not hasattr(soClass, 'add' + join.addRemoveName): setattr(soClass, 'add' + join.addRemoveName, func) sqlmeta._plainJoinAdders[meth] = 1 if soClass._SO_finishedClassCreation: makeProperties(soClass) @classmethod def delJoin(sqlmeta, joinDef): soClass = sqlmeta.soClass for join in sqlmeta.joins: # previously deleted joins will be None, so it must # be skipped or it'll error out on the next line. if join is None: continue if joinDef is join.joinDef: break else: raise IndexError( "Join %r not found in class %r (from %r)" % (joinDef, soClass, sqlmeta.joins)) meth = join.joinMethodName sqlmeta.joinDefinitions.remove(joinDef) for i in range(len(sqlmeta.joins)): if sqlmeta.joins[i] is join: # Have to leave None, because we refer to joins # by index. sqlmeta.joins[i] = None delattr(soClass, rawGetterName(meth)) if meth in sqlmeta._plainJoinGetters: delattr(soClass, getterName(meth)) if hasattr(join, 'remove'): delattr(soClass, '_SO_remove' + join.addRemovePrefix) if meth in sqlmeta._plainJoinRemovers: delattr(soClass, 'remove' + join.addRemovePrefix) if hasattr(join, 'add'): delattr(soClass, '_SO_add' + join.addRemovePrefix) if meth in sqlmeta._plainJoinAdders: delattr(soClass, 'add' + join.addRemovePrefix) if soClass._SO_finishedClassCreation: unmakeProperties(soClass) makeProperties(soClass) ######################################## # Indexes ######################################## @classmethod def addIndex(cls, indexDef): cls.indexDefinitions.append(indexDef) index = indexDef.withClass(cls.soClass) cls.indexes.append(index) setattr(cls.soClass, index.name, index) ######################################## # Utility methods ######################################## @classmethod def getColumns(sqlmeta): return sqlmeta.columns.copy() def asDict(self): """ Return the object as a dictionary of columns to values. """ result = {} for key in self.getColumns(): result[key] = getattr(self.instance, key) result['id'] = self.instance.id return result @classmethod def expireAll(sqlmeta, connection=None): """ Expire all instances of this class. 
""" soClass = sqlmeta.soClass connection = connection or soClass._connection cache_set = connection.cache cache_set.weakrefAll(soClass) for item in cache_set.getAll(soClass): item.expire() sqlhub = dbconnection.ConnectionHub() # Turning it on gives earlier warning about things # that will be deprecated (having this off we won't flood people # with warnings right away). warnings_level = 1 exception_level = None # Current levels: # 1) Actively deprecated # 2) Deprecated after 1 # 3) Deprecated after 2 def deprecated(message, level=1, stacklevel=2): if exception_level is not None and exception_level <= level: raise NotImplementedError(message) if warnings_level is not None and warnings_level <= level: warnings.warn(message, DeprecationWarning, stacklevel=stacklevel) # if sys.version_info[:2] < (2, 7): # deprecated("Support for Python 2.6 has been declared obsolete " # "and will be removed in the next release of SQLObject") def setDeprecationLevel(warning=1, exception=None): """ Set the deprecation level for SQLObject. Low levels are more actively being deprecated. Any warning at a level at or below ``warning`` will give a warning. Any warning at a level at or below ``exception`` will give an exception. You can use a higher ``exception`` level for tests to help upgrade your code. ``None`` for either value means never warn or raise exceptions. The levels currently mean: 1) Deprecated in current version. Will be removed in next version. 2) Planned to deprecate in next version, remove later. 3) Planned to deprecate sometime, remove sometime much later. As the SQLObject versions progress, the deprecation level of specific features will go down, indicating the advancing nature of the feature's doom. We'll try to keep features at 1 for a major revision. As time continues there may be a level 0, which will give a useful error message (better than ``AttributeError``) but where the feature has been fully removed. """ global warnings_level, exception_level warnings_level = warning exception_level = exception class _sqlmeta_attr(object): def __init__(self, name, deprecation_level): self.name = name self.deprecation_level = deprecation_level def __get__(self, obj, type=None): if self.deprecation_level is not None: deprecated( 'Use of this attribute should be replaced with ' '.sqlmeta.%s' % self.name, level=self.deprecation_level) return getattr((type or obj).sqlmeta, self.name) _postponed_local = local() # SQLObject is the superclass for all SQLObject classes, of # course. All the deeper magic is done in MetaSQLObject, and # only lesser magic is done here. All the actual work is done # here, though -- just automatic method generation (like # methods and properties for each column) is done in # MetaSQLObject. class SQLObject(with_metaclass(declarative.DeclarativeMeta, object)): _connection = sqlhub sqlmeta = sqlmeta # DSM: The _inheritable attribute controls wheter the class can by # DSM: inherited 'logically' with a foreignKey and a back reference. _inheritable = False # Is this class inheritable? 
_parent = None # A reference to the parent instance childName = None # Children name (to be able to get a subclass) # The law of Demeter: the class should not call another classes by name SelectResultsClass = SelectResults def __classinit__(cls, new_attrs): # This is true if we're initializing the SQLObject class, # instead of a subclass: is_base = cls.__bases__ == (object,) cls._SO_setupSqlmeta(new_attrs, is_base) implicitColumns = _collectAttributes(cls, new_attrs, col.Col) implicitJoins = _collectAttributes(cls, new_attrs, joins.Join) implicitIndexes = _collectAttributes(cls, new_attrs, index.DatabaseIndex) if not is_base: cls._SO_cleanDeprecatedAttrs(new_attrs) if '_connection' in new_attrs: connection = new_attrs['_connection'] del cls._connection assert 'connection' not in new_attrs elif 'connection' in new_attrs: connection = new_attrs['connection'] del cls.connection else: connection = None cls._SO_finishedClassCreation = False ###################################################### # Set some attributes to their defaults, if necessary. # First we get the connection: if not connection and not getattr(cls, '_connection', None): mod = sys.modules[cls.__module__] # See if there's a __connection__ global in # the module, use it if there is. if hasattr(mod, '__connection__'): connection = mod.__connection__ # Do not check hasattr(cls, '_connection') here - it is possible # SQLObject parent class has a connection attribute that came # from sqlhub, e.g.; check __dict__ only. if connection and ('_connection' not in cls.__dict__): cls.setConnection(connection) sqlmeta = cls.sqlmeta # We have to check if there are columns in the inherited # _columns where the attribute has been set to None in this # class. If so, then we need to remove that column from # _columns. for key in sqlmeta.columnDefinitions.keys(): if (key in new_attrs and new_attrs[key] is None): del sqlmeta.columnDefinitions[key] for column in sqlmeta.columnDefinitions.values(): sqlmeta.addColumn(column) for column in implicitColumns: sqlmeta.addColumn(column) # Now the class is in an essentially OK-state, so we can # set up any magic attributes: declarative.setup_attributes(cls, new_attrs) if sqlmeta.fromDatabase: sqlmeta.addColumnsFromDatabase() for j in implicitJoins: sqlmeta.addJoin(j) for i in implicitIndexes: sqlmeta.addIndex(i) def order_getter(o): return o.creationOrder sqlmeta.columnList.sort(key=order_getter) sqlmeta.indexes.sort(key=order_getter) sqlmeta.indexDefinitions.sort(key=order_getter) # Joins cannot be sorted because addJoin created accessors # that remember indexes. # sqlmeta.joins.sort(key=order_getter) sqlmeta.joinDefinitions.sort(key=order_getter) # We don't setup the properties until we're finished with the # batch adding of all the columns... cls._notifyFinishClassCreation() cls._SO_finishedClassCreation = True makeProperties(cls) # We use the magic "q" attribute for accessing lazy # SQL where-clause generation. See the sql module for # more. if not is_base: cls.q = sqlbuilder.SQLObjectTable(cls) cls.j = sqlbuilder.SQLObjectTableWithJoins(cls) classregistry.registry(sqlmeta.registry).addClass(cls) @classmethod def _SO_setupSqlmeta(cls, new_attrs, is_base): """ This fixes up the sqlmeta attribute. It handles both the case where no sqlmeta was given (in which we need to create another subclass), or the sqlmeta given doesn't have the proper inheritance. Lastly it calls sqlmeta.setClass, which handles much of the setup. 
""" if ('sqlmeta' not in new_attrs and not is_base): # We have to create our own subclass, usually. # type(className, bases_tuple, attr_dict) creates a new subclass. cls.sqlmeta = type('sqlmeta', (cls.sqlmeta,), {}) if not issubclass(cls.sqlmeta, sqlmeta): # We allow no superclass and an object superclass, instead # of inheriting from sqlmeta; but in that case we replace # the class and just move over its attributes: assert cls.sqlmeta.__bases__ in ((), (object,)), ( "If you do not inherit your sqlmeta class from " "sqlobject.sqlmeta, it must not inherit from any other " "class (your sqlmeta inherits from: %s)" % cls.sqlmeta.__bases__) for base in cls.__bases__: superclass = getattr(base, 'sqlmeta', None) if superclass: break else: assert 0, ( "No sqlmeta class could be found in any superclass " "(while fixing up sqlmeta %r inheritance)" % cls.sqlmeta) values = dict(cls.sqlmeta.__dict__) for key in list(values.keys()): if key.startswith('__') and key.endswith('__'): # Magic values shouldn't be passed through: del values[key] cls.sqlmeta = type('sqlmeta', (superclass,), values) if not is_base: # Do not pollute the base sqlmeta class cls.sqlmeta.setClass(cls) @classmethod def _SO_cleanDeprecatedAttrs(cls, new_attrs): """ This removes attributes on SQLObject subclasses that have been deprecated; they are moved to the sqlmeta class, and a deprecation warning is given. """ for attr in (): if attr in new_attrs: deprecated("%r is deprecated and read-only; please do " "not use it in your classes until it is fully " "deprecated" % attr, level=1, stacklevel=5) @classmethod def get(cls, id, connection=None, selectResults=None): assert id is not None, \ 'None is not a possible id for %s' % cls.__name__ id = cls.sqlmeta.idType(id) if connection is None: cache = cls._connection.cache else: cache = connection.cache # This whole sequence comes from Cache.CacheFactory's # behavior, where a None returned means a cache miss. val = cache.get(id, cls) if val is None: try: val = cls(_SO_fetch_no_create=1) val._SO_validatorState = sqlbuilder.SQLObjectState(val) val._init(id, connection, selectResults) cache.put(id, cls, val) finally: cache.finishPut(cls) elif selectResults and not val.sqlmeta.dirty: val._SO_writeLock.acquire() try: val._SO_selectInit(selectResults) val.sqlmeta.expired = False finally: val._SO_writeLock.release() return val @classmethod def _notifyFinishClassCreation(cls): pass def _init(self, id, connection=None, selectResults=None): assert id is not None # This function gets called only when the object is # created, unlike __init__ which would be called # anytime the object was returned from cache. self.id = id self._SO_writeLock = threading.Lock() # If no connection was given, we'll inherit the class # instance variable which should have a _connection # attribute. if (connection is not None) and \ (getattr(self, '_connection', None) is not connection): self._connection = connection # Sometimes we need to know if this instance is # global or tied to a particular connection. 
# This flag tells us that: self.sqlmeta._perConnection = True if not selectResults: dbNames = [col.dbName for col in self.sqlmeta.columnList] selectResults = self._connection._SO_selectOne(self, dbNames) if not selectResults: raise SQLObjectNotFound( "The object %s by the ID %s does not exist" % ( self.__class__.__name__, self.id)) self._SO_selectInit(selectResults) self._SO_createValues = {} self.sqlmeta.dirty = False def _SO_loadValue(self, attrName): try: return getattr(self, attrName) except AttributeError: try: self._SO_writeLock.acquire() try: # Maybe, just in the moment since we got the lock, # some other thread did a _SO_loadValue and we # have the attribute! Let's try and find out! We # can keep trying this all day and still beat the # performance on the database call (okay, we can # keep trying this for a few msecs at least)... result = getattr(self, attrName) except AttributeError: pass else: return result self.sqlmeta.expired = False dbNames = [col.dbName for col in self.sqlmeta.columnList] selectResults = self._connection._SO_selectOne(self, dbNames) if not selectResults: raise SQLObjectNotFound( "The object %s by the ID %s has been deleted" % ( self.__class__.__name__, self.id)) self._SO_selectInit(selectResults) result = getattr(self, attrName) return result finally: self._SO_writeLock.release() def sync(self): if self.sqlmeta.lazyUpdate and self._SO_createValues: self.syncUpdate() self._SO_writeLock.acquire() try: dbNames = [col.dbName for col in self.sqlmeta.columnList] selectResults = self._connection._SO_selectOne(self, dbNames) if not selectResults: raise SQLObjectNotFound( "The object %s by the ID %s has been deleted" % ( self.__class__.__name__, self.id)) self._SO_selectInit(selectResults) self.sqlmeta.expired = False finally: self._SO_writeLock.release() def syncUpdate(self): if not self._SO_createValues: return self._SO_writeLock.acquire() try: if self.sqlmeta.columns: columns = self.sqlmeta.columns values = [(columns[v[0]].dbName, v[1]) for v in sorted( self._SO_createValues.items(), key=lambda c: columns[c[0]].creationOrder)] self._connection._SO_update(self, values) self.sqlmeta.dirty = False self._SO_createValues = {} finally: self._SO_writeLock.release() post_funcs = [] self.sqlmeta.send(events.RowUpdatedSignal, self, post_funcs) for func in post_funcs: func(self) def expire(self): if self.sqlmeta.expired: return self._SO_writeLock.acquire() try: if self.sqlmeta.expired: return for column in self.sqlmeta.columnList: delattr(self, instanceName(column.name)) self.sqlmeta.expired = True self._connection.cache.expire(self.id, self.__class__) self._SO_createValues = {} finally: self._SO_writeLock.release() def _SO_setValue(self, name, value, from_python, to_python): # This is the place where we actually update the # database. # If we are _creating, the object doesn't yet exist # in the database, and we can't insert it until all # the parts are set. 
So we just keep them in a # dictionary until later: d = {name: value} if not self.sqlmeta._creating and \ not getattr(self.sqlmeta, "row_update_sig_suppress", False): self.sqlmeta.send(events.RowUpdateSignal, self, d) if len(d) != 1 or name not in d: # Already called RowUpdateSignal, don't call it again # inside .set() self.sqlmeta.row_update_sig_suppress = True self.set(**d) del self.sqlmeta.row_update_sig_suppress value = d[name] if from_python: dbValue = from_python(value, self._SO_validatorState) else: dbValue = value if to_python: value = to_python(dbValue, self._SO_validatorState) if self.sqlmeta._creating or self.sqlmeta.lazyUpdate: self.sqlmeta.dirty = True self._SO_createValues[name] = dbValue setattr(self, instanceName(name), value) return self._connection._SO_update( self, [(self.sqlmeta.columns[name].dbName, dbValue)]) if self.sqlmeta.cacheValues: setattr(self, instanceName(name), value) post_funcs = [] self.sqlmeta.send(events.RowUpdatedSignal, self, post_funcs) for func in post_funcs: func(self) def set(self, _suppress_set_sig=False, **kw): if not self.sqlmeta._creating and \ not getattr(self.sqlmeta, "row_update_sig_suppress", False) \ and not _suppress_set_sig: self.sqlmeta.send(events.RowUpdateSignal, self, kw) # set() is used to update multiple values at once, # potentially with one SQL statement if possible. # Filter out items that don't map to column names. # Those will be set directly on the object using # setattr(obj, name, value). def is_column(_c): return _c in self.sqlmeta._plainSetters def f_is_column(item): return is_column(item[0]) def f_not_column(item): return not is_column(item[0]) items = kw.items() extra = dict(filter(f_not_column, items)) kw = dict(filter(f_is_column, items)) # _creating is special, see _SO_setValue if self.sqlmeta._creating or self.sqlmeta.lazyUpdate: for name, value in kw.items(): from_python = getattr(self, '_SO_from_python_%s' % name, None) if from_python: kw[name] = dbValue = from_python(value, self._SO_validatorState) else: dbValue = value to_python = getattr(self, '_SO_to_python_%s' % name, None) if to_python: value = to_python(dbValue, self._SO_validatorState) setattr(self, instanceName(name), value) self._SO_createValues.update(kw) for name, value in extra.items(): try: getattr(self.__class__, name) except AttributeError: if name not in self.sqlmeta.columns: raise TypeError( "%s.set() got an unexpected keyword argument " "%s" % (self.__class__.__name__, name)) try: setattr(self, name, value) except AttributeError as e: raise AttributeError('%s (with attribute %r)' % (e, name)) self.sqlmeta.dirty = True return self._SO_writeLock.acquire() try: # We have to go through and see if the setters are # "plain", that is, if the user has changed their # definition in any way (put in something that # normalizes the value or checks for consistency, # for instance). If so then we have to use plain # old setattr() to change the value, since we can't # read the user's mind. We'll combine everything # else into a single UPDATE, if necessary. 
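# For illustration, from the caller's side: .set() batches the "plain" columns
# below into a single UPDATE (``person`` and its columns are hypothetical):
#
#     person.set(firstName='Ian', lastName='Bicking')   # one UPDATE statement
#
#     # whereas plain attribute assignment issues one UPDATE per column:
#     person.firstName = 'Ian'
#     person.lastName = 'Bicking'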
toUpdate = {} for name, value in kw.items(): from_python = getattr(self, '_SO_from_python_%s' % name, None) if from_python: dbValue = from_python(value, self._SO_validatorState) else: dbValue = value to_python = getattr(self, '_SO_to_python_%s' % name, None) if to_python: value = to_python(dbValue, self._SO_validatorState) if self.sqlmeta.cacheValues: setattr(self, instanceName(name), value) toUpdate[name] = dbValue for name, value in extra.items(): try: getattr(self.__class__, name) except AttributeError: if name not in self.sqlmeta.columns: raise TypeError( "%s.set() got an unexpected keyword argument " "%s" % (self.__class__.__name__, name)) try: setattr(self, name, value) except AttributeError as e: raise AttributeError('%s (with attribute %r)' % (e, name)) if toUpdate: toUpdate = sorted( toUpdate.items(), key=lambda c: self.sqlmeta.columns[c[0]].creationOrder) args = [(self.sqlmeta.columns[name].dbName, value) for name, value in toUpdate] self._connection._SO_update(self, args) finally: self._SO_writeLock.release() post_funcs = [] self.sqlmeta.send(events.RowUpdatedSignal, self, post_funcs) for func in post_funcs: func(self) def _SO_selectInit(self, row): for _col, colValue in zip(self.sqlmeta.columnList, row): if _col.to_python: colValue = _col.to_python(colValue, self._SO_validatorState) setattr(self, instanceName(_col.name), colValue) def _SO_getValue(self, name): # Retrieves a single value from the database. Simple. assert not self.sqlmeta._obsolete, ( "%s with id %s has become obsolete" % (self.__class__.__name__, self.id)) # @@: do we really need this lock? # self._SO_writeLock.acquire() column = self.sqlmeta.columns[name] results = self._connection._SO_selectOne(self, [column.dbName]) # self._SO_writeLock.release() assert results is not None, "%s with id %s is not in the database" % ( self.__class__.__name__, self.id) value = results[0] if column.to_python: value = column.to_python(value, self._SO_validatorState) return value def _SO_foreignKey(self, value, joinClass, idName=None): if value is None: return None if self.sqlmeta._perConnection: connection = self._connection else: connection = None if idName is None: # Get by id return joinClass.get(value, connection=connection) return joinClass.select( getattr(joinClass.q, idName) == value, connection=connection).getOne() def __init__(self, **kw): # If we are the outmost constructor of a hiearchy of # InheritableSQLObjects (or simlpy _the_ constructor of a "normal" # SQLObject), we create a threadlocal list that collects the # RowCreatedSignals, and executes them if this very constructor is left try: _postponed_local.postponed_calls postponed_created = False except AttributeError: _postponed_local.postponed_calls = [] postponed_created = True try: # We shadow the sqlmeta class with an instance of sqlmeta # that points to us (our sqlmeta buddy object; where the # sqlmeta class is our class's buddy class) self.sqlmeta = self.__class__.sqlmeta(self) # The get() classmethod/constructor uses a magic keyword # argument when it wants an empty object, fetched from the # database. So we have nothing more to do in that case: if '_SO_fetch_no_create' in kw: return post_funcs = [] self.sqlmeta.send(events.RowCreateSignal, self, kw, post_funcs) # Pass the connection object along if we were given one. 
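# For illustration: the constructor protocol above as seen from user code --
# rows are created by instantiating the class with column keywords, optionally
# with an explicit connection (names below are hypothetical):
#
#     conn = connectionForURI('sqlite:/:memory:')
#     Person.createTable(connection=conn)
#     p = Person(firstName='Ian', lastName='Bicking', connection=conn)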
if 'connection' in kw: connection = kw.pop('connection') if getattr(self, '_connection', None) is not connection: self._connection = connection self.sqlmeta._perConnection = True self._SO_writeLock = threading.Lock() if 'id' in kw: id = self.sqlmeta.idType(kw['id']) del kw['id'] else: id = None self._create(id, **kw) for func in post_funcs: func(self) finally: # if we are the creator of the tl-storage, we # have to exectute and under all circumstances # remove the tl-storage if postponed_created: try: for func in _postponed_local.postponed_calls: func() finally: del _postponed_local.postponed_calls def _create(self, id, **kw): self.sqlmeta._creating = True self._SO_createValues = {} self._SO_validatorState = sqlbuilder.SQLObjectState(self) # First we do a little fix-up on the keywords we were # passed: for column in self.sqlmeta.columnList: # Then we check if the column wasn't passed in, and # if not we try to get the default. if column.name not in kw and column.foreignName not in kw: default = column.default # If we don't get it, it's an error: # If we specified an SQL DEFAULT, then we should use that if default is NoDefault: if column.defaultSQL is None: raise TypeError( "%s() did not get expected keyword argument " "'%s'" % (self.__class__.__name__, column.name)) else: # There is defaultSQL for the column - # do not put the column to kw # so that the backend creates the value. continue # Otherwise we put it in as though they did pass # that keyword: kw[column.name] = default self.set(**kw) # Then we finalize the process: self._SO_finishCreate(id) def _SO_finishCreate(self, id=None): # Here's where an INSERT is finalized. # These are all the column values that were supposed # to be set, but were delayed until now: setters = self._SO_createValues.items() setters = sorted( setters, key=lambda c: self.sqlmeta.columns[c[0]].creationOrder) # Here's their database names: names = [self.sqlmeta.columns[v[0]].dbName for v in setters] values = [v[1] for v in setters] # Get rid of _SO_create*, we aren't creating anymore. # Doesn't have to be threadsafe because we're still in # new(), which doesn't need to be threadsafe. self.sqlmeta.dirty = False if not self.sqlmeta.lazyUpdate: del self._SO_createValues else: self._SO_createValues = {} del self.sqlmeta._creating # Do the insert -- most of the SQL in this case is left # up to DBConnection, since getting a new ID is # non-standard. 
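# For illustration: how the default handling in _create() above plays out in a
# class definition.  ``default=`` is resolved by SQLObject at creation time;
# ``defaultSQL=`` is left to the database (``Task`` and its columns are
# hypothetical):
#
#     class Task(SQLObject):
#         status = StringCol(default='open')   # supplied by SQLObject in the INSERT
#         priority = IntCol(defaultSQL='0')    # DEFAULT 0 in the schema; omitted from
#                                              # the INSERT so the backend fills it in
#         title = StringCol()                  # no default: required keyword argument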
id = self._connection.queryInsertID(self, id, names, values) cache = self._connection.cache cache.created(id, self.__class__, self) self._init(id) post_funcs = [] kw = dict([('class', self.__class__), ('id', id)]) def _send_RowCreatedSignal(): self.sqlmeta.send(events.RowCreatedSignal, self, kw, post_funcs) for func in post_funcs: func(self) _postponed_local.postponed_calls.append(_send_RowCreatedSignal) def _SO_getID(self, obj, refColumn=None): return getID(obj, refColumn) @classmethod def _findAlternateID(cls, name, dbName, value, connection=None): if isinstance(name, str): name = (name,) value = (value,) if len(name) != len(value): raise ValueError( "'column' and 'value' tuples must be of the same size") new_value = [] for n, v in zip(name, value): from_python = getattr(cls, '_SO_from_python_' + n) if from_python: v = from_python( v, sqlbuilder.SQLObjectState(cls, connection=connection)) new_value.append(v) condition = sqlbuilder.AND( *[getattr(cls.q, _n) == _v for _n, _v in zip(name, new_value)]) return (connection or cls._connection)._SO_selectOneAlt( cls, [cls.sqlmeta.idName] + [column.dbName for column in cls.sqlmeta.columnList], condition), None @classmethod def _SO_fetchAlternateID(cls, name, dbName, value, connection=None, idxName=None): result, obj = cls._findAlternateID(name, dbName, value, connection) if not result: if idxName is None: raise SQLObjectNotFound( "The %s by alternateID %s = %s does not exist" % ( cls.__name__, name, repr(value))) else: names = [] for i in range(len(name)): names.append("%s = %s" % (name[i], repr(value[i]))) names = ', '.join(names) raise SQLObjectNotFound( "The %s by unique index %s(%s) does not exist" % ( cls.__name__, idxName, names)) if obj: return obj if connection: obj = cls.get(result[0], connection=connection, selectResults=result[1:]) else: obj = cls.get(result[0], selectResults=result[1:]) return obj @classmethod def _SO_depends(cls): return findDependencies(cls.__name__, cls.sqlmeta.registry) @classmethod def select(cls, clause=None, clauseTables=None, orderBy=NoDefault, limit=None, lazyColumns=False, reversed=False, distinct=False, connection=None, join=None, forUpdate=False): return cls.SelectResultsClass(cls, clause, clauseTables=clauseTables, orderBy=orderBy, limit=limit, lazyColumns=lazyColumns, reversed=reversed, distinct=distinct, connection=connection, join=join, forUpdate=forUpdate) @classmethod def selectBy(cls, connection=None, **kw): conn = connection or cls._connection return cls.SelectResultsClass(cls, conn._SO_columnClause(cls, kw), connection=conn) @classmethod def tableExists(cls, connection=None): conn = connection or cls._connection return conn.tableExists(cls.sqlmeta.table) @classmethod def dropTable(cls, ifExists=False, dropJoinTables=True, cascade=False, connection=None): conn = connection or cls._connection if ifExists and not cls.tableExists(connection=conn): return extra_sql = [] post_funcs = [] cls.sqlmeta.send(events.DropTableSignal, cls, connection, extra_sql, post_funcs) conn.dropTable(cls.sqlmeta.table, cascade) if dropJoinTables: cls.dropJoinTables(ifExists=ifExists, connection=conn) for sql in extra_sql: connection.query(sql) for func in post_funcs: func(cls, conn) @classmethod def createTable(cls, ifNotExists=False, createJoinTables=True, createIndexes=True, applyConstraints=True, connection=None): conn = connection or cls._connection if ifNotExists and cls.tableExists(connection=conn): return extra_sql = [] post_funcs = [] cls.sqlmeta.send(events.CreateTableSignal, cls, connection, extra_sql, 
post_funcs) constraints = conn.createTable(cls) if applyConstraints: for constraint in constraints: conn.query(constraint) else: extra_sql.extend(constraints) if createJoinTables: cls.createJoinTables(ifNotExists=ifNotExists, connection=conn) if createIndexes: cls.createIndexes(ifNotExists=ifNotExists, connection=conn) for func in post_funcs: func(cls, conn) return extra_sql @classmethod def createTableSQL(cls, createJoinTables=True, createIndexes=True, connection=None): conn = connection or cls._connection sql, constraints = conn.createTableSQL(cls) if createJoinTables: join_sql = cls.createJoinTablesSQL(connection=conn) if join_sql: sql += ';\n' + join_sql if createIndexes: index_sql = cls.createIndexesSQL(connection=conn) if index_sql: sql += ';\n' + index_sql return sql, constraints @classmethod def createJoinTables(cls, ifNotExists=False, connection=None): conn = connection or cls._connection for join in cls._getJoinsToCreate(): if (ifNotExists and conn.tableExists(join.intermediateTable)): continue conn._SO_createJoinTable(join) @classmethod def createJoinTablesSQL(cls, connection=None): conn = connection or cls._connection sql = [] for join in cls._getJoinsToCreate(): sql.append(conn._SO_createJoinTableSQL(join)) return ';\n'.join(sql) @classmethod def createIndexes(cls, ifNotExists=False, connection=None): conn = connection or cls._connection for _index in cls.sqlmeta.indexes: if not _index: continue conn._SO_createIndex(cls, _index) @classmethod def createIndexesSQL(cls, connection=None): conn = connection or cls._connection sql = [] for _index in cls.sqlmeta.indexes: if not _index: continue sql.append(conn.createIndexSQL(cls, _index)) return ';\n'.join(sql) @classmethod def _getJoinsToCreate(cls): joins = [] for join in cls.sqlmeta.joins: if not join: continue if not join.hasIntermediateTable() or \ not getattr(join, 'createRelatedTable', True): continue if join.soClass.__name__ > join.otherClass.__name__: continue joins.append(join) return joins @classmethod def dropJoinTables(cls, ifExists=False, connection=None): conn = connection or cls._connection for join in cls.sqlmeta.joins: if not join: continue if not join.hasIntermediateTable() or \ not getattr(join, 'createRelatedTable', True): continue if join.soClass.__name__ > join.otherClass.__name__: continue if ifExists and \ not conn.tableExists(join.intermediateTable): continue conn._SO_dropJoinTable(join) @classmethod def clearTable(cls, connection=None, clearJoinTables=True): # 3-03 @@: Maybe this should check the cache... but it's # kind of crude anyway, so... conn = connection or cls._connection conn.clearTable(cls.sqlmeta.table) if clearJoinTables: for join in cls._getJoinsToCreate(): conn.clearTable(join.intermediateTable) def destroySelf(self): post_funcs = [] self.sqlmeta.send(events.RowDestroySignal, self, post_funcs) # Kills this object. Kills it dead! 
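# For illustration: the deletion API built around this method (``Person`` is a
# hypothetical class):
#
#     person.destroySelf()                    # delete this row, applying any
#                                             # cascade=True/False/'null' rules
#     Person.delete(some_id)                  # classmethod: get(id) + destroySelf()
#     Person.deleteBy(lastName='Bicking')     # DELETE ... WHERE last_name = ...
#     Person.deleteMany(Person.q.lastName == 'Bicking')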
klass = self.__class__ # Free related joins on the base class for join in klass.sqlmeta.joins: if isinstance(join, joins.SORelatedJoin): q = "DELETE FROM %s WHERE %s=%d" % (join.intermediateTable, join.joinColumn, self.id) self._connection.query(q) depends = [] depends = self._SO_depends() for k in depends: # Free related joins for join in k.sqlmeta.joins: if isinstance(join, joins.SORelatedJoin) and \ join.otherClassName == klass.__name__: q = "DELETE FROM %s WHERE %s=%d" % (join.intermediateTable, join.otherColumn, self.id) self._connection.query(q) cols = findDependantColumns(klass.__name__, k) # Don't confuse the rest of the process if len(cols) == 0: continue query = [] delete = setnull = restrict = False for _col in cols: if _col.cascade is False: # Found a restriction restrict = True query.append(getattr(k.q, _col.name) == self.id) if _col.cascade == 'null': setnull = _col.name elif _col.cascade: delete = True assert delete or setnull or restrict, ( "Class %s depends on %s accoriding to " "findDependantColumns, but this seems inaccurate" % (k, klass)) query = sqlbuilder.OR(*query) results = k.select(query, connection=self._connection) if restrict: if results.count(): # Restrictions only apply if there are # matching records on the related table raise SQLObjectIntegrityError( "Tried to delete %s::%s but " "table %s has a restriction against it" % (klass.__name__, self.id, k.__name__)) else: for row in results: if delete: row.destroySelf() else: row.set(**{setnull: None}) self.sqlmeta._obsolete = True self._connection._SO_delete(self) self._connection.cache.expire(self.id, self.__class__) for func in post_funcs: func(self) post_funcs = [] self.sqlmeta.send(events.RowDestroyedSignal, self, post_funcs) for func in post_funcs: func(self) @classmethod def delete(cls, id, connection=None): obj = cls.get(id, connection=connection) obj.destroySelf() @classmethod def deleteMany(cls, where=NoDefault, connection=None): conn = connection or cls._connection conn.query(conn.sqlrepr(sqlbuilder.Delete(cls.sqlmeta.table, where))) @classmethod def deleteBy(cls, connection=None, **kw): conn = connection or cls._connection conn.query(conn.sqlrepr(sqlbuilder.Delete( cls.sqlmeta.table, conn._SO_columnClause(cls, kw)))) def __repr__(self): if not hasattr(self, 'id'): # Object initialization not finished. No attributes can be read. return '<%s (not initialized)>' % self.__class__.__name__ return '<%s %r %s>' \ % (self.__class__.__name__, self.id, ' '.join( ['%s=%s' % (name, repr(value)) for name, value in self._reprItems()])) def __sqlrepr__(self, db): return str(self.id) @classmethod def sqlrepr(cls, value, connection=None): return (connection or cls._connection).sqlrepr(value) @classmethod def coerceID(cls, value): if isinstance(value, cls): return value.id else: return cls.sqlmeta.idType(value) def _reprItems(self): items = [] for _col in self.sqlmeta.columnList: value = getattr(self, _col.name) r = repr(value) if len(r) > 20: value = r[:17] + "..." 
+ r[-1] items.append((_col.name, value)) return items @classmethod def setConnection(cls, value): if isinstance(value, string_type): value = dbconnection.connectionForURI(value) cls._connection = value def tablesUsedImmediate(self): return [self.__class__.q] # hash implementation def __hash__(self): # We hash on class name and id, since that should be # unique return hash((self.__class__.__name__, self.id)) # Comparison def __eq__(self, other): if self.__class__ is other.__class__: if self.id == other.id: return True return False def __ne__(self, other): return not self.__eq__(other) def __lt__(self, other): return NotImplemented def __le__(self, other): return NotImplemented def __gt__(self, other): return NotImplemented def __ge__(self, other): return NotImplemented # (De)serialization (pickle, etc.) def __getstate__(self): if self.sqlmeta._perConnection: from pickle import PicklingError raise PicklingError( 'Cannot pickle an SQLObject instance ' 'that has a per-instance connection') if self.sqlmeta.lazyUpdate and self._SO_createValues: self.syncUpdate() d = self.__dict__.copy() del d['sqlmeta'] del d['_SO_validatorState'] del d['_SO_writeLock'] del d['_SO_createValues'] return d def __setstate__(self, d): self.__init__(_SO_fetch_no_create=1) self._SO_validatorState = sqlbuilder.SQLObjectState(self) self._SO_writeLock = threading.Lock() self._SO_createValues = {} self.__dict__.update(d) cls = self.__class__ cache = self._connection.cache if cache.tryGet(self.id, cls) is not None: raise ValueError( "Cannot unpickle %s row with id=%s - " "a different instance with the id already exists " "in the cache" % (cls.__name__, self.id)) cache.created(self.id, cls, self) def setterName(name): return '_set_%s' % name def rawSetterName(name): return '_SO_set_%s' % name def getterName(name): return '_get_%s' % name def rawGetterName(name): return '_SO_get_%s' % name def instanceName(name): return '_SO_val_%s' % name ######################################## # Utility functions (for external consumption) ######################################## def getID(obj, refColumn=None): if isinstance(obj, SQLObject): return getattr(obj, refColumn or 'id') elif isinstance(obj, int): return obj elif isinstance(obj, long): return int(obj) elif isinstance(obj, str): try: return int(obj) except ValueError: return obj elif obj is None: return None def getObject(obj, klass): if isinstance(obj, int): return klass(obj) elif isinstance(obj, long): return klass(int(obj)) elif isinstance(obj, str): return klass(int(obj)) elif obj is None: return None else: return obj __all__ = [ 'NoDefault', 'SQLObject', 'SQLObjectIntegrityError', 'SQLObjectNotFound', 'getID', 'getObject', 'sqlhub', 'sqlmeta', ] SQLObject-3.4.0/sqlobject/.coveragerc0000644000175000017500000000030112753115111016752 0ustar phdphd00000000000000[run] omit = firebird/*.py maxdb/*.py mssql/*.py rdbhost/*.py sybase/*.py tests/test_boundattributes.py tests/test_paste.py util/threadinglocal.py wsgi_middleware.py SQLObject-3.4.0/sqlobject/boundattributes.py0000644000175000017500000001013712470110264020431 0ustar phdphd00000000000000""" Bound attributes are attributes that are bound to a specific class and a specific name. In SQLObject a typical example is a column object, which knows its name and class. A bound attribute should define a method ``__addtoclass__(added_class, name)`` (attributes without this method will simply be treated as normal). 
The return value is ignored; if the attribute wishes to change the value in the class, it must call ``setattr(added_class, name, new_value)``. BoundAttribute is a class that facilitates lazy attribute creation. """ from __future__ import absolute_import from . import declarative from . import events __all__ = ['BoundAttribute', 'BoundFactory'] class BoundAttribute(declarative.Declarative): """ This is a declarative class that passes all the values given to it to another object. So you can pass it arguments (via __init__/__call__) or give it the equivalent of keyword arguments through subclassing. Then a bound object will be added in its place. To hook this other object in, override ``make_object(added_class, name, **attrs)`` and maybe ``set_object(added_class, name, **attrs)`` (the default implementation of ``set_object`` just resets the attribute to whatever ``make_object`` returned). Also see ``BoundFactory``. """ _private_variables = ( '_private_variables', '_all_attributes', '__classinit__', '__addtoclass__', '_add_attrs', 'set_object', 'make_object', 'clone_in_subclass', ) _all_attrs = () clone_for_subclass = True def __classinit__(cls, new_attrs): declarative.Declarative.__classinit__(cls, new_attrs) cls._all_attrs = cls._add_attrs(cls, new_attrs) def __instanceinit__(self, new_attrs): declarative.Declarative.__instanceinit__(self, new_attrs) self.__dict__['_all_attrs'] = self._add_attrs(self, new_attrs) @staticmethod def _add_attrs(this_object, new_attrs): private = this_object._private_variables all_attrs = list(this_object._all_attrs) for key in new_attrs.keys(): if key.startswith('_') or key in private: continue if key not in all_attrs: all_attrs.append(key) return tuple(all_attrs) @declarative.classinstancemethod def __addtoclass__(self, cls, added_class, attr_name): me = self or cls attrs = {} for name in me._all_attrs: attrs[name] = getattr(me, name) attrs['added_class'] = added_class attrs['attr_name'] = attr_name obj = me.make_object(**attrs) if self.clone_for_subclass: def on_rebind(new_class_name, bases, new_attrs, post_funcs, early_funcs): def rebind(new_class): me.set_object( new_class, attr_name, me.make_object(**attrs)) post_funcs.append(rebind) events.listen(receiver=on_rebind, soClass=added_class, signal=events.ClassCreateSignal, weak=False) me.set_object(added_class, attr_name, obj) @classmethod def set_object(cls, added_class, attr_name, obj): setattr(added_class, attr_name, obj) @classmethod def make_object(cls, added_class, attr_name, *args, **attrs): raise NotImplementedError def __setattr__(self, name, value): self.__dict__['_all_attrs'] = self._add_attrs(self, {name: value}) self.__dict__[name] = value class BoundFactory(BoundAttribute): """ This will bind the attribute to whatever is given by ``factory_class``. This factory should be a callable with the signature ``factory_class(added_class, attr_name, *args, **kw)``. The factory will be reinvoked (and the attribute rebound) for every subclassing. """ factory_class = None _private_variables = ( BoundAttribute._private_variables + ('factory_class',)) def make_object(cls, added_class, attr_name, *args, **kw): return cls.factory_class(added_class, attr_name, *args, **kw) SQLObject-3.4.0/sqlobject/col.py0000644000175000017500000020141513140202164015765 0ustar phdphd00000000000000""" Col -- SQLObject columns Note that each column object is named BlahBlahCol, and these are used in class definitions. But there's also a corresponding SOBlahBlahCol object, which is used in SQLObject *classes*. 
An explanation: when a SQLObject subclass is created, the metaclass looks through your class definition for any subclasses of Col. It collects them together, and indexes them to do all the database stuff you like, like the magic attributes and whatnot. It then asks the Col object to create an SOCol object (usually a subclass, actually). The SOCol object contains all the interesting logic, as well as a record of the attribute name you used and the class it is bound to (set by the metaclass). So, in summary: Col objects are what you define, but SOCol objects are what gets used. """ from array import array from decimal import Decimal from itertools import count import json try: import cPickle as pickle except ImportError: import pickle import time from uuid import UUID import weakref from formencode import compound, validators from .classregistry import findClass # Sadly the name "constraints" conflicts with many of the function # arguments in this module, so we rename it: from . import constraints as constrs from . import converters from . import sqlbuilder from .styles import capword from .compat import PY2, string_type, unicode_type, buffer_type import datetime datetime_available = True try: from mx import DateTime except ImportError: try: # old version of mxDateTime, # or Zope's Version if we're running with Zope import DateTime except ImportError: mxdatetime_available = False else: mxdatetime_available = True else: mxdatetime_available = True DATETIME_IMPLEMENTATION = "datetime" MXDATETIME_IMPLEMENTATION = "mxDateTime" if mxdatetime_available: if hasattr(DateTime, "Time"): DateTimeType = type(DateTime.now()) TimeType = type(DateTime.Time()) else: # Zope DateTimeType = type(DateTime.DateTime()) TimeType = type(DateTime.DateTime.Time(DateTime.DateTime())) __all__ = ["datetime_available", "mxdatetime_available", "default_datetime_implementation", "DATETIME_IMPLEMENTATION"] if mxdatetime_available: __all__.append("MXDATETIME_IMPLEMENTATION") default_datetime_implementation = DATETIME_IMPLEMENTATION if not PY2: # alias for python 3 compatibility long = int # This is to satisfy flake8 under python 3 unicode = str NoDefault = sqlbuilder.NoDefault def use_microseconds(use=True): if use: SODateTimeCol.datetimeFormat = '%Y-%m-%d %H:%M:%S.%f' SOTimeCol.timeFormat = '%H:%M:%S.%f' dt_types = [(datetime.datetime, converters.DateTimeConverterMS), (datetime.time, converters.TimeConverterMS)] else: SODateTimeCol.datetimeFormat = '%Y-%m-%d %H:%M:%S' SOTimeCol.timeFormat = '%H:%M:%S' dt_types = [(datetime.datetime, converters.DateTimeConverter), (datetime.time, converters.TimeConverter)] for dt_type, converter in dt_types: converters.registerConverter(dt_type, converter) __all__.append("use_microseconds") creationOrder = count() ######################################## # Columns ######################################## # Col is essentially a column definition, it doesn't have much logic to it. 
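# For illustration of the Col / SOCol split described in the module docstring
# (``Person`` is a hypothetical class):
#
#     class Person(SQLObject):
#         firstName = StringCol()          # a Col subclass: the definition you write
#
#     Person.sqlmeta.columns['firstName']            # the SOCol built from it -- the
#                                                    # object SQLObject actually uses
#     Person.sqlmeta.columnDefinitions['firstName']  # the original Col definition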
class SOCol(object): def __init__(self, name, soClass, creationOrder, dbName=None, default=NoDefault, defaultSQL=None, foreignKey=None, alternateID=False, alternateMethodName=None, constraints=None, notNull=NoDefault, notNone=NoDefault, unique=NoDefault, sqlType=None, columnDef=None, validator=None, validator2=None, immutable=False, cascade=None, lazy=False, noCache=False, forceDBName=False, title=None, tags=[], origName=None, dbEncoding=None, extra_vars=None): super(SOCol, self).__init__() # This isn't strictly true, since we *could* use backquotes or # " or something (database-specific) around column names, but # why would anyone *want* to use a name like that? # @@: I suppose we could actually add backquotes to the # dbName if we needed to... if not forceDBName: assert sqlbuilder.sqlIdentifier(name), ( 'Name must be SQL-safe ' '(letters, numbers, underscores): %s (or use forceDBName=True)' % repr(name)) assert name != 'id', ( 'The column name "id" is reserved for SQLObject use ' '(and is implicitly created).') assert name, "You must provide a name for all columns" self.columnDef = columnDef self.creationOrder = creationOrder self.immutable = immutable # cascade can be one of: # None: no constraint is generated # True: a CASCADE constraint is generated # False: a RESTRICT constraint is generated # 'null': a SET NULL trigger is generated if isinstance(cascade, str): assert cascade == 'null', ( "The only string value allowed for cascade is 'null' " "(you gave: %r)" % cascade) self.cascade = cascade if not isinstance(constraints, (list, tuple)): constraints = [constraints] self.constraints = self.autoConstraints() + constraints self.notNone = False if notNull is not NoDefault: self.notNone = notNull assert notNone is NoDefault or (not notNone) == (not notNull), ( "The notNull and notNone arguments are aliases, " "and must not conflict. " "You gave notNull=%r, notNone=%r" % (notNull, notNone)) elif notNone is not NoDefault: self.notNone = notNone if self.notNone: self.constraints = [constrs.notNull] + self.constraints self.name = name self.soClass = soClass self._default = default self.defaultSQL = defaultSQL self.customSQLType = sqlType # deal with foreign keys self.foreignKey = foreignKey if self.foreignKey: if origName is not None: idname = soClass.sqlmeta.style.instanceAttrToIDAttr(origName) else: idname = soClass.sqlmeta.style.instanceAttrToIDAttr(name) if self.name != idname: self.foreignName = self.name self.name = idname else: self.foreignName = soClass.sqlmeta.style.\ instanceIDAttrToAttr(self.name) else: self.foreignName = None # if they don't give us a specific database name for # the column, we separate the mixedCase into mixed_case # and assume that. 
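        # (For example, with the default naming style an attribute called
        # "firstName" is assumed to live in a DB column called "first_name";
        # the conversion is sqlmeta.style.pythonAttrToDBColumn, used just below.)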
if dbName is None: self.dbName = soClass.sqlmeta.style.pythonAttrToDBColumn(self.name) else: self.dbName = dbName # alternateID means that this is a unique column that # can be used to identify rows self.alternateID = alternateID if unique is NoDefault: self.unique = alternateID else: self.unique = unique if self.unique and alternateMethodName is None: self.alternateMethodName = 'by' + capword(self.name) else: self.alternateMethodName = alternateMethodName _validators = self.createValidators() if validator: _validators.append(validator) if validator2: _validators.insert(0, validator2) _vlen = len(_validators) if _vlen: for _validator in _validators: _validator.soCol = weakref.proxy(self) if _vlen == 0: self.validator = None # Set sef.{from,to}_python elif _vlen == 1: self.validator = _validators[0] elif _vlen > 1: self.validator = compound.All.join( _validators[0], *_validators[1:]) self.noCache = noCache self.lazy = lazy # this is in case of ForeignKey, where we rename the column # and append an ID self.origName = origName or name self.title = title self.tags = tags self.dbEncoding = dbEncoding if extra_vars: for name, value in extra_vars.items(): setattr(self, name, value) def _set_validator(self, value): self._validator = value if self._validator: self.to_python = self._validator.to_python self.from_python = self._validator.from_python else: self.to_python = None self.from_python = None def _get_validator(self): return self._validator validator = property(_get_validator, _set_validator) def createValidators(self): """Create a list of validators for the column.""" return [] def autoConstraints(self): return [] def _get_default(self): # A default can be a callback or a plain value, # here we resolve the callback if self._default is NoDefault: return NoDefault elif hasattr(self._default, '__sqlrepr__'): return self._default elif callable(self._default): return self._default() else: return self._default default = property(_get_default, None, None) def _get_joinName(self): return self.soClass.sqlmeta.style.instanceIDAttrToAttr(self.name) joinName = property(_get_joinName, None, None) def __repr__(self): r = '<%s %s' % (self.__class__.__name__, self.name) if self.default is not NoDefault: r += ' default=%s' % repr(self.default) if self.foreignKey: r += ' connected to %s' % self.foreignKey if self.alternateID: r += ' alternate ID' if self.notNone: r += ' not null' return r + '>' def createSQL(self): return ' '.join([self._sqlType()] + self._extraSQL()) def _extraSQL(self): result = [] if self.notNone or self.alternateID: result.append('NOT NULL') if self.unique or self.alternateID: result.append('UNIQUE') if self.defaultSQL is not None: result.append("DEFAULT %s" % self.defaultSQL) return result def _sqlType(self): if self.customSQLType is None: raise ValueError("Col %s (%s) cannot be used for automatic " "schema creation (too abstract)" % (self.name, self.__class__)) else: return self.customSQLType def _mysqlType(self): return self._sqlType() def _postgresType(self): return self._sqlType() def _sqliteType(self): # SQLite is naturally typeless, so as a fallback it uses # no type. 
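        # (SQLite applies dynamic "type affinity", so a column declared with an
        # empty type string still accepts and stores values of any type.)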
try: return self._sqlType() except ValueError: return '' def _sybaseType(self): return self._sqlType() def _mssqlType(self): return self._sqlType() def _firebirdType(self): return self._sqlType() def _maxdbType(self): return self._sqlType() def mysqlCreateSQL(self, connection=None): self.connection = connection return ' '.join([self.dbName, self._mysqlType()] + self._extraSQL()) def postgresCreateSQL(self): return ' '.join([self.dbName, self._postgresType()] + self._extraSQL()) def sqliteCreateSQL(self): return ' '.join([self.dbName, self._sqliteType()] + self._extraSQL()) def sybaseCreateSQL(self): return ' '.join([self.dbName, self._sybaseType()] + self._extraSQL()) def mssqlCreateSQL(self, connection=None): self.connection = connection return ' '.join([self.dbName, self._mssqlType()] + self._extraSQL()) def firebirdCreateSQL(self): # Ian Sparks pointed out that fb is picky about the order # of the NOT NULL clause in a create statement. So, we handle # them differently for Enum columns. if not isinstance(self, SOEnumCol): return ' '.join( [self.dbName, self._firebirdType()] + self._extraSQL()) else: return ' '.join( [self.dbName] + [self._firebirdType()[0]] + self._extraSQL() + [self._firebirdType()[1]]) def maxdbCreateSQL(self): return ' '.join([self.dbName, self._maxdbType()] + self._extraSQL()) def __get__(self, obj, type=None): if obj is None: # class attribute, return the descriptor itself return self if obj.sqlmeta._obsolete: raise RuntimeError('The object <%s %s> is obsolete' % ( obj.__class__.__name__, obj.id)) if obj.sqlmeta.cacheColumns: columns = obj.sqlmeta._columnCache if columns is None: obj.sqlmeta.loadValues() try: return columns[name] # noqa except KeyError: return obj.sqlmeta.loadColumn(self) else: return obj.sqlmeta.loadColumn(self) def __set__(self, obj, value): if self.immutable: raise AttributeError("The column %s.%s is immutable" % (obj.__class__.__name__, self.name)) obj.sqlmeta.setColumn(self, value) def __delete__(self, obj): raise AttributeError("I can't be deleted from %r" % obj) def getDbEncoding(self, state, default='utf-8'): if self.dbEncoding: return self.dbEncoding dbEncoding = state.soObject.sqlmeta.dbEncoding if dbEncoding: return dbEncoding try: connection = state.connection or state.soObject._connection except AttributeError: dbEncoding = None else: dbEncoding = getattr(connection, "dbEncoding", None) if not dbEncoding: dbEncoding = default return dbEncoding class Col(object): baseClass = SOCol def __init__(self, name=None, **kw): super(Col, self).__init__() self.__dict__['_name'] = name self.__dict__['_kw'] = kw self.__dict__['creationOrder'] = next(creationOrder) self.__dict__['_extra_vars'] = {} def _set_name(self, value): assert self._name is None or self._name == value, ( "You cannot change a name after it has already been set " "(from %s to %s)" % (self.name, value)) self.__dict__['_name'] = value def _get_name(self): return self._name name = property(_get_name, _set_name) def withClass(self, soClass): return self.baseClass(soClass=soClass, name=self._name, creationOrder=self.creationOrder, columnDef=self, extra_vars=self._extra_vars, **self._kw) def __setattr__(self, var, value): if var == 'name': super(Col, self).__setattr__(var, value) return self._extra_vars[var] = value def __repr__(self): return '<%s %s %s>' % ( self.__class__.__name__, hex(abs(id(self)))[2:], self._name or '(unnamed)') class SOValidator(validators.Validator): def getDbEncoding(self, state, default='utf-8'): try: return self.dbEncoding except AttributeError: return 
self.soCol.getDbEncoding(state, default=default) class SOStringLikeCol(SOCol): """A common ancestor for SOStringCol and SOUnicodeCol""" def __init__(self, **kw): self.length = kw.pop('length', None) self.varchar = kw.pop('varchar', 'auto') self.char_binary = kw.pop('char_binary', None) # A hack for MySQL if not self.length: assert self.varchar == 'auto' or not self.varchar, \ "Without a length strings are treated as TEXT, not varchar" self.varchar = False elif self.varchar == 'auto': self.varchar = True super(SOStringLikeCol, self).__init__(**kw) def autoConstraints(self): constraints = [constrs.isString] if self.length is not None: constraints += [constrs.MaxLength(self.length)] return constraints def _sqlType(self): if self.customSQLType is not None: return self.customSQLType if not self.length: return 'TEXT' elif self.varchar: return 'VARCHAR(%i)' % self.length else: return 'CHAR(%i)' % self.length def _check_case_sensitive(self, db): if self.char_binary: raise ValueError("%s does not support " "binary character columns" % db) def _mysqlType(self): type = self._sqlType() if self.char_binary: type += " BINARY" return type def _postgresType(self): self._check_case_sensitive("PostgreSQL") return super(SOStringLikeCol, self)._postgresType() def _sqliteType(self): self._check_case_sensitive("SQLite") return super(SOStringLikeCol, self)._sqliteType() def _sybaseType(self): self._check_case_sensitive("SYBASE") type = self._sqlType() return type def _mssqlType(self): if self.customSQLType is not None: return self.customSQLType if not self.length: if self.connection and self.connection.can_use_max_types(): type = 'VARCHAR(MAX)' else: type = 'VARCHAR(4000)' elif self.varchar: type = 'VARCHAR(%i)' % self.length else: type = 'CHAR(%i)' % self.length return type def _firebirdType(self): self._check_case_sensitive("FireBird") if not self.length: return 'BLOB SUB_TYPE TEXT' else: return self._sqlType() def _maxdbType(self): self._check_case_sensitive("SAP DB/MaxDB") if not self.length: return 'LONG ASCII' else: return self._sqlType() class StringValidator(SOValidator): def to_python(self, value, state): if value is None: return None try: connection = state.connection or state.soObject._connection binaryType = connection._binaryType dbName = connection.dbName except AttributeError: binaryType = type(None) # Just a simple workaround dbEncoding = self.getDbEncoding(state, default='ascii') if isinstance(value, unicode_type): if PY2: return value.encode(dbEncoding) return value if self.dataType and isinstance(value, self.dataType): return value if isinstance(value, (str, bytes, buffer_type, binaryType, sqlbuilder.SQLExpression)): return value if hasattr(value, '__unicode__'): return unicode(value).encode(dbEncoding) if dbName == 'mysql': if isinstance(value, bytearray): if PY2: return bytes(value) else: return value.decode(dbEncoding, errors='surrogateescape') if not PY2 and isinstance(value, bytes): return value.decode('ascii', errors='surrogateescape') raise validators.Invalid( "expected a str in the StringCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) from_python = to_python class SOStringCol(SOStringLikeCol): def createValidators(self, dataType=None): return [StringValidator(name=self.name, dataType=dataType)] + \ super(SOStringCol, self).createValidators() class StringCol(Col): baseClass = SOStringCol class NQuoted(sqlbuilder.SQLExpression): def __init__(self, value): self.value = value def __hash__(self): return hash(self.value) def __sqlrepr__(self, db): assert db == 
'mssql' return "N" + sqlbuilder.sqlrepr(self.value, db) class UnicodeStringValidator(SOValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (unicode_type, sqlbuilder.SQLExpression)): return value if isinstance(value, str): return value.decode(self.getDbEncoding(state)) if isinstance(value, array): # MySQL return value.tostring().decode(self.getDbEncoding(state)) if hasattr(value, '__unicode__'): return unicode(value) raise validators.Invalid( "expected a str or a unicode in the UnicodeCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None if isinstance(value, (str, sqlbuilder.SQLExpression)): return value if isinstance(value, unicode_type): try: connection = state.connection or state.soObject._connection except AttributeError: pass else: if connection.dbName == 'mssql': if PY2: value = value.encode(self.getDbEncoding(state)) return NQuoted(value) return value.encode(self.getDbEncoding(state)) if hasattr(value, '__unicode__'): return unicode(value).encode(self.getDbEncoding(state)) raise validators.Invalid( "expected a str or a unicode in the UnicodeCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) class SOUnicodeCol(SOStringLikeCol): def _mssqlType(self): if self.customSQLType is not None: return self.customSQLType return 'N' + super(SOUnicodeCol, self)._mssqlType() def createValidators(self): return [UnicodeStringValidator(name=self.name)] + \ super(SOUnicodeCol, self).createValidators() class UnicodeCol(Col): baseClass = SOUnicodeCol class IntValidator(SOValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (int, long, sqlbuilder.SQLExpression)): return value for converter, attr_name in (int, '__int__'), (long, '__long__'): if hasattr(value, attr_name): try: return converter(value) except: break raise validators.Invalid( "expected an int in the IntCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) from_python = to_python class SOIntCol(SOCol): # 3-03 @@: support precision, maybe max and min directly def __init__(self, **kw): self.length = kw.pop('length', None) self.unsigned = bool(kw.pop('unsigned', None)) self.zerofill = bool(kw.pop('zerofill', None)) SOCol.__init__(self, **kw) def autoConstraints(self): return [constrs.isInt] def createValidators(self): return [IntValidator(name=self.name)] + \ super(SOIntCol, self).createValidators() def addSQLAttrs(self, str): _ret = str if str is None or len(str) < 1: return None if self.length and self.length >= 1: _ret = "%s(%d)" % (_ret, self.length) if self.unsigned: _ret = _ret + " UNSIGNED" if self.zerofill: _ret = _ret + " ZEROFILL" return _ret def _sqlType(self): return self.addSQLAttrs("INT") class IntCol(Col): baseClass = SOIntCol class SOTinyIntCol(SOIntCol): def _sqlType(self): return self.addSQLAttrs("TINYINT") class TinyIntCol(Col): baseClass = SOTinyIntCol class SOSmallIntCol(SOIntCol): def _sqlType(self): return self.addSQLAttrs("SMALLINT") class SmallIntCol(Col): baseClass = SOSmallIntCol class SOMediumIntCol(SOIntCol): def _sqlType(self): return self.addSQLAttrs("MEDIUMINT") class MediumIntCol(Col): baseClass = SOMediumIntCol class SOBigIntCol(SOIntCol): def _sqlType(self): return self.addSQLAttrs("BIGINT") class BigIntCol(Col): baseClass = SOBigIntCol class BoolValidator(SOValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (bool, 
sqlbuilder.SQLExpression)): return value if PY2 and hasattr(value, '__nonzero__') \ or not PY2 and hasattr(value, '__bool__'): return bool(value) try: connection = state.connection or state.soObject._connection except AttributeError: pass else: if connection.dbName == 'postgres' and \ connection.driver in ('odbc', 'pyodbc', 'pypyodbc') and \ isinstance(value, string_type): return bool(int(value)) raise validators.Invalid( "expected a bool or an int in the BoolCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) from_python = to_python class SOBoolCol(SOCol): def autoConstraints(self): return [constrs.isBool] def createValidators(self): return [BoolValidator(name=self.name)] + \ super(SOBoolCol, self).createValidators() def _postgresType(self): return 'BOOL' def _mysqlType(self): return "BOOL" def _sybaseType(self): return "BIT" def _mssqlType(self): return "BIT" def _firebirdType(self): return 'INT' def _maxdbType(self): return "BOOLEAN" def _sqliteType(self): return "BOOLEAN" class BoolCol(Col): baseClass = SOBoolCol class FloatValidator(SOValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (float, int, long, sqlbuilder.SQLExpression)): return value for converter, attr_name in ( (float, '__float__'), (int, '__int__'), (long, '__long__')): if hasattr(value, attr_name): try: return converter(value) except: break raise validators.Invalid( "expected a float in the FloatCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) from_python = to_python class SOFloatCol(SOCol): # 3-03 @@: support precision (e.g., DECIMAL) def autoConstraints(self): return [constrs.isFloat] def createValidators(self): return [FloatValidator(name=self.name)] + \ super(SOFloatCol, self).createValidators() def _sqlType(self): return 'FLOAT' def _mysqlType(self): return "DOUBLE PRECISION" class FloatCol(Col): baseClass = SOFloatCol class SOKeyCol(SOCol): key_type = {int: "INT", str: "TEXT"} # 3-03 @@: this should have a simplified constructor # Should provide foreign key information for other DBs. 
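    # key_type maps the owning class's sqlmeta.idType (int or str) to the SQL
    # type used for key columns; the per-database *Type() methods below supply
    # backend-specific mappings where this generic one does not fit.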
def __init__(self, **kw): self.refColumn = kw.pop('refColumn', None) super(SOKeyCol, self).__init__(**kw) def _idType(self): return self.soClass.sqlmeta.idType def _sqlType(self): return self.key_type[self._idType()] def _sybaseType(self): key_type = {int: "NUMERIC(18,0)", str: "TEXT"} return key_type[self._idType()] def _mssqlType(self): key_type = {int: "INT", str: "TEXT"} return key_type[self._idType()] def _firebirdType(self): key_type = {int: "INT", str: "VARCHAR(255)"} return key_type[self._idType()] class KeyCol(Col): baseClass = SOKeyCol class ForeignKeyValidator(SOValidator): def __init__(self, *args, **kw): super(ForeignKeyValidator, self).__init__(*args, **kw) self.fkIDType = None def from_python(self, value, state): if value is None: return None # Avoid importing the main module # to get the SQLObject class for isinstance if hasattr(value, 'sqlmeta'): return value if self.fkIDType is None: otherTable = findClass(self.soCol.foreignKey, self.soCol.soClass.sqlmeta.registry) self.fkIDType = otherTable.sqlmeta.idType try: value = self.fkIDType(value) return value except (ValueError, TypeError): pass raise validators.Invalid("expected a %r for the ForeignKey '%s', " "got %s %r instead" % (self.fkIDType, self.name, type(value), value), value, state) class SOForeignKey(SOKeyCol): def __init__(self, **kw): foreignKey = kw['foreignKey'] style = kw['soClass'].sqlmeta.style if kw.get('name'): kw['origName'] = kw['name'] kw['name'] = style.instanceAttrToIDAttr(kw['name']) else: kw['name'] = style.instanceAttrToIDAttr( style.pythonClassToAttr(foreignKey)) super(SOForeignKey, self).__init__(**kw) def createValidators(self): return [ForeignKeyValidator(name=self.name)] + \ super(SOForeignKey, self).createValidators() def _idType(self): other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) return other.sqlmeta.idType def sqliteCreateSQL(self): sql = SOKeyCol.sqliteCreateSQL(self) other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) tName = other.sqlmeta.table idName = self.refColumn or other.sqlmeta.idName if self.cascade is not None: if self.cascade == 'null': action = 'ON DELETE SET NULL' elif self.cascade: action = 'ON DELETE CASCADE' else: action = 'ON DELETE RESTRICT' else: action = '' constraint = ('CONSTRAINT %(colName)s_exists ' # 'FOREIGN KEY(%(colName)s) ' 'REFERENCES %(tName)s(%(idName)s) ' '%(action)s' % {'tName': tName, 'colName': self.dbName, 'idName': idName, 'action': action}) sql = ' '.join([sql, constraint]) return sql def postgresCreateSQL(self): sql = SOKeyCol.postgresCreateSQL(self) return sql def postgresCreateReferenceConstraint(self): sTName = self.soClass.sqlmeta.table other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) tName = other.sqlmeta.table idName = self.refColumn or other.sqlmeta.idName if self.cascade is not None: if self.cascade == 'null': action = 'ON DELETE SET NULL' elif self.cascade: action = 'ON DELETE CASCADE' else: action = 'ON DELETE RESTRICT' else: action = '' constraint = ('ALTER TABLE %(sTName)s ' 'ADD CONSTRAINT %(colName)s_exists ' 'FOREIGN KEY (%(colName)s) ' 'REFERENCES %(tName)s (%(idName)s) ' '%(action)s' % {'tName': tName, 'colName': self.dbName, 'idName': idName, 'action': action, 'sTName': sTName}) return constraint def mysqlCreateReferenceConstraint(self): sTName = self.soClass.sqlmeta.table sTLocalName = sTName.split('.')[-1] other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) tName = other.sqlmeta.table idName = self.refColumn or other.sqlmeta.idName if self.cascade is not None: if 
self.cascade == 'null': action = 'ON DELETE SET NULL' elif self.cascade: action = 'ON DELETE CASCADE' else: action = 'ON DELETE RESTRICT' else: action = '' constraint = ('ALTER TABLE %(sTName)s ' 'ADD CONSTRAINT %(sTLocalName)s_%(colName)s_exists ' 'FOREIGN KEY (%(colName)s) ' 'REFERENCES %(tName)s (%(idName)s) ' '%(action)s' % {'tName': tName, 'colName': self.dbName, 'idName': idName, 'action': action, 'sTName': sTName, 'sTLocalName': sTLocalName}) return constraint def mysqlCreateSQL(self, connection=None): return SOKeyCol.mysqlCreateSQL(self, connection) def sybaseCreateSQL(self): sql = SOKeyCol.sybaseCreateSQL(self) other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) tName = other.sqlmeta.table idName = self.refColumn or other.sqlmeta.idName reference = ('REFERENCES %(tName)s(%(idName)s) ' % {'tName': tName, 'idName': idName}) sql = ' '.join([sql, reference]) return sql def sybaseCreateReferenceConstraint(self): # @@: Code from above should be moved here return None def mssqlCreateSQL(self, connection=None): sql = SOKeyCol.mssqlCreateSQL(self, connection) other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) tName = other.sqlmeta.table idName = self.refColumn or other.sqlmeta.idName reference = ('REFERENCES %(tName)s(%(idName)s) ' % {'tName': tName, 'idName': idName}) sql = ' '.join([sql, reference]) return sql def mssqlCreateReferenceConstraint(self): # @@: Code from above should be moved here return None def maxdbCreateSQL(self): other = findClass(self.foreignKey, self.soClass.sqlmeta.registry) fidName = self.dbName # I assume that foreign key name is identical # to the id of the reference table sql = ' '.join([fidName, self._maxdbType()]) tName = other.sqlmeta.table idName = self.refColumn or other.sqlmeta.idName sql = sql + ',' + '\n' sql = sql + 'FOREIGN KEY (%s) REFERENCES %s(%s)' % (fidName, tName, idName) return sql def maxdbCreateReferenceConstraint(self): # @@: Code from above should be moved here return None class ForeignKey(KeyCol): baseClass = SOForeignKey def __init__(self, foreignKey=None, **kw): super(ForeignKey, self).__init__(foreignKey=foreignKey, **kw) class EnumValidator(SOValidator): def to_python(self, value, state): if value in self.enumValues: # Only encode on python 2 - on python 3, the database driver # will handle this if isinstance(value, unicode_type) and PY2: dbEncoding = self.getDbEncoding(state) value = value.encode(dbEncoding) return value elif not self.notNone and value is None: return None raise validators.Invalid( "expected a member of %r in the EnumCol '%s', got %r instead" % ( self.enumValues, self.name, value), value, state) from_python = to_python class SOEnumCol(SOCol): def __init__(self, **kw): self.enumValues = kw.pop('enumValues', None) assert self.enumValues is not None, \ 'You must provide an enumValues keyword argument' super(SOEnumCol, self).__init__(**kw) def autoConstraints(self): return [constrs.isString, constrs.InList(self.enumValues)] def createValidators(self): return [EnumValidator(name=self.name, enumValues=self.enumValues, notNone=self.notNone)] + \ super(SOEnumCol, self).createValidators() def _mysqlType(self): # We need to map None in the enum expression to an appropriate # condition on NULL if None in self.enumValues: return "ENUM(%s)" % ', '.join( [sqlbuilder.sqlrepr(v, 'mysql') for v in self.enumValues if v is not None]) else: return "ENUM(%s) NOT NULL" % ', '.join( [sqlbuilder.sqlrepr(v, 'mysql') for v in self.enumValues]) def _postgresType(self): length = max(map(self._getlength, 
self.enumValues)) enumValues = ', '.join( [sqlbuilder.sqlrepr(v, 'postgres') for v in self.enumValues]) checkConstraint = "CHECK (%s in (%s))" % (self.dbName, enumValues) return "VARCHAR(%i) %s" % (length, checkConstraint) _sqliteType = _postgresType def _sybaseType(self): return self._postgresType() def _mssqlType(self): return self._postgresType() def _firebirdType(self): length = max(map(self._getlength, self.enumValues)) enumValues = ', '.join( [sqlbuilder.sqlrepr(v, 'firebird') for v in self.enumValues]) checkConstraint = "CHECK (%s in (%s))" % (self.dbName, enumValues) # NB. Return a tuple, not a string here return "VARCHAR(%i)" % (length), checkConstraint def _maxdbType(self): raise TypeError("Enum type is not supported on MAX DB") def _getlength(self, obj): """ None counts as 0; everything else uses len() """ if obj is None: return 0 else: return len(obj) class EnumCol(Col): baseClass = SOEnumCol class SetValidator(SOValidator): """ Translates Python tuples into SQL comma-delimited SET strings. """ def to_python(self, value, state): if isinstance(value, str): return tuple(value.split(",")) raise validators.Invalid( "expected a string in the SetCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if isinstance(value, string_type): value = (value,) try: return ",".join(value) except: raise validators.Invalid( "expected a string or a sequence of strings " "in the SetCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) class SOSetCol(SOCol): def __init__(self, **kw): self.setValues = kw.pop('setValues', None) assert self.setValues is not None, \ 'You must provide a setValues keyword argument' super(SOSetCol, self).__init__(**kw) def autoConstraints(self): return [constrs.isString, constrs.InList(self.setValues)] def createValidators(self): return [SetValidator(name=self.name, setValues=self.setValues)] + \ super(SOSetCol, self).createValidators() def _mysqlType(self): return "SET(%s)" % ', '.join( [sqlbuilder.sqlrepr(v, 'mysql') for v in self.setValues]) class SetCol(Col): baseClass = SOSetCol class DateTimeValidator(validators.DateValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (datetime.datetime, datetime.date, datetime.time, sqlbuilder.SQLExpression)): return value if mxdatetime_available: if isinstance(value, DateTimeType): # convert mxDateTime instance to datetime if (self.format.find("%H") >= 0) or \ (self.format.find("%T")) >= 0: return datetime.datetime(value.year, value.month, value.day, value.hour, value.minute, int(value.second)) else: return datetime.date(value.year, value.month, value.day) elif isinstance(value, TimeType): # convert mxTime instance to time if self.format.find("%d") >= 0: return datetime.timedelta(seconds=value.seconds) else: return datetime.time(value.hour, value.minute, int(value.second)) try: if self.format.find(".%f") >= 0: if '.' 
in value: _value = value.split('.') microseconds = _value[-1] _l = len(microseconds) if _l < 6: _value[-1] = microseconds + '0' * (6 - _l) elif _l > 6: _value[-1] = microseconds[:6] if _l != 6: value = '.'.join(_value) else: value += '.0' return datetime.datetime.strptime(value, self.format) except: raise validators.Invalid( "expected a date/time string of the '%s' format " "in the DateTimeCol '%s', got %s %r instead" % ( self.format, self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None if isinstance(value, (datetime.datetime, datetime.date, datetime.time, sqlbuilder.SQLExpression)): return value if hasattr(value, "strftime"): return value.strftime(self.format) raise validators.Invalid( "expected a datetime in the DateTimeCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) if mxdatetime_available: class MXDateTimeValidator(validators.DateValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (DateTimeType, TimeType, sqlbuilder.SQLExpression)): return value if isinstance(value, datetime.datetime): return DateTime.DateTime(value.year, value.month, value.day, value.hour, value.minute, value.second) elif isinstance(value, datetime.date): return DateTime.Date(value.year, value.month, value.day) elif isinstance(value, datetime.time): return DateTime.Time(value.hour, value.minute, value.second) elif isinstance(value, datetime.timedelta): if value.days: raise validators.Invalid( "the value for the TimeCol '%s' must has days=0, " "it has days=%d" % (self.name, value.days), value, state) return DateTime.Time(seconds=value.seconds) try: if self.format.find(".%f") >= 0: if '.' in value: _value = value.split('.') microseconds = _value[-1] _l = len(microseconds) if _l < 6: _value[-1] = microseconds + '0' * (6 - _l) elif _l > 6: _value[-1] = microseconds[:6] if _l != 6: value = '.'.join(_value) else: value += '.0' value = datetime.datetime.strptime(value, self.format) return DateTime.DateTime(value.year, value.month, value.day, value.hour, value.minute, value.second) except: raise validators.Invalid( "expected a date/time string of the '%s' format " "in the DateTimeCol '%s', got %s %r instead" % ( self.format, self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None if isinstance(value, (DateTimeType, TimeType, sqlbuilder.SQLExpression)): return value if hasattr(value, "strftime"): return value.strftime(self.format) raise validators.Invalid( "expected a mxDateTime in the DateTimeCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) class SODateTimeCol(SOCol): datetimeFormat = '%Y-%m-%d %H:%M:%S.%f' def __init__(self, **kw): datetimeFormat = kw.pop('datetimeFormat', None) if datetimeFormat: self.datetimeFormat = datetimeFormat super(SODateTimeCol, self).__init__(**kw) def createValidators(self): _validators = super(SODateTimeCol, self).createValidators() if default_datetime_implementation == DATETIME_IMPLEMENTATION: validatorClass = DateTimeValidator elif default_datetime_implementation == MXDATETIME_IMPLEMENTATION: validatorClass = MXDateTimeValidator if default_datetime_implementation: _validators.insert(0, validatorClass(name=self.name, format=self.datetimeFormat)) return _validators def _mysqlType(self): if self.connection and self.connection.can_use_microseconds(): return 'DATETIME(6)' else: return 'DATETIME' def _postgresType(self): return 'TIMESTAMP' def _sybaseType(self): return 
'DATETIME' def _mssqlType(self): if self.connection and self.connection.can_use_microseconds(): return 'DATETIME2(6)' else: return 'DATETIME' def _sqliteType(self): return 'TIMESTAMP' def _firebirdType(self): return 'TIMESTAMP' def _maxdbType(self): return 'TIMESTAMP' class DateTimeCol(Col): baseClass = SODateTimeCol @staticmethod def now(): if default_datetime_implementation == DATETIME_IMPLEMENTATION: return datetime.datetime.now() elif default_datetime_implementation == MXDATETIME_IMPLEMENTATION: return DateTime.now() else: assert 0, ("No datetime implementation available " "(DATETIME_IMPLEMENTATION=%r)" % DATETIME_IMPLEMENTATION) class DateValidator(DateTimeValidator): def to_python(self, value, state): if isinstance(value, datetime.datetime): value = value.date() if isinstance(value, (datetime.date, sqlbuilder.SQLExpression)): return value value = super(DateValidator, self).to_python(value, state) if isinstance(value, datetime.datetime): value = value.date() return value from_python = to_python class SODateCol(SOCol): dateFormat = '%Y-%m-%d' def __init__(self, **kw): dateFormat = kw.pop('dateFormat', None) if dateFormat: self.dateFormat = dateFormat super(SODateCol, self).__init__(**kw) def createValidators(self): """Create a validator for the column. Can be overriden in descendants. """ _validators = super(SODateCol, self).createValidators() if default_datetime_implementation == DATETIME_IMPLEMENTATION: validatorClass = DateValidator elif default_datetime_implementation == MXDATETIME_IMPLEMENTATION: validatorClass = MXDateTimeValidator if default_datetime_implementation: _validators.insert(0, validatorClass(name=self.name, format=self.dateFormat)) return _validators def _mysqlType(self): return 'DATE' def _postgresType(self): return 'DATE' def _sybaseType(self): return self._postgresType() def _mssqlType(self): """ SQL Server doesn't have a DATE data type, to emulate we use a vc(10) """ return 'VARCHAR(10)' def _firebirdType(self): return 'DATE' def _maxdbType(self): return 'DATE' def _sqliteType(self): return 'DATE' class DateCol(Col): baseClass = SODateCol class TimeValidator(DateTimeValidator): def to_python(self, value, state): if isinstance(value, (datetime.time, sqlbuilder.SQLExpression)): return value if isinstance(value, datetime.timedelta): if value.days: raise validators.Invalid( "the value for the TimeCol '%s' must has days=0, " "it has days=%d" % (self.name, value.days), value, state) return datetime.time(*time.gmtime(value.seconds)[3:6]) value = super(TimeValidator, self).to_python(value, state) if isinstance(value, datetime.datetime): value = value.time() return value from_python = to_python class SOTimeCol(SOCol): timeFormat = '%H:%M:%S.%f' def __init__(self, **kw): timeFormat = kw.pop('timeFormat', None) if timeFormat: self.timeFormat = timeFormat super(SOTimeCol, self).__init__(**kw) def createValidators(self): _validators = super(SOTimeCol, self).createValidators() if default_datetime_implementation == DATETIME_IMPLEMENTATION: validatorClass = TimeValidator elif default_datetime_implementation == MXDATETIME_IMPLEMENTATION: validatorClass = MXDateTimeValidator if default_datetime_implementation: _validators.insert(0, validatorClass(name=self.name, format=self.timeFormat)) return _validators def _mysqlType(self): if self.connection and self.connection.can_use_microseconds(): return 'TIME(6)' else: return 'TIME' def _postgresType(self): return 'TIME' def _sybaseType(self): return 'TIME' def _mssqlType(self): if self.connection and 
self.connection.can_use_microseconds(): return 'TIME(6)' else: return 'TIME' def _sqliteType(self): return 'TIME' def _firebirdType(self): return 'TIME' def _maxdbType(self): return 'TIME' class TimeCol(Col): baseClass = SOTimeCol class SOTimestampCol(SODateTimeCol): """ Necessary to support MySQL's use of TIMESTAMP versus DATETIME types """ def __init__(self, **kw): if 'default' not in kw: kw['default'] = None SOCol.__init__(self, **kw) def _mysqlType(self): if self.connection and self.connection.can_use_microseconds(): return 'TIMESTAMP(6)' else: return 'TIMESTAMP' class TimestampCol(Col): baseClass = SOTimestampCol class TimedeltaValidator(SOValidator): def to_python(self, value, state): return value from_python = to_python class SOTimedeltaCol(SOCol): def _postgresType(self): return 'INTERVAL' def createValidators(self): return [TimedeltaValidator(name=self.name)] + \ super(SOTimedeltaCol, self).createValidators() class TimedeltaCol(Col): baseClass = SOTimedeltaCol class DecimalValidator(SOValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, (int, long, Decimal, sqlbuilder.SQLExpression)): return value if isinstance(value, float): value = str(value) try: connection = state.connection or state.soObject._connection except AttributeError: pass else: if hasattr(connection, "decimalSeparator"): value = value.replace(connection.decimalSeparator, ".") try: return Decimal(value) except: raise validators.Invalid( "expected a Decimal in the DecimalCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None if isinstance(value, float): value = str(value) if isinstance(value, string_type): try: connection = state.connection or state.soObject._connection except AttributeError: pass else: if hasattr(connection, "decimalSeparator"): value = value.replace(connection.decimalSeparator, ".") try: return Decimal(value) except: raise validators.Invalid( "can not parse Decimal value '%s' " "in the DecimalCol from '%s'" % ( value, getattr(state, 'soObject', '(unknown)')), value, state) if isinstance(value, (int, long, Decimal, sqlbuilder.SQLExpression)): return value raise validators.Invalid( "expected a Decimal in the DecimalCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) class SODecimalCol(SOCol): def __init__(self, **kw): self.size = kw.pop('size', NoDefault) assert self.size is not NoDefault, \ "You must give a size argument" self.precision = kw.pop('precision', NoDefault) assert self.precision is not NoDefault, \ "You must give a precision argument" super(SODecimalCol, self).__init__(**kw) def _sqlType(self): return 'DECIMAL(%i, %i)' % (self.size, self.precision) def createValidators(self): return [DecimalValidator(name=self.name)] + \ super(SODecimalCol, self).createValidators() class DecimalCol(Col): baseClass = SODecimalCol class SOCurrencyCol(SODecimalCol): def __init__(self, **kw): pushKey(kw, 'size', 10) pushKey(kw, 'precision', 2) super(SOCurrencyCol, self).__init__(**kw) class CurrencyCol(DecimalCol): baseClass = SOCurrencyCol class DecimalStringValidator(DecimalValidator): def to_python(self, value, state): value = super(DecimalStringValidator, self).to_python(value, state) if self.precision and isinstance(value, Decimal): assert value < self.max, \ "Value must be less than %s" % int(self.max) value = value.quantize(self.precision) return value def from_python(self, value, state): value = super(DecimalStringValidator, 
self).from_python(value, state) if isinstance(value, Decimal): if self.precision: assert value < self.max, \ "Value must be less than %s" % int(self.max) value = value.quantize(self.precision) value = value.to_eng_string() elif isinstance(value, (int, long)): value = str(value) return value class SODecimalStringCol(SOStringCol): def __init__(self, **kw): self.size = kw.pop('size', NoDefault) assert (self.size is not NoDefault) and (self.size >= 0), \ "You must give a size argument as a positive integer" self.precision = kw.pop('precision', NoDefault) assert (self.precision is not NoDefault) and (self.precision >= 0), \ "You must give a precision argument as a positive integer" kw['length'] = int(self.size) + int(self.precision) self.quantize = kw.pop('quantize', False) assert isinstance(self.quantize, bool), \ "quantize argument must be Boolean True/False" super(SODecimalStringCol, self).__init__(**kw) def createValidators(self): if self.quantize: v = DecimalStringValidator( name=self.name, precision=Decimal(10) ** (-1 * int(self.precision)), max=Decimal(10) ** (int(self.size) - int(self.precision))) else: v = DecimalStringValidator(name=self.name, precision=0) return [v] + \ super(SODecimalStringCol, self).createValidators(dataType=Decimal) class DecimalStringCol(StringCol): baseClass = SODecimalStringCol class BinaryValidator(SOValidator): """ Validator for binary types. We're assuming that the per-database modules provide some form of wrapper type for binary conversion. """ _cachedValue = None def to_python(self, value, state): if value is None: return None try: connection = state.connection or state.soObject._connection except AttributeError: dbName = None binaryType = type(None) # Just a simple workaround else: dbName = connection.dbName binaryType = connection._binaryType if isinstance(value, str): if not PY2 and dbName == "mysql": value = value.encode('ascii', errors='surrogateescape') if dbName == "sqlite": if not PY2: value = bytes(value, 'ascii') value = connection.module.decode(value) return value if isinstance(value, bytes): return value if isinstance(value, (buffer_type, binaryType)): cachedValue = self._cachedValue if cachedValue and cachedValue[1] == value: return cachedValue[0] if isinstance(value, array): # MySQL return value.tostring() if not PY2 and isinstance(value, memoryview): return value.tobytes() return str(value) # buffer => string raise validators.Invalid( "expected a string in the BLOBCol '%s', got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None connection = state.connection or state.soObject._connection binary = connection.createBinary(value) if not PY2 and isinstance(binary, memoryview): binary = str(binary.tobytes(), 'ascii') self._cachedValue = (value, binary) return binary class SOBLOBCol(SOStringCol): def __init__(self, **kw): # Change the default from 'auto' to False - # this is a (mostly) binary column if 'varchar' not in kw: kw['varchar'] = False super(SOBLOBCol, self).__init__(**kw) def createValidators(self): return [BinaryValidator(name=self.name)] + \ super(SOBLOBCol, self).createValidators() def _mysqlType(self): length = self.length varchar = self.varchar if length: if length >= 2 ** 24: return varchar and "LONGTEXT" or "LONGBLOB" if length >= 2 ** 16: return varchar and "MEDIUMTEXT" or "MEDIUMBLOB" if length >= 2 ** 8: return varchar and "TEXT" or "BLOB" return varchar and "TINYTEXT" or "TINYBLOB" def _postgresType(self): return 'BYTEA' def _mssqlType(self): if 
self.connection and self.connection.can_use_max_types(): return 'VARBINARY(MAX)' else: return "IMAGE" class BLOBCol(StringCol): baseClass = SOBLOBCol class PickleValidator(BinaryValidator): """ Validator for pickle types. A pickle type is simply a binary type with hidden pickling, so that we can simply store any kind of stuff in a particular column. The support for this relies directly on the support for binary for your database. """ def to_python(self, value, state): if value is None: return None if isinstance(value, unicode_type): dbEncoding = self.getDbEncoding(state, default='ascii') if PY2: value = value.encode(dbEncoding) else: value = value.encode(dbEncoding, errors='surrogateescape') if isinstance(value, bytes): return pickle.loads(value) raise validators.Invalid( "expected a pickle string in the PickleCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None return pickle.dumps(value, self.pickleProtocol) class SOPickleCol(SOBLOBCol): def __init__(self, **kw): self.pickleProtocol = kw.pop('pickleProtocol', pickle.HIGHEST_PROTOCOL) super(SOPickleCol, self).__init__(**kw) def createValidators(self): return [PickleValidator(name=self.name, pickleProtocol=self.pickleProtocol)] + \ super(SOPickleCol, self).createValidators() def _mysqlType(self): length = self.length if length: if length >= 2 ** 24: return "LONGBLOB" if length >= 2 ** 16: return "MEDIUMBLOB" return "BLOB" class PickleCol(BLOBCol): baseClass = SOPickleCol class UuidValidator(SOValidator): def to_python(self, value, state): if value is None: return None if PY2 and isinstance(value, unicode): value = value.encode('ascii') if isinstance(value, str): return UUID(value) if isinstance(value, UUID): return value raise validators.Invalid( "expected string in the UuidCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None if isinstance(value, UUID): return str(value) raise validators.Invalid( "expected uuid in the UuidCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) class SOUuidCol(SOCol): def createValidators(self): return [UuidValidator(name=self.name)] + \ super(SOUuidCol, self).createValidators() def _sqlType(self): return 'VARCHAR(36)' def _postgresType(self): return 'UUID' class UuidCol(Col): baseClass = SOUuidCol class JsonbValidator(SOValidator): def to_python(self, value, state): if isinstance(value, string_type): return json.loads(value) return value def from_python(self, value, state): if value is None: return json.dumps(None) if isinstance(value, (dict, list, unicode, int, long, float, bool)): return json.dumps(value) raise validators.Invalid( "expect one of the following types " "(dict, list, unicode, int, long, float, bool) for '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) class SOJsonbCol(SOCol): def createValidators(self): return [JsonbValidator(name=self.name)] + \ super(SOJsonbCol, self).createValidators() def _postgresType(self): return 'JSONB' class JsonbCol(Col): baseClass = SOJsonbCol class JSONValidator(StringValidator): def to_python(self, value, state): if value is None: return None if isinstance(value, string_type): return json.loads(value) raise validators.Invalid( "expected a JSON str in the JSONCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) def from_python(self, value, state): if value is None: return None if 
isinstance(value, (bool, int, float, long, dict, list, string_type)): return json.dumps(value) raise validators.Invalid( "expected an object suitable for JSON in the JSONCol '%s', " "got %s %r instead" % ( self.name, type(value), value), value, state) class SOJSONCol(SOStringCol): def createValidators(self): return [JSONValidator(name=self.name)] + \ super(SOJSONCol, self).createValidators() class JSONCol(StringCol): baseClass = SOJSONCol def pushKey(kw, name, value): if name not in kw: kw[name] = value all = [] # Use copy() to avoid 'dictionary changed' issues on python 3 for key, value in globals().copy().items(): if isinstance(value, type) and (issubclass(value, (Col, SOCol))): all.append(key) __all__.extend(all) del all SQLObject-3.4.0/sqlobject/mssql/0000755000175000017500000000000013141371614016002 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/mssql/__init__.py0000644000175000017500000000027412464733165020130 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . import mssqlconnection return mssqlconnection.MSSQLConnection registerConnection(['mssql'], builder) SQLObject-3.4.0/sqlobject/mssql/mssqlconnection.py0000644000175000017500000003351613102647330021601 0ustar phdphd00000000000000import re from sqlobject import col from sqlobject.dbconnection import DBAPI from sqlobject.compat import PY2 class MSSQLConnection(DBAPI): supportTransactions = True dbName = 'mssql' schemes = [dbName] limit_re = re.compile('^\s*(select )(.*)', re.IGNORECASE) odbc_keywords = ('Server', 'Port', 'User Id', 'Password', 'Database') def __init__(self, db, user, password='', host='localhost', port=None, autoCommit=0, **kw): drivers = kw.pop('driver', None) or 'adodb,pymssql' for driver in drivers.split(','): driver = driver.strip() if not driver: continue try: if driver in ('adodb', 'adodbapi'): import adodbapi as sqlmodule elif driver == 'pymssql': import pymssql as sqlmodule elif driver == 'pyodbc': import pyodbc self.module = pyodbc elif driver == 'pypyodbc': import pypyodbc self.module = pypyodbc elif driver == 'odbc': try: import pyodbc except ImportError: import pypyodbc as pyodbc self.module = pyodbc else: raise ValueError( 'Unknown MSSQL driver "%s", ' 'expected adodb, pymssql, ' 'odbc, pyodbc or pypyodbc' % driver) except ImportError: pass else: break else: raise ImportError( 'Cannot find an MSSQL driver, tried %s' % drivers) if driver in ('odbc', 'pyodbc', 'pypyodbc'): self.make_odbc_conn_str(kw.pop('odbcdrv', 'SQL Server'), db, host, port, user, password ) elif driver in ('adodb', 'adodbapi'): self.module = sqlmodule self.dbconnection = sqlmodule.connect # ADO uses unicode only (AFAIK) self.usingUnicodeStrings = True # Need to use SQLNCLI provider for SQL Server Express Edition if kw.get("ncli"): conn_str = "Provider=SQLNCLI;" else: conn_str = "Provider=SQLOLEDB;" conn_str += "Data Source=%s;Initial Catalog=%s;" # MSDE does not allow SQL server login if kw.get("sspi"): conn_str += \ "Integrated Security=SSPI;Persist Security Info=False" self.make_conn_str = lambda keys: conn_str % ( keys.host, keys.db) else: conn_str += "User Id=%s;Password=%s" self.make_conn_str = lambda keys: conn_str % ( keys.host, keys.db, keys.user, keys.password) kw.pop("ncli", None) kw.pop("sspi", None) elif driver == 'pymssql': self.module = sqlmodule self.dbconnection = sqlmodule.connect sqlmodule.Binary = lambda st: str(st) # don't know whether pymssql uses unicode self.usingUnicodeStrings = False timeout = kw.pop('timeout', None) if timeout: timeout = int(timeout) 
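                # The timeout (in seconds) is forwarded to pymssql.connect()
                # through the keyword dict built by _make_conn_str() below.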
self.timeout = timeout def _make_conn_str(keys): keys_dict = {} for attr, value in ( ('database', keys.db), ('user', keys.user), ('password', keys.password), ('host', keys.host), ('port', keys.port), ('timeout', keys.timeout), ): if value: keys_dict[attr] = value return keys_dict self.make_conn_str = _make_conn_str self.driver = driver self.autoCommit = int(autoCommit) self.user = user self.password = password self.host = host self.port = port self.db = db self._server_version = None self._can_use_max_types = None self._can_use_microseconds = None DBAPI.__init__(self, **kw) @classmethod def _connectionFromParams(cls, user, password, host, port, path, args): path = path.strip('/') return cls(user=user, password=password, host=host or 'localhost', port=port, db=path, **args) def insert_id(self, conn): """ insert_id method. """ c = conn.cursor() # converting the identity to an int is ugly, but it gets returned # as a decimal otherwise :S c.execute('SELECT CONVERT(INT, @@IDENTITY)') return c.fetchone()[0] def makeConnection(self): if self.driver in ('odbc', 'pyodbc', 'pypyodbc'): self.debugWriter.write( "ODBC connect string: " + self.odbc_conn_str) conn = self.module.connect(self.odbc_conn_str) else: conn_descr = self.make_conn_str(self) if isinstance(conn_descr, dict): conn = self.dbconnection(**conn_descr) else: conn = self.dbconnection(conn_descr) cur = conn.cursor() cur.execute('SET ANSI_NULLS ON') cur.execute("SELECT CAST('12345.21' AS DECIMAL(10, 2))") self.decimalSeparator = str(cur.fetchone()[0])[-3] cur.close() return conn HAS_IDENTITY = """ select 1 from INFORMATION_SCHEMA.COLUMNS where TABLE_NAME = '%s' and COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsIdentity') = 1 """ def _hasIdentity(self, conn, table): query = self.HAS_IDENTITY % table c = conn.cursor() c.execute(query) r = c.fetchone() return r is not None def _queryInsertID(self, conn, soInstance, id, names, values): """ Insert the Initial with names and values, using id. 
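        On MSSQL this also takes care of IDENTITY columns: when an explicit
        id is supplied, IDENTITY_INSERT is switched ON around the INSERT and
        switched OFF again afterwards.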
""" table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName c = conn.cursor() has_identity = self._hasIdentity(conn, table) if id is not None: names = [idName] + names values = [id] + values elif has_identity and idName in names: try: i = names.index(idName) if i: del names[i] del values[i] except ValueError: pass if has_identity: if id is not None: c.execute('SET IDENTITY_INSERT %s ON' % table) else: c.execute('SET IDENTITY_INSERT %s OFF' % table) if names and values: q = self._insertSQL(table, names, values) else: q = "INSERT INTO %s DEFAULT VALUES" % table if self.debug: self.printDebug(conn, q, 'QueryIns') c.execute(q) if has_identity: c.execute('SET IDENTITY_INSERT %s OFF' % table) if id is None: id = self.insert_id(conn) if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id @classmethod def _queryAddLimitOffset(cls, query, start, end): if end and not start: limit_str = "SELECT TOP %i" % end match = cls.limit_re.match(query) if match and len(match.groups()) == 2: return ' '.join([limit_str, match.group(2)]) else: return query def createReferenceConstraint(self, soClass, col): return col.mssqlCreateReferenceConstraint() def createColumn(self, soClass, col): return col.mssqlCreateSQL(self) def createIDColumn(self, soClass): key_type = {int: "INT", str: "TEXT"}[soClass.sqlmeta.idType] return '%s %s IDENTITY UNIQUE' % (soClass.sqlmeta.idName, key_type) def createIndexSQL(self, soClass, index): return index.mssqlCreateIndexSQL(soClass) def joinSQLType(self, join): return 'INT NOT NULL' SHOW_TABLES = "SELECT name FROM sysobjects WHERE type='U'" def tableExists(self, tableName): for (table,) in self.queryAll(self.SHOW_TABLES): if table.lower() == tableName.lower(): return True return False def addColumn(self, tableName, column): self.query('ALTER TABLE %s ADD %s' % (tableName, column.mssqlCreateSQL(self))) def delColumn(self, sqlmeta, column): self.query('ALTER TABLE %s DROP COLUMN %s' % (sqlmeta.table, column.dbName)) # Precision and scale is gotten from column table # so that we can create decimal columns if needed. SHOW_COLUMNS = """ select name, length, ( select name from systypes where cast(xusertype as int)= cast(sc.xtype as int) ) datatype, prec, scale, isnullable, cdefault, m.text default_text, isnull(len(autoval),0) is_identity from syscolumns sc LEFT OUTER JOIN syscomments m on sc.cdefault = m.id AND m.colid = 1 where sc.id in (select id from sysobjects where name = '%s') order by colorder""" def columnsFromSchema(self, tableName, soClass): colData = self.queryAll(self.SHOW_COLUMNS % tableName) results = [] for (field, size, t, precision, scale, nullAllowed, default, defaultText, is_identity) in colData: if field == soClass.sqlmeta.idName: continue # precision is needed for decimal columns colClass, kw = self.guessClass(t, size, precision, scale) kw['name'] = str(soClass.sqlmeta.style.dbColumnToPythonAttr(field)) kw['dbName'] = str(field) kw['notNone'] = not nullAllowed if (defaultText): # Strip ( and ) defaultText = defaultText[1:-1] if defaultText[0] == "'": defaultText = defaultText[1:-1] else: if t in ("int", "float", "numeric") and \ defaultText[0] == "(": defaultText = defaultText[1:-1] if t == "int": defaultText = int(defaultText) if t == "float": defaultText = float(defaultText) if t == "numeric": defaultText = float(defaultText) # TODO need to access the "column" to_python method here -- # but the object doesn't exists yet. # @@ skip key... 
kw['default'] = defaultText results.append(colClass(**kw)) return results def _setAutoCommit(self, conn, auto): # raise Exception(repr(auto)) return # conn.auto_commit = auto option = "ON" if auto == 0: option = "OFF" c = conn.cursor() c.execute("SET AUTOCOMMIT " + option) # precision and scale is needed for decimal columns def guessClass(self, t, size, precision, scale): """ Here we take raw values coming out of syscolumns and map to SQLObject class types. """ if t.startswith('int'): return col.IntCol, {} elif t.startswith('varchar'): if self.usingUnicodeStrings: return col.UnicodeCol, {'length': size} return col.StringCol, {'length': size} elif t.startswith('char'): if self.usingUnicodeStrings: return col.UnicodeCol, {'length': size, 'varchar': False} return col.StringCol, {'length': size, 'varchar': False} elif t.startswith('datetime'): return col.DateTimeCol, {} elif t.startswith('decimal'): # be careful for awkward naming return col.DecimalCol, {'size': precision, 'precision': scale} else: return col.Col, {} def server_version(self): """Get server version: 8 - 2000 9 - 2005 10 - 2008 11 - 2012 12 - 2014 13 - 2016 """ if self._server_version is not None: return self._server_version try: server_version = self.queryOne( "SELECT SERVERPROPERTY('productversion')")[0] if not PY2 and isinstance(server_version, bytes): server_version = server_version.decode('ascii') server_version = server_version.split('.')[0] server_version = int(server_version) except: server_version = None # unknown self._server_version = server_version return server_version def can_use_max_types(self): if self._can_use_max_types is not None: return self._can_use_max_types server_version = self.server_version() self._can_use_max_types = can_use_max_types = \ (server_version is not None) and (server_version >= 9) return can_use_max_types def can_use_microseconds(self): if self._can_use_microseconds is not None: return self._can_use_microseconds server_version = self.server_version() self._can_use_microseconds = can_use_microseconds = \ (server_version is not None) and (server_version >= 10) return can_use_microseconds SQLObject-3.4.0/sqlobject/tests/0000755000175000017500000000000013141371614016005 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/tests/test_exceptions.py0000644000175000017500000000204412775273363021615 0ustar phdphd00000000000000import pytest from sqlobject import SQLObject, StringCol from sqlobject.dberrors import DuplicateEntryError, ProgrammingError from sqlobject.tests.dbtest import getConnection, raises, setupClass, supports ######################################## # Table aliases and self-joins ######################################## class SOTestException(SQLObject): name = StringCol(unique=True, length=100) class SOTestExceptionWithNonexistingTable(SQLObject): pass def test_exceptions(): if not supports("exceptions"): pytest.skip("exceptions aren't supported") setupClass(SOTestException) SOTestException(name="test") raises(DuplicateEntryError, SOTestException, name="test") connection = getConnection() if connection.module.__name__ != 'psycopg2': return SOTestExceptionWithNonexistingTable.setConnection(connection) try: list(SOTestExceptionWithNonexistingTable.select()) except ProgrammingError as e: assert e.args[0].code == '42P01' else: assert False, "DID NOT RAISE" SQLObject-3.4.0/sqlobject/tests/test_empty.py0000644000175000017500000000100512775273363020566 0ustar phdphd00000000000000import pytest from sqlobject import SQLObject from sqlobject.tests.dbtest import setupClass, supports class 
EmptyClass(SQLObject): pass def test_empty(): if not supports('emptyTable'): pytest.skip("emptyTable isn't supported") setupClass(EmptyClass) e1 = EmptyClass() e2 = EmptyClass() assert e1 != e2 assert e1.id != e2.id assert e1 in list(EmptyClass.select()) assert e2 in list(EmptyClass.select()) e1.destroySelf() assert list(EmptyClass.select()) == [e2] SQLObject-3.4.0/sqlobject/tests/test_transactions.py0000644000175000017500000000550613140202164022124 0ustar phdphd00000000000000import pytest from sqlobject import SQLObject, SQLObjectNotFound, StringCol from sqlobject.tests.dbtest import raises, setupClass, supports ######################################## # Transaction test ######################################## try: support_transactions = supports('transactions') except NameError: # The module was imported during documentation building pass else: if not support_transactions: pytestmark = pytest.mark.skip('') class SOTestSOTrans(SQLObject): class sqlmeta: defaultOrder = 'name' name = StringCol(length=10, alternateID=True, dbName='name_col') def test_transaction(): setupClass(SOTestSOTrans) SOTestSOTrans(name='bob') SOTestSOTrans(name='tim') trans = SOTestSOTrans._connection.transaction() try: SOTestSOTrans._connection.autoCommit = 'exception' SOTestSOTrans(name='joe', connection=trans) trans.rollback() trans.begin() assert ([n.name for n in SOTestSOTrans.select(connection=trans)] == ['bob', 'tim']) b = SOTestSOTrans.byName('bob', connection=trans) b.name = 'robert' trans.commit() assert b.name == 'robert' b.name = 'bob' trans.rollback() trans.begin() assert b.name == 'robert' finally: SOTestSOTrans._connection.autoCommit = True def test_transaction_commit_sync(): setupClass(SOTestSOTrans) connection = SOTestSOTrans._connection trans = connection.transaction() try: SOTestSOTrans(name='bob') bOut = SOTestSOTrans.byName('bob') bIn = SOTestSOTrans.byName('bob', connection=trans) bIn.name = 'robert' assert bOut.name == 'bob' trans.commit() assert bOut.name == 'robert' finally: trans.rollback() connection.autoCommit = True connection.close() def test_transaction_delete(close=False): setupClass(SOTestSOTrans) connection = SOTestSOTrans._connection if (connection.dbName == 'sqlite') and connection._memory: pytest.skip("The test doesn't work with sqlite memory connection") trans = connection.transaction() try: SOTestSOTrans(name='bob') bIn = SOTestSOTrans.byName('bob', connection=trans) bIn.destroySelf() bOut = SOTestSOTrans.select(SOTestSOTrans.q.name == 'bob') assert bOut.count() == 1 bOutInst = bOut[0] bOutID = bOutInst.id # noqa: bOutID is used in the string code below trans.commit(close=close) assert bOut.count() == 0 raises(SQLObjectNotFound, "SOTestSOTrans.get(bOutID)") raises(SQLObjectNotFound, "bOutInst.name") finally: trans.rollback() connection.autoCommit = True connection.close() def test_transaction_delete_with_close(): test_transaction_delete(close=True) SQLObject-3.4.0/sqlobject/tests/test_schema.py0000644000175000017500000000143612775273363020700 0ustar phdphd00000000000000import pytest from sqlobject import SQLObject, UnicodeCol from sqlobject.tests.dbtest import getConnection, setupClass, supports ######################################## # Schema per connection ######################################## class SOTestSchema(SQLObject): foo = UnicodeCol(length=200) def test_connection_schema(): if not supports('schema'): pytest.skip("schemas aren't supported") conn = getConnection() conn.schema = None conn.query('CREATE SCHEMA test') conn.schema = 'test' conn.query('SET search_path TO 
test') setupClass(SOTestSchema) assert SOTestSchema._connection is conn SOTestSchema(foo='bar') assert conn.queryAll("SELECT * FROM test.so_test_schema") conn.schema = None conn.query('SET search_path TO public') SQLObject-3.4.0/sqlobject/tests/test_basic.py0000644000175000017500000002321613036106435020503 0ustar phdphd00000000000000import codecs import pytest from sqlobject import BoolCol, ForeignKey, IntCol, KeyCol, SQLObject, \ StringCol, connectionForURI, sqlhub from sqlobject.tests.dbtest import inserts, raises, setupClass, supports class SOTestSO1(SQLObject): name = StringCol(length=50, dbName='name_col') name.title = 'Your Name' name.foobar = 1 passwd = StringCol(length=10) class sqlmeta: cacheValues = False def _set_passwd(self, passwd): self._SO_set_passwd(codecs.encode(passwd, 'rot13')) def setupGetters(cls): setupClass(cls) inserts(cls, [('bob', 'god'), ('sally', 'sordid'), ('dave', 'dremel'), ('fred', 'forgo')], 'name passwd') def test_case1(): setupGetters(SOTestSO1) bob = SOTestSO1.selectBy(name='bob')[0] assert bob.name == 'bob' assert bob.passwd == codecs.encode('god', 'rot13') bobs = SOTestSO1.selectBy(name='bob')[:10] assert len(list(bobs)) == 1 def test_newline(): setupGetters(SOTestSO1) bob = SOTestSO1.selectBy(name='bob')[0] testString = 'hey\nyou\\can\'t you see me?\t' bob.name = testString bob.expire() assert bob.name == testString def test_count(): setupGetters(SOTestSO1) assert SOTestSO1.selectBy(name=None).count() == 0 assert SOTestSO1.selectBy(name='bob').count() == 1 assert SOTestSO1.select(SOTestSO1.q.name == 'bob').count() == 1 assert SOTestSO1.select().count() == len(list(SOTestSO1.select())) def test_getset(): setupGetters(SOTestSO1) bob = SOTestSO1.selectBy(name='bob')[0] assert bob.name == 'bob' bob.name = 'joe' assert bob.name == 'joe' bob.set(name='joebob', passwd='testtest') assert bob.name == 'joebob' def test_extra_vars(): setupGetters(SOTestSO1) col = SOTestSO1.sqlmeta.columns['name'] assert col.title == 'Your Name' assert col.foobar == 1 assert getattr(SOTestSO1.sqlmeta.columns['passwd'], 'title', None) is None class SOTestSO2(SQLObject): name = StringCol(length=50, dbName='name_col') passwd = StringCol(length=10) def _set_passwd(self, passwd): self._SO_set_passwd(codecs.encode(passwd, 'rot13')) def test_case2(): setupGetters(SOTestSO2) bob = SOTestSO2.selectBy(name='bob')[0] assert bob.name == 'bob' assert bob.passwd == codecs.encode('god', 'rot13') class Student(SQLObject): is_smart = BoolCol() def test_boolCol(): setupClass(Student) student = Student(is_smart=False) assert not student.is_smart student2 = Student(is_smart=1) assert student2.is_smart class SOTestSO3(SQLObject): name = StringCol(length=10, dbName='name_col') other = ForeignKey('SOTestSO4', default=None) other2 = KeyCol(foreignKey='SOTestSO4', default=None) class SOTestSO4(SQLObject): me = StringCol(length=10) def test_foreignKey(): setupClass([SOTestSO4, SOTestSO3]) test3_order = [col.name for col in SOTestSO3.sqlmeta.columnList] assert test3_order == ['name', 'otherID', 'other2ID'] tc3 = SOTestSO3(name='a') assert tc3.other is None assert tc3.other2 is None assert tc3.otherID is None assert tc3.other2ID is None tc4a = SOTestSO4(me='1') tc3.other = tc4a assert tc3.other == tc4a assert tc3.otherID == tc4a.id tc4b = SOTestSO4(me='2') tc3.other = tc4b.id assert tc3.other == tc4b assert tc3.otherID == tc4b.id tc4c = SOTestSO4(me='3') tc3.other2 = tc4c assert tc3.other2 == tc4c assert tc3.other2ID == tc4c.id tc4d = SOTestSO4(me='4') tc3.other2 = tc4d.id assert tc3.other2 == tc4d assert 
tc3.other2ID == tc4d.id tcc = SOTestSO3(name='b', other=tc4a) assert tcc.other == tc4a tcc2 = SOTestSO3(name='c', other=tc4a.id) assert tcc2.other == tc4a def test_selectBy(): setupClass([SOTestSO4, SOTestSO3]) tc4 = SOTestSO4(me='another') tc3 = SOTestSO3(name='sel', other=tc4) SOTestSO3(name='not joined') assert tc3.other == tc4 assert list(SOTestSO3.selectBy(other=tc4)) == [tc3] assert list(SOTestSO3.selectBy(otherID=tc4.id)) == [tc3] assert SOTestSO3.selectBy(otherID=tc4.id)[0] == tc3 assert list(SOTestSO3.selectBy(otherID=tc4.id)[:10]) == [tc3] assert list(SOTestSO3.selectBy(other=tc4)[:10]) == [tc3] class SOTestSO5(SQLObject): name = StringCol(length=10, dbName='name_col') other = ForeignKey('SOTestSO6', default=None, cascade=True) another = ForeignKey('SOTestSO7', default=None, cascade=True) class SOTestSO6(SQLObject): name = StringCol(length=10, dbName='name_col') other = ForeignKey('SOTestSO7', default=None, cascade=True) class SOTestSO7(SQLObject): name = StringCol(length=10, dbName='name_col') def test_foreignKeyDestroySelfCascade(): setupClass([SOTestSO7, SOTestSO6, SOTestSO5]) tc5 = SOTestSO5(name='a') tc6a = SOTestSO6(name='1') tc5.other = tc6a tc7a = SOTestSO7(name='2') tc6a.other = tc7a tc5.another = tc7a assert tc5.other == tc6a assert tc5.otherID == tc6a.id assert tc6a.other == tc7a assert tc6a.otherID == tc7a.id assert tc5.other.other == tc7a assert tc5.other.otherID == tc7a.id assert tc5.another == tc7a assert tc5.anotherID == tc7a.id assert tc5.other.other == tc5.another assert SOTestSO5.select().count() == 1 assert SOTestSO6.select().count() == 1 assert SOTestSO7.select().count() == 1 tc6b = SOTestSO6(name='3') tc6c = SOTestSO6(name='4') tc7b = SOTestSO7(name='5') tc6b.other = tc7b tc6c.other = tc7b assert SOTestSO5.select().count() == 1 assert SOTestSO6.select().count() == 3 assert SOTestSO7.select().count() == 2 tc6b.destroySelf() assert SOTestSO5.select().count() == 1 assert SOTestSO6.select().count() == 2 assert SOTestSO7.select().count() == 2 tc7b.destroySelf() assert SOTestSO5.select().count() == 1 assert SOTestSO6.select().count() == 1 assert SOTestSO7.select().count() == 1 tc7a.destroySelf() assert SOTestSO5.select().count() == 0 assert SOTestSO6.select().count() == 0 assert SOTestSO7.select().count() == 0 def testForeignKeyDropTableCascade(): if not supports('dropTableCascade'): pytest.skip("dropTableCascade isn't supported") setupClass(SOTestSO7) setupClass(SOTestSO6) setupClass(SOTestSO5) tc5a = SOTestSO5(name='a') tc6a = SOTestSO6(name='1') tc5a.other = tc6a tc7a = SOTestSO7(name='2') tc6a.other = tc7a tc5a.another = tc7a tc5b = SOTestSO5(name='b') tc5c = SOTestSO5(name='c') tc6b = SOTestSO6(name='3') tc5c.other = tc6b assert SOTestSO5.select().count() == 3 assert SOTestSO6.select().count() == 2 assert SOTestSO7.select().count() == 1 SOTestSO7.dropTable(cascade=True) assert SOTestSO5.select().count() == 3 assert SOTestSO6.select().count() == 2 tc6a.destroySelf() assert SOTestSO5.select().count() == 2 assert SOTestSO6.select().count() == 1 tc6b.destroySelf() assert SOTestSO5.select().count() == 1 assert SOTestSO6.select().count() == 0 assert next(iter(SOTestSO5.select())) == tc5b tc6c = SOTestSO6(name='3') tc5b.other = tc6c assert SOTestSO5.select().count() == 1 assert SOTestSO6.select().count() == 1 tc6c.destroySelf() assert SOTestSO5.select().count() == 0 assert SOTestSO6.select().count() == 0 class SOTestSO8(SQLObject): name = StringCol(length=10, dbName='name_col') other = ForeignKey('SOTestSO9', default=None, cascade=False) class SOTestSO9(SQLObject): 
name = StringCol(length=10, dbName='name_col') def testForeignKeyDestroySelfRestrict(): setupClass([SOTestSO9, SOTestSO8]) tc8a = SOTestSO8(name='a') tc9a = SOTestSO9(name='1') tc8a.other = tc9a tc8b = SOTestSO8(name='b') tc9b = SOTestSO9(name='2') assert tc8a.other == tc9a assert tc8a.otherID == tc9a.id assert SOTestSO8.select().count() == 2 assert SOTestSO9.select().count() == 2 raises(Exception, tc9a.destroySelf) tc9b.destroySelf() assert SOTestSO8.select().count() == 2 assert SOTestSO9.select().count() == 1 tc8a.destroySelf() tc8b.destroySelf() tc9a.destroySelf() assert SOTestSO8.select().count() == 0 assert SOTestSO9.select().count() == 0 class SOTestSO10(SQLObject): name = StringCol() class SOTestSO11(SQLObject): name = StringCol() other = ForeignKey('SOTestSO10', default=None, cascade='null') def testForeignKeySetNull(): setupClass([SOTestSO10, SOTestSO11]) obj1 = SOTestSO10(name='foo') obj2 = SOTestSO10(name='bar') dep1 = SOTestSO11(name='xxx', other=obj1) dep2 = SOTestSO11(name='yyy', other=obj1) dep3 = SOTestSO11(name='zzz', other=obj2) for name in 'xxx', 'yyy', 'zzz': assert len(list(SOTestSO11.selectBy(name=name))) == 1 obj1.destroySelf() for name in 'xxx', 'yyy', 'zzz': assert len(list(SOTestSO11.selectBy(name=name))) == 1 assert dep1.other is None assert dep2.other is None assert dep3.other is obj2 def testAsDict(): setupGetters(SOTestSO1) bob = SOTestSO1.selectBy(name='bob')[0] assert bob.sqlmeta.asDict() == { 'passwd': 'tbq', 'name': 'bob', 'id': bob.id} def test_nonexisting_attr(): setupClass(Student) raises(AttributeError, getattr, Student.q, 'nonexisting') class SOTestSO12(SQLObject): name = StringCol() so_value = IntCol(defaultSQL='1') def test_defaultSQL(): setupClass(SOTestSO12) test = SOTestSO12(name="test") assert test.so_value == 1 def test_connection_override(): sqlhub.processConnection = connectionForURI('sqlite:///db1') class SOTestSO13(SQLObject): _connection = connectionForURI('sqlite:///db2') assert SOTestSO13._connection.uri() == 'sqlite:///db2' del sqlhub.processConnection SQLObject-3.4.0/sqlobject/tests/test_reparent_sqlmeta.py0000644000175000017500000000161112747416060022771 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol, sqlmeta from sqlobject.tests.dbtest import setupClass real_sqlmeta = sqlmeta class Reparented1(SQLObject): class sqlmeta: table = 'reparented1' dummy = StringCol() class Reparented2(SQLObject): class sqlmeta(object): @classmethod def setClass(cls, soClass): # Well, it's pretty hard to call the superclass method # when it's a classmethod and it's not actually your # *current* superclass. 
Sigh real_sqlmeta.setClass.__func__(cls, soClass) cls.worked = True dummy = StringCol() def test_reparented(): setupClass([Reparented1, Reparented2]) assert Reparented1.sqlmeta.table == 'reparented1' assert issubclass(Reparented1.sqlmeta, real_sqlmeta) assert issubclass(Reparented2.sqlmeta, real_sqlmeta) assert Reparented2.sqlmeta.worked SQLObject-3.4.0/sqlobject/tests/test_aggregates.py0000644000175000017500000000433313036106435021532 0ustar phdphd00000000000000from sqlobject import FloatCol, IntCol, SQLObject from sqlobject.tests.dbtest import setupClass # Test MIN, AVG, MAX, COUNT, SUM class IntAccumulator(SQLObject): so_value = IntCol() class FloatAccumulator(SQLObject): so_value = FloatCol() def test_integer(): setupClass(IntAccumulator) IntAccumulator(so_value=1) IntAccumulator(so_value=2) IntAccumulator(so_value=3) assert IntAccumulator.select().min(IntAccumulator.q.so_value) == 1 assert IntAccumulator.select().avg(IntAccumulator.q.so_value) == 2 assert IntAccumulator.select().max(IntAccumulator.q.so_value) == 3 assert IntAccumulator.select().sum(IntAccumulator.q.so_value) == 6 assert IntAccumulator.select(IntAccumulator.q.so_value > 1).\ max(IntAccumulator.q.so_value) == 3 assert IntAccumulator.select(IntAccumulator.q.so_value > 1).\ sum(IntAccumulator.q.so_value) == 5 def floatcmp(f1, f2): if abs(f1 - f2) < 0.1: return 0 if f1 < f2: return 1 return -1 def test_float(): setupClass(FloatAccumulator) FloatAccumulator(so_value=1.2) FloatAccumulator(so_value=2.4) FloatAccumulator(so_value=3.8) assert floatcmp( FloatAccumulator.select().min(FloatAccumulator.q.so_value), 1.2) == 0 assert floatcmp( FloatAccumulator.select().avg(FloatAccumulator.q.so_value), 2.5) == 0 assert floatcmp( FloatAccumulator.select().max(FloatAccumulator.q.so_value), 3.8) == 0 assert floatcmp( FloatAccumulator.select().sum(FloatAccumulator.q.so_value), 7.4) == 0 def test_many(): setupClass(IntAccumulator) IntAccumulator(so_value=1) IntAccumulator(so_value=1) IntAccumulator(so_value=2) IntAccumulator(so_value=2) IntAccumulator(so_value=3) IntAccumulator(so_value=3) attribute = IntAccumulator.q.so_value assert list(IntAccumulator.select().accumulateMany( ("MIN", attribute), ("AVG", attribute), ("MAX", attribute), ("COUNT", attribute), ("SUM", attribute) )) == [1, 2, 3, 6, 12] assert list(IntAccumulator.select(distinct=True).accumulateMany( ("MIN", attribute), ("AVG", attribute), ("MAX", attribute), ("COUNT", attribute), ("SUM", attribute) )) == [1, 2, 3, 3, 6] SQLObject-3.4.0/sqlobject/tests/test_identity.py0000644000175000017500000000131012770603634021251 0ustar phdphd00000000000000from sqlobject import IntCol, SQLObject from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # Identity (MS SQL) ######################################## class SOTestIdentity(SQLObject): n = IntCol() def test_identity(): # (re)create table SOTestIdentity.dropTable(connection=getConnection(), ifExists=True) setupClass(SOTestIdentity) # insert without giving identity SOTestIdentity(n=100) # i1 # verify result i1get = SOTestIdentity.get(1) assert(i1get.n == 100) # insert while giving identity SOTestIdentity(id=2, n=200) # i2 # verify result i2get = SOTestIdentity.get(2) assert(i2get.n == 200) SQLObject-3.4.0/sqlobject/tests/__init__.py0000644000175000017500000000000210177633357020121 0ustar phdphd00000000000000# SQLObject-3.4.0/sqlobject/tests/test_jsoncol.py0000644000175000017500000000157512747417740021111 0ustar phdphd00000000000000from sqlobject import SQLObject, JSONCol from 
sqlobject.tests.dbtest import setupClass class JSONTest(SQLObject): json = JSONCol(default=None) _json_test_data = ( None, True, 1, 2.0, {u"test": [None, True, 1, 2.0, {u"unicode'with'apostrophes": u"unicode\"with\"quotes"}, [], u"unicode"]}, [None, True, 1, 2.0, [], {u"unicode'with'apostrophes": u"unicode\"with\"quotes"}, u"unicode", u"unicode'with'apostrophes", u"unicode\"with\"quotes", ], u"unicode", u"unicode'with'apostrophes", u"unicode\"with\"quotes", ) def test_JSONCol(): setupClass(JSONTest) for _id, test_data in enumerate(_json_test_data): json = JSONTest(id=_id + 1, json=test_data) JSONTest._connection.cache.clear() for _id, test_data in enumerate(_json_test_data): json = JSONTest.get(_id + 1) assert json.json == test_data SQLObject-3.4.0/sqlobject/tests/test_cache.py0000644000175000017500000000273612747416060020477 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.cache import CacheSet from .dbtest import setupClass class Something(object): pass def test_purge1(): x = CacheSet() y = Something() obj = x.get(1, y.__class__) assert obj is None x.put(1, y.__class__, y) x.finishPut(y.__class__) j = x.get(1, y.__class__) assert j == y x.expire(1, y.__class__) j = x.get(1, y.__class__) assert j is None x.finishPut(y.__class__) j = x.get(1, y.__class__) assert j is None x.finishPut(y.__class__) class CacheTest(SQLObject): name = StringCol(alternateID=True, length=100) def test_cache(): setupClass(CacheTest) s = CacheTest(name='foo') obj_id = id(s) s_id = s.id assert CacheTest.get(s_id) is s assert not s.sqlmeta.expired CacheTest.sqlmeta.expireAll() assert s.sqlmeta.expired CacheTest.sqlmeta.expireAll() s1 = CacheTest.get(s_id) # We should have a new object: assert id(s1) != obj_id obj_id2 = id(s1) CacheTest._connection.expireAll() s2 = CacheTest.get(s_id) assert id(s2) != obj_id and id(s2) != obj_id2 def test_cache_cull(): setupClass(CacheTest) s = CacheTest(name='test_cache_create') for count in range(s._connection.cache.caches['CacheTest'].cullFrequency): CacheTest(name='test_cache_create %d' % count) assert len(s._connection.cache.caches['CacheTest'].cache) < \ s._connection.cache.caches['CacheTest'].cullFrequency SQLObject-3.4.0/sqlobject/tests/test_validation.py0000644000175000017500000000727613102716754021571 0ustar phdphd00000000000000from sqlobject import BoolCol, FloatCol, IntCol, PickleCol, SQLObject, \ StringCol, UnicodeCol from sqlobject.col import validators from sqlobject.compat import PY2 from sqlobject.tests.dbtest import raises, setupClass if not PY2: # alias for python 3 compatability long = int ######################################## # Validation/conversion ######################################## class SOTestValidator(validators.Validator): def to_python(self, value, state): if value: self.save_value.append(value) return 1 return value def from_python(self, value, state): if value: self.save_value.append(value) return 2 return value validator1 = SOTestValidator(save_value=[]) validator2 = SOTestValidator(save_value=[]) class SOValidation(SQLObject): name = StringCol(validator=validators.PlainText(), default='x', dbName='name_col') name2 = StringCol(validator2=validators.ConfirmType(type=str), default='y') name3 = IntCol(validator=validators.Wrapper(fromPython=int), default=100) name4 = FloatCol(default=2.718) name5 = PickleCol(default=None) name6 = BoolCol(default=None) name7 = UnicodeCol(default=None) name8 = IntCol(default=None) name9 = IntCol(validator=validator1, validator2=validator2, default=0) class SOValidationTest(object): 
def __init__(self, value): self.value = value if PY2: class SOValidationTestUnicode(SOValidationTest): def __unicode__(self): return self.value class SOValidationTestInt(SOValidationTest): def __int__(self): return self.value class SOValidationTestBool(SOValidationTest): def __nonzero__(self): return self.value __bool__ = __nonzero__ class SOValidationTestFloat(SOValidationTest): def __float__(self): return self.value class TestValidation: def setup_method(self, meth): setupClass(SOValidation) def test_validate(self): t = SOValidation(name='hey') raises(validators.Invalid, setattr, t, 'name', '!!!') t.name = 'you' assert t.name == 'you' def test_confirmType(self): t = SOValidation(name2='hey') raises(validators.Invalid, setattr, t, 'name2', 1) raises(validators.Invalid, setattr, t, 'name3', '1') raises(validators.Invalid, setattr, t, 'name4', '1') if t._connection.dbName != 'postgres' or \ t._connection.driver not in ('odbc', 'pyodbc', 'pypyodbc'): raises(validators.Invalid, setattr, t, 'name6', '1') raises(validators.Invalid, setattr, t, 'name7', 1) t.name2 = 'you' assert t.name2 == 'you' for name, cls, value in ( ('name4', SOValidationTestFloat, 1.1), ('name6', SOValidationTestBool, True), ('name8', SOValidationTestInt, 1)): setattr(t, name, cls(value)) assert getattr(t, name) == value if PY2: for name, cls, value in ( ('name7', SOValidationTestUnicode, u'test'),): setattr(t, name, cls(value)) assert getattr(t, name) == value def test_wrapType(self): t = SOValidation(name3=1) raises(validators.Invalid, setattr, t, 'name3', 'x') t.name3 = long(1) assert t.name3 == 1 t.name3 = 0 assert t.name3 == 0 def test_emptyValue(self): t = SOValidation(name5={}) assert t.name5 == {} def test_validator2(self): SOValidation(name9=1) SOValidation(name9=2) assert validator1.save_value == [2, 2, 2, 2, 2, 2] assert validator2.save_value == [1, 1, 1, 2, 1, 1] SQLObject-3.4.0/sqlobject/tests/test_md5.py0000644000175000017500000000065612465376600020122 0ustar phdphd00000000000000from hashlib import md5 ######################################## # hashlib.md5 ######################################## def test_md5(): assert md5(b'').hexdigest() == 'd41d8cd98f00b204e9800998ecf8427e' assert md5(b'\n').hexdigest() == '68b329da9893e34099c7d8ad5cb9c940' assert md5(b'123').hexdigest() == '202cb962ac59075b964b07152d234b70' assert md5(b'123\n').hexdigest() == 'ba1f2511fc30423bdbb183fe33f3dd0f' SQLObject-3.4.0/sqlobject/tests/test_comparison.py0000644000175000017500000000100512770603634021573 0ustar phdphd00000000000000from sqlobject import SQLObject from sqlobject.tests.dbtest import setupClass class SOTestComparison(SQLObject): pass def test_eq(): setupClass(SOTestComparison, force=True) t1 = SOTestComparison() t2 = SOTestComparison() SOTestComparison._connection.cache.clear() t3 = SOTestComparison.get(1) t4 = SOTestComparison.get(2) assert t1.id == t3.id assert t2.id == t4.id assert t1 is not t3 assert t2 is not t4 assert t1 == t3 assert t2 == t4 assert t1 != t2 SQLObject-3.4.0/sqlobject/tests/test_cyclic_reference.py0000644000175000017500000000371113036460624022707 0ustar phdphd00000000000000import pytest from sqlobject import BLOBCol, DateTimeCol, ForeignKey, IntCol, SQLObject, \ StringCol, sqlmeta from sqlobject.tests.dbtest import getConnection, supports class SOTestCyclicRefA(SQLObject): class sqlmeta(sqlmeta): idName = 'test_id_here' table = 'test_cyclic_ref_a_table' name = StringCol() number = IntCol() so_time = DateTimeCol() short = StringCol(length=10) blobcol = BLOBCol() fkeyb = ForeignKey('SOTestCyclicRefB') 
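# --- Added sketch (not part of the original test module) ------------------
# The two SOTestCyclicRef classes in this file reference each other through
# ForeignKey, so neither table can be created with its constraints applied
# in a single pass.  The test function further below creates both tables
# with applyConstraints=False and runs the returned constraint statements
# afterwards; this is a minimal sketch of that pattern, assuming `conn` is
# a connection such as the one returned by getConnection():


def _sketch_create_cyclic_tables(conn):
    # Create both tables first, deferring the foreign-key constraints...
    constraints = SOTestCyclicRefA.createTable(ifNotExists=True,
                                               applyConstraints=False)
    constraints += SOTestCyclicRefB.createTable(ifNotExists=True,
                                                applyConstraints=False)
    # ...then apply the collected constraint statements.
    for constraint in constraints:
        conn.query(constraint)
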
class SOTestCyclicRefB(SQLObject): class sqlmeta(sqlmeta): idName = 'test_id_here' table = 'test_cyclic_ref_b_table' name = StringCol() number = IntCol() so_time = DateTimeCol() short = StringCol(length=10) blobcol = BLOBCol() fkeya = ForeignKey('SOTestCyclicRefA') def test_cyclic_reference(): if not supports('dropTableCascade'): pytest.skip("dropTableCascade isn't supported") conn = getConnection() SOTestCyclicRefA.setConnection(conn) SOTestCyclicRefB.setConnection(conn) SOTestCyclicRefA.dropTable(ifExists=True, cascade=True) assert not conn.tableExists(SOTestCyclicRefA.sqlmeta.table) SOTestCyclicRefB.dropTable(ifExists=True, cascade=True) assert not conn.tableExists(SOTestCyclicRefB.sqlmeta.table) constraints = SOTestCyclicRefA.createTable(ifNotExists=True, applyConstraints=False) assert conn.tableExists(SOTestCyclicRefA.sqlmeta.table) constraints += SOTestCyclicRefB.createTable(ifNotExists=True, applyConstraints=False) assert conn.tableExists(SOTestCyclicRefB.sqlmeta.table) for constraint in constraints: conn.query(constraint) SOTestCyclicRefA.dropTable(ifExists=True, cascade=True) assert not conn.tableExists(SOTestCyclicRefA.sqlmeta.table) SOTestCyclicRefB.dropTable(ifExists=True, cascade=True) assert not conn.tableExists(SOTestCyclicRefB.sqlmeta.table) SQLObject-3.4.0/sqlobject/tests/test_views.py0000644000175000017500000001262712747416060020571 0ustar phdphd00000000000000from sqlobject import ForeignKey, IntCol, SQLMultipleJoin, SQLObject, \ StringCol, func from sqlobject.views import ViewSQLObject from sqlobject.tests.dbtest import inserts, setupClass class PhoneNumber(SQLObject): number = StringCol() calls = SQLMultipleJoin('PhoneCall') incoming = SQLMultipleJoin('PhoneCall', joinColumn='toID') class PhoneCall(SQLObject): phoneNumber = ForeignKey('PhoneNumber') to = ForeignKey('PhoneNumber') minutes = IntCol() class ViewPhoneCall(ViewSQLObject): class sqlmeta: idName = PhoneCall.q.id clause = PhoneCall.q.phoneNumberID == PhoneNumber.q.id minutes = IntCol(dbName=PhoneCall.q.minutes) number = StringCol(dbName=PhoneNumber.q.number) phoneNumber = ForeignKey('PhoneNumber', dbName=PhoneNumber.q.id) call = ForeignKey('PhoneCall', dbName=PhoneCall.q.id) class ViewPhone(ViewSQLObject): class sqlmeta: idName = PhoneNumber.q.id clause = PhoneCall.q.phoneNumberID == PhoneNumber.q.id minutes = IntCol(dbName=func.SUM(PhoneCall.q.minutes)) numberOfCalls = IntCol(dbName=func.COUNT(PhoneCall.q.phoneNumberID)) number = StringCol(dbName=PhoneNumber.q.number) phoneNumber = ForeignKey('PhoneNumber', dbName=PhoneNumber.q.id) calls = SQLMultipleJoin('PhoneCall', joinColumn='phoneNumberID') vCalls = SQLMultipleJoin('ViewPhoneCall', joinColumn='phoneNumberID', orderBy='id') class ViewPhoneMore(ViewSQLObject): ''' View on top of view ''' class sqlmeta: idName = ViewPhone.q.id clause = ViewPhone.q.id == PhoneCall.q.toID number = StringCol(dbName=ViewPhone.q.number) timesCalled = IntCol(dbName=func.COUNT(PhoneCall.q.toID)) timesCalledLong = IntCol(dbName=func.COUNT(PhoneCall.q.toID)) timesCalledLong.aggregateClause = PhoneCall.q.minutes > 10 minutesCalled = IntCol(dbName=func.SUM(PhoneCall.q.minutes)) class ViewPhoneMore2(ViewPhoneMore): class sqlmeta: table = 'vpm' class ViewPhoneInnerAggregate(ViewPhone): twiceMinutes = IntCol(dbName=func.SUM(PhoneCall.q.minutes) * 2) def setup_module(mod): global calls, phones, sqlrepr setupClass([PhoneNumber, PhoneCall]) ViewPhoneCall._connection = PhoneNumber._connection ViewPhone._connection = PhoneNumber._connection ViewPhoneMore._connection = PhoneNumber._connection 
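    # Added comment: the View* classes are not passed to setupClass()
    # because ViewSQLObject subclasses are backed by SELECTs over the
    # tables above rather than by tables of their own; they only need to
    # share the connection that setupClass() just configured.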
    phones = inserts(PhoneNumber, [('1234567890',), ('1111111111',)],
                     'number')
    calls = inserts(PhoneCall, [(phones[0], phones[1], 5),
                                (phones[0], phones[1], 20),
                                (phones[1], phones[0], 10),
                                (phones[1], phones[0], 25)],
                    'phoneNumber to minutes')
    sqlrepr = PhoneNumber._connection.sqlrepr


def testSimpleVPC():
    assert hasattr(ViewPhoneCall, 'minutes')
    assert hasattr(ViewPhoneCall, 'number')
    assert hasattr(ViewPhoneCall, 'phoneNumberID')


def testColumnSQLVPC():
    assert str(sqlrepr(ViewPhoneCall.q.id)) == 'view_phone_call.id'
    assert str(sqlrepr(ViewPhoneCall.q.minutes)) == 'view_phone_call.minutes'
    q = sqlrepr(ViewPhoneCall.q)
    assert q.count('phone_call.minutes AS minutes')
    assert q.count('phone_number.number AS number')


def testAliasOverride():
    assert str(sqlrepr(ViewPhoneMore2.q.id)) == 'vpm.id'


def checkAttr(cls, id, attr, value):
    assert getattr(cls.get(id), attr) == value


def testGetVPC():
    checkAttr(ViewPhoneCall, calls[0].id, 'number',
              calls[0].phoneNumber.number)
    checkAttr(ViewPhoneCall, calls[0].id, 'minutes', calls[0].minutes)
    checkAttr(ViewPhoneCall, calls[0].id, 'phoneNumber',
              calls[0].phoneNumber)
    checkAttr(ViewPhoneCall, calls[2].id, 'number',
              calls[2].phoneNumber.number)
    checkAttr(ViewPhoneCall, calls[2].id, 'minutes', calls[2].minutes)
    checkAttr(ViewPhoneCall, calls[2].id, 'phoneNumber',
              calls[2].phoneNumber)


def testGetVP():
    checkAttr(ViewPhone, phones[0].id, 'number', phones[0].number)
    checkAttr(ViewPhone, phones[0].id, 'minutes',
              phones[0].calls.sum(PhoneCall.q.minutes))
    checkAttr(ViewPhone, phones[0].id, 'phoneNumber', phones[0])


def testGetVPM():
    checkAttr(ViewPhoneMore, phones[0].id, 'number', phones[0].number)
    checkAttr(ViewPhoneMore, phones[0].id, 'minutesCalled',
              phones[0].incoming.sum(PhoneCall.q.minutes))
    checkAttr(ViewPhoneMore, phones[0].id, 'timesCalled',
              phones[0].incoming.count())
    checkAttr(ViewPhoneMore, phones[0].id, 'timesCalledLong',
              phones[0].incoming.filter(PhoneCall.q.minutes > 10).count())


def testJoinView():
    p = ViewPhone.get(phones[0].id)
    assert p.calls.count() == 2
    assert p.vCalls.count() == 2
    assert p.vCalls[0] == ViewPhoneCall.get(calls[0].id)


def testInnerAggregate():
    checkAttr(ViewPhoneInnerAggregate, phones[0].id, 'twiceMinutes',
              phones[0].calls.sum(PhoneCall.q.minutes) * 2)


def testSelect():
    s = ViewPhone.select()
    assert s.count() == len(phones)
    s = ViewPhoneCall.select()
    assert s.count() == len(calls)


def testSelect2():
    s = ViewPhone.select(ViewPhone.q.number == phones[0].number)
    assert s.getOne().phoneNumber == phones[0]


def testDistinctCount():
    # This test is for SelectResults non-* based count when distinct
    # We're really just checking this doesn't raise anything
    # due to lack of sqlrepr'ing.
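    # Added comment: with distinct=True the count is presumably emitted as
    # something like COUNT(DISTINCT view_phone.id) rather than COUNT(*), so
    # the id column itself has to be sqlrepr'd -- hence the smoke test below.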
assert ViewPhone.select(distinct=True).count() == 2 SQLObject-3.4.0/sqlobject/tests/test_class_hash.py0000644000175000017500000000102412747416060021531 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import setupClass ######################################## # Test hashing a column instance ######################################## class ClassHashTest(SQLObject): name = StringCol(length=50, alternateID=True, dbName='name_col') def test_class_hash(): setupClass(ClassHashTest) ClassHashTest(name='bob') b = ClassHashTest.byName('bob') hashed = hash(b) b.expire() b = ClassHashTest.byName('bob') assert hash(b) == hashed SQLObject-3.4.0/sqlobject/tests/test_columns_order.py0000644000175000017500000000067412747416060022306 0ustar phdphd00000000000000from sqlobject import IntCol, SQLObject, StringCol ######################################## # Columns order ######################################## class SOColumnsOrder(SQLObject): name = StringCol() surname = StringCol() parname = StringCol() age = IntCol() def test_columns_order(): column_names = [c.name for c in SOColumnsOrder.sqlmeta.columnList] assert column_names == ['name', 'surname', 'parname', 'age'] SQLObject-3.4.0/sqlobject/tests/test_enum.py0000644000175000017500000000277712747416060020405 0ustar phdphd00000000000000from sqlobject import EnumCol, SQLObject, UnicodeCol from sqlobject.col import validators from sqlobject.tests.dbtest import raises, setupClass ######################################## # Enum test ######################################## class Enum1(SQLObject): l = EnumCol(enumValues=['a', 'bcd', 'e']) def testBad(): setupClass(Enum1) for l in ['a', 'bcd', 'a', 'e']: Enum1(l=l) raises( (Enum1._connection.module.IntegrityError, Enum1._connection.module.ProgrammingError, validators.Invalid), Enum1, l='b') class EnumWithNone(SQLObject): l = EnumCol(enumValues=['a', 'bcd', 'e', None]) def testNone(): setupClass(EnumWithNone) for l in [None, 'a', 'bcd', 'a', 'e', None]: e = EnumWithNone(l=l) assert e.l == l class EnumWithDefaultNone(SQLObject): l = EnumCol(enumValues=['a', 'bcd', 'e', None], default=None) def testDefaultNone(): setupClass(EnumWithDefaultNone) e = EnumWithDefaultNone() assert e.l is None class EnumWithDefaultOther(SQLObject): l = EnumCol(enumValues=['a', 'bcd', 'e', None], default='a') def testDefaultOther(): setupClass(EnumWithDefaultOther) e = EnumWithDefaultOther() assert e.l == 'a' class EnumUnicode(SQLObject): n = UnicodeCol() l = EnumCol(enumValues=['a', 'b']) def testUnicode(): setupClass(EnumUnicode) EnumUnicode(n=u'a', l='a') EnumUnicode(n=u'b', l=u'b') EnumUnicode(n=u'\u201c', l='a') EnumUnicode(n=u'\u201c', l=u'b') SQLObject-3.4.0/sqlobject/tests/test_ForeignKey.py0000644000175000017500000000770312770603634021476 0ustar phdphd00000000000000from formencode import validators from sqlobject import ForeignKey, IntCol, SQLObject, StringCol from sqlobject.tests.dbtest import getConnection, InstalledTestDatabase, \ raises, setupClass, setupCyclicClasses class SOTestComposerKey(SQLObject): name = StringCol() id2 = IntCol(default=None, unique=True) class SOTestWorkKey(SQLObject): class sqlmeta: idName = "work_id" composer = ForeignKey('SOTestComposerKey', cascade=True) title = StringCol() class SOTestWorkKey2(SQLObject): title = StringCol() class SOTestOtherColumn(SQLObject): key1 = ForeignKey('SOTestComposerKey', default=None) key2 = ForeignKey('SOTestComposerKey', refColumn='id2', default=None) def test1(): setupClass([SOTestComposerKey, SOTestWorkKey]) c = 
SOTestComposerKey(name='Mahler, Gustav') w1 = SOTestWorkKey(composer=c, title='Symphony No. 9') w2 = SOTestWorkKey(composer=None, title=None) # Select by usual way s = SOTestWorkKey.selectBy(composerID=c.id, title='Symphony No. 9') assert s.count() == 1 assert s[0] == w1 # selectBy object.id s = SOTestWorkKey.selectBy(composer=c.id, title='Symphony No. 9') assert s.count() == 1 assert s[0] == w1 # selectBy object s = SOTestWorkKey.selectBy(composer=c, title='Symphony No. 9') assert s.count() == 1 assert s[0] == w1 # selectBy id s = SOTestWorkKey.selectBy(id=w1.id) assert s.count() == 1 assert s[0] == w1 # is None handled correctly? s = SOTestWorkKey.selectBy(composer=None, title=None) assert s.count() == 1 assert s[0] == w2 s = SOTestWorkKey.selectBy() assert s.count() == 2 # select with objects s = SOTestWorkKey.select(SOTestWorkKey.q.composerID == c.id) assert s.count() == 1 assert s[0] == w1 s = SOTestWorkKey.select(SOTestWorkKey.q.composer == c.id) assert s.count() == 1 assert s[0] == w1 s = SOTestWorkKey.select(SOTestWorkKey.q.composerID == c) assert s.count() == 1 assert s[0] == w1 s = SOTestWorkKey.select(SOTestWorkKey.q.composer == c) assert s.count() == 1 assert s[0] == w1 s = SOTestWorkKey.select( (SOTestWorkKey.q.composer == c) & (SOTestWorkKey.q.title == 'Symphony No. 9')) assert s.count() == 1 assert s[0] == w1 def test2(): SOTestWorkKey._connection = getConnection() InstalledTestDatabase.drop(SOTestWorkKey) setupClass([SOTestComposerKey, SOTestWorkKey2], force=True) SOTestWorkKey2.sqlmeta.addColumn(ForeignKey('SOTestComposerKey'), changeSchema=True) def test_otherColumn(): setupClass([SOTestComposerKey, SOTestOtherColumn]) test_composer1 = SOTestComposerKey(name='Test1') test_composer2 = SOTestComposerKey(name='Test2', id2=2) test_fkey = SOTestOtherColumn(key1=test_composer1) test_other = SOTestOtherColumn(key2=test_composer2.id2) getConnection().cache.clear() assert test_fkey.key1 == test_composer1 assert test_other.key2 == test_composer2 class SOTestFKValidationA(SQLObject): name = StringCol() bfk = ForeignKey("SOTestFKValidationB") cfk = ForeignKey("SOTestFKValidationC", default=None) class SOTestFKValidationB(SQLObject): name = StringCol() afk = ForeignKey("SOTestFKValidationA") class SOTestFKValidationC(SQLObject): class sqlmeta: idType = str name = StringCol() def test_foreignkey_validation(): setupCyclicClasses(SOTestFKValidationA, SOTestFKValidationB, SOTestFKValidationC) a = SOTestFKValidationA(name="testa", bfk=None) b = SOTestFKValidationB(name="testb", afk=a) c = SOTestFKValidationC(id='testc', name="testc") a.bfk = b a.cfk = c assert a.bfk == b assert a.cfk == c assert b.afk == a raises(validators.Invalid, SOTestFKValidationA, name="testa", bfk='testb', cfk='testc') a = SOTestFKValidationA(name="testa", bfk=1, cfk='testc') assert a.bfkID == 1 assert a.cfkID == 'testc' SQLObject-3.4.0/sqlobject/tests/test_sqlbuilder_dbspecific.py0000644000175000017500000000353112770605203023742 0ustar phdphd00000000000000from __future__ import print_function from sqlobject import BoolCol, SQLObject from sqlobject.sqlbuilder import AND, Alias, EXISTS, JOIN, LEFTJOINOn, \ Select, sqlrepr from sqlobject.tests.dbtest import setupClass ''' Going to test that complex sqlbuilder constructions are never prematurely stringified. A straight-forward approach is to use Bools, since postgresql wants special formatting in queries. The test is whether a call to sqlrepr(x, 'postgres') includes the appropriate bool formatting throughout. 
''' class SBButton(SQLObject): activated = BoolCol() def makeClause(): # It's not a comparison, it's an SQLExpression return SBButton.q.activated == True # noqa def makeSelect(): return Select(SBButton.q.id, clause=makeClause()) def checkCount(q, c, msg=''): print("STRING:", str(q)) print("POSTGR:", sqlrepr(q, 'postgres')) assert sqlrepr(q, 'postgres').count("'t'") == c and \ sqlrepr(q, 'postgres') != str(q), msg def testSimple(): setupClass(SBButton) checkCount(makeClause(), 1) checkCount(makeSelect(), 1) def testMiscOps(): setupClass(SBButton) checkCount(AND(makeClause(), makeClause()), 2) checkCount(AND(makeClause(), EXISTS(makeSelect())), 2) def testAliased(): setupClass(SBButton) b = Alias(makeSelect(), 'b') checkCount(b, 1) checkCount(Select(b.q.id), 1) # Table1 & Table2 are treated individually in joins checkCount(JOIN(None, b), 1) checkCount(JOIN(b, SBButton), 1) checkCount(JOIN(SBButton, b), 1) checkCount(LEFTJOINOn(None, b, SBButton.q.id == b.q.id), 1) checkCount(LEFTJOINOn(b, SBButton, SBButton.q.id == b.q.id), 1) checkCount(LEFTJOINOn(SBButton, b, SBButton.q.id == b.q.id), 1) def testTablesUsedSResults(): setupClass(SBButton) checkCount(SBButton.select(makeClause()).queryForSelect(), 1) SQLObject-3.4.0/sqlobject/tests/test_string_id.py0000644000175000017500000000312712747416060021411 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol, sqlmeta from sqlobject.tests.dbtest import setupClass ######################################## # String ID test ######################################## class SOStringID(SQLObject): class sqlmeta(sqlmeta): table = 'so_string_id' idType = str val = StringCol(alternateID=True) mysqlCreate = """ CREATE TABLE IF NOT EXISTS so_string_id ( id VARCHAR(50) PRIMARY KEY, val TEXT ) """ postgresCreate = """ CREATE TABLE so_string_id ( id VARCHAR(50) PRIMARY KEY, val TEXT ) """ sybaseCreate = """ CREATE TABLE so_string_id ( id VARCHAR(50) UNIQUE, val VARCHAR(50) NULL ) """ firebirdCreate = """ CREATE TABLE so_string_id ( id VARCHAR(50) NOT NULL PRIMARY KEY, val BLOB SUB_TYPE TEXT ) """ mssqlCreate = """ CREATE TABLE so_string_id ( id VARCHAR(50) PRIMARY KEY, val varchar(4000) ) """ sqliteCreate = postgresCreate mysqlDrop = """ DROP TABLE IF EXISTS so_string_id """ postgresDrop = """ DROP TABLE so_string_id """ sqliteDrop = postgresDrop firebirdDrop = postgresDrop mssqlDrop = postgresDrop def test_stringID(): setupClass(SOStringID) t1 = SOStringID(id='hey', val='whatever') t2 = SOStringID.byVal('whatever') assert t1 == t2 assert t1.val == t2.val assert t1.val == 'whatever' t1 = SOStringID(id='you', val='nowhere') t2 = SOStringID.get('you') assert t1 == t2 assert t1.val == t2.val assert t1.val == 'nowhere' SQLObject-3.4.0/sqlobject/tests/test_sqlbuilder.py0000644000175000017500000001032613037500320021556 0ustar phdphd00000000000000from sqlobject import IntCol, SQLObject, StringCol from sqlobject.compat import PY2 from sqlobject.sqlbuilder import AND, CONCAT, Delete, Insert, SQLOp, Select, \ Union, Update, const, func, sqlrepr from sqlobject.tests.dbtest import getConnection, raises, setupClass class SOTestSQLBuilder(SQLObject): name = StringCol() so_value = IntCol() def test_Select(): setupClass(SOTestSQLBuilder) select1 = Select([const.id, func.MAX(const.salary)], staticTables=['employees']) assert sqlrepr(select1) == 'SELECT id, MAX(salary) FROM employees' select2 = Select([SOTestSQLBuilder.q.name, SOTestSQLBuilder.q.so_value]) assert sqlrepr(select2) == \ 'SELECT so_test_sql_builder.name, so_test_sql_builder.so_value ' \ 'FROM 
so_test_sql_builder' union = Union(select1, select2) assert sqlrepr(union) == \ 'SELECT id, MAX(salary) FROM employees ' \ 'UNION SELECT so_test_sql_builder.name, ' \ 'so_test_sql_builder.so_value ' \ 'FROM so_test_sql_builder' union = Union(SOTestSQLBuilder.select().queryForSelect()) assert sqlrepr(union) == \ 'SELECT so_test_sql_builder.id, so_test_sql_builder.name, ' \ 'so_test_sql_builder.so_value FROM so_test_sql_builder WHERE 1 = 1' def test_empty_AND(): assert AND() is None assert AND(True) is True # sqlrepr() is needed because AND() returns an SQLExpression that overrides # comparison. The following # AND('x', 'y') == "foo bar" # is True! (-: Eeek! assert sqlrepr(AND(1, 2)) == sqlrepr(SQLOp("AND", 1, 2)) == "((1) AND (2))" assert sqlrepr(AND(1, 2, '3'), "sqlite") == \ sqlrepr(SQLOp("AND", 1, SQLOp("AND", 2, '3')), "sqlite") == \ "((1) AND ((2) AND ('3')))" def test_modulo(): setupClass(SOTestSQLBuilder) assert sqlrepr(SOTestSQLBuilder.q.so_value % 2 == 0, 'mysql') == \ "((MOD(so_test_sql_builder.so_value, 2)) = (0))" assert sqlrepr(SOTestSQLBuilder.q.so_value % 2 == 0, 'sqlite') == \ "(((so_test_sql_builder.so_value) % (2)) = (0))" def test_str_or_sqlrepr(): select = Select(['id', 'name'], staticTables=['employees'], where='value>0', orderBy='id') assert sqlrepr(select, 'sqlite') == \ 'SELECT id, name FROM employees WHERE value>0 ORDER BY id' select = Select(['id', 'name'], staticTables=['employees'], where='value>0', orderBy='id', lazyColumns=True) assert sqlrepr(select, 'sqlite') == \ 'SELECT id FROM employees WHERE value>0 ORDER BY id' insert = Insert('employees', values={'id': 1, 'name': 'test'}) assert sqlrepr(insert, 'sqlite') == \ "INSERT INTO employees (id, name) VALUES (1, 'test')" update = Update('employees', {'name': 'test'}, where='id=1') assert sqlrepr(update, 'sqlite') == \ "UPDATE employees SET name='test' WHERE id=1" update = Update('employees', {'name': 'test', 'age': 42}, where='id=1') assert sqlrepr(update, 'sqlite') == \ "UPDATE employees SET age=42, name='test' WHERE id=1" delete = Delete('employees', where='id=1') assert sqlrepr(delete, 'sqlite') == \ "DELETE FROM employees WHERE id=1" raises(TypeError, Delete, 'employees') delete = Delete('employees', where=None) assert sqlrepr(delete, 'sqlite') == \ "DELETE FROM employees" def test_CONCAT(): setupClass(SOTestSQLBuilder) SOTestSQLBuilder(name='test', so_value=42) assert sqlrepr(CONCAT('a', 'b'), 'mysql') == "CONCAT('a', 'b')" assert sqlrepr(CONCAT('a', 'b'), 'mssql') == "'a' + 'b'" assert sqlrepr(CONCAT('a', 'b'), 'sqlite') == "'a' || 'b'" assert sqlrepr(CONCAT('prefix', SOTestSQLBuilder.q.name), 'mysql') == \ "CONCAT('prefix', so_test_sql_builder.name)" assert sqlrepr(CONCAT('prefix', SOTestSQLBuilder.q.name), 'sqlite') == \ "'prefix' || so_test_sql_builder.name" select = Select([CONCAT(SOTestSQLBuilder.q.name, '-suffix')], staticTables=['so_test_sql_builder']) connection = getConnection() rows = connection.queryAll(connection.sqlrepr(select)) result = rows[0][0] if not PY2 and not isinstance(result, str): result = result.decode('ascii') assert result == "test-suffix" SQLObject-3.4.0/sqlobject/tests/test_slice.py0000644000175000017500000000303412775273363020533 0ustar phdphd00000000000000import pytest from sqlobject import IntCol, SQLObject from sqlobject.tests.dbtest import setupClass, supports ######################################## # Slicing tests ######################################## def listrange(*args): """Always return a list, for py3k compatibility""" return list(range(*args)) class 
Counter(SQLObject): number = IntCol(notNull=True) class TestSlice: def setup_method(self, meth): setupClass(Counter) for i in range(100): Counter(number=i) def counterEqual(self, counters, value): if not supports('limitSelect'): pytest.skip("limitSelect isn't supported") assert [c.number for c in counters] == value def test_slice(self): self.counterEqual( Counter.select(None, orderBy='number'), listrange(100)) self.counterEqual( Counter.select(None, orderBy='number')[10:20], listrange(10, 20)) self.counterEqual( Counter.select(None, orderBy='number')[20:30][:5], listrange(20, 25)) self.counterEqual( Counter.select(None, orderBy='number')[20:30][1:5], listrange(21, 25)) self.counterEqual( Counter.select(None, orderBy='number')[:-10], listrange(0, 90)) self.counterEqual( Counter.select(None, orderBy='number', reversed=True), listrange(99, -1, -1)) self.counterEqual( Counter.select(None, orderBy='-number'), listrange(99, -1, -1)) SQLObject-3.4.0/sqlobject/tests/test_parse_uri.py0000644000175000017500000001170213036245075021414 0ustar phdphd00000000000000import os from sqlobject.dbconnection import DBConnection from sqlobject.sqlite.sqliteconnection import SQLiteConnection ######################################## # Test _parseURI ######################################## def test_parse(): _parseURI = DBConnection._parseURI user, password, host, port, path, args = _parseURI("mysql://host/database") assert user is None assert password is None assert host == "host" assert port is None assert path == "/database" assert args == {} user, password, host, port, path, args = _parseURI( "mysql://user:pass%20word@host/database?unix_socket=/var/mysql/socket") assert user == "user" assert password == "pass word" assert host == "host" assert port is None assert path == "/database" assert args == {"unix_socket": "/var/mysql/socket"} user, password, host, port, path, args = \ _parseURI("postgres://user@host/database") assert user == "user" assert password is None assert host == "host" assert port is None assert path == "/database" assert args == {} user, password, host, port, path, args = \ _parseURI("postgres://host:5432/database") assert user is None assert password is None assert host == "host" assert port == 5432 assert path == "/database" assert args == {} user, password, host, port, path, args = \ _parseURI("postgres:///full/path/to/socket/database") assert user is None assert password is None assert host is None assert port is None assert path == "/full/path/to/socket/database" assert args == {} user, password, host, port, path, args = \ _parseURI("postgres://us%3Aer:p%40ssword@host/database") assert user == "us:er" assert password == "p@ssword" assert host == "host" assert port is None assert path == "/database" assert args == {} user, password, host, port, path, args = \ _parseURI("sqlite:///full/path/to/database") assert user is None assert password is None assert host is None assert port is None assert path == "/full/path/to/database" assert args == {} user, password, host, port, path, args = _parseURI("sqlite:/:memory:") assert user is None assert password is None assert host is None assert port is None assert path == "/:memory:" assert args == {} if os.name == 'nt': user, password, host, port, path, args = \ _parseURI("sqlite:/C|/full/path/to/database") assert user is None assert password is None assert host is None assert port is None assert path == "C:/full/path/to/database" assert args == {} user, password, host, port, path, args = \ _parseURI("sqlite:///C:/full/path/to/database") assert user 
is None assert password is None assert host is None assert port is None assert path == "C:/full/path/to/database" assert args == {} def test_uri(): connection = DBConnection() connection.close = lambda: None connection.dbName, connection.host, connection.port, \ connection.user, connection.password, connection.db = \ 'mysql', 'host', None, None, None, 'database' assert connection.uri() == "mysql://host/database" connection.dbName, connection.host, connection.port, \ connection.user, connection.password, connection.db = \ 'mysql', 'host', None, 'user', 'pass word', 'database' assert connection.uri() == "mysql://user:pass%20word@host/database" connection.dbName, connection.host, connection.port, \ connection.user, connection.password, connection.db = \ 'postgres', 'host', None, 'user', None, 'database' assert connection.uri() == "postgres://user@host/database" connection.dbName, connection.host, connection.port, \ connection.user, connection.password, connection.db = \ 'postgres', 'host', 5432, None, None, 'database' assert connection.uri() == "postgres://host:5432/database" connection.dbName, connection.host, connection.port, \ connection.user, connection.password, connection.db = \ 'postgres', None, None, None, None, '/full/path/to/socket/database' assert connection.uri() == "postgres:///full/path/to/socket/database" connection.dbName, connection.host, connection.port, \ connection.user, connection.password, connection.db = \ 'postgres', 'host', None, 'us:er', 'p@ssword', 'database' assert connection.uri() == "postgres://us%3Aer:p%40ssword@host/database" connection = SQLiteConnection(None) connection.filename = '/full/path/to/database' assert connection.uri() == "sqlite:///full/path/to/database" connection.filename = ':memory:' assert connection.uri() == "sqlite:/:memory:" if os.name == 'nt': connection.filename = 'C:/full/path/to/database' assert connection.uri() == "sqlite:///C%3A/full/path/to/database" SQLObject-3.4.0/sqlobject/tests/test_default_style.py0000644000175000017500000000371512770605203022271 0ustar phdphd00000000000000""" Test the default styles, to guarantee consistency. """ from sqlobject import ForeignKey, MixedCaseStyle, MixedCaseUnderscoreStyle, \ SQLObject, StringCol, Style # Hash of styles versus the database names resulting from 'columns' below. columns = ["ABCUpper", "abc_lower", "ABCamelCaseColumn"] styles = { Style: columns, MixedCaseUnderscoreStyle: ["abc_upper", "abc_lower", "ab_camel_case_column"], MixedCaseStyle: ["ABCUpper", "Abc_lower", "ABCamelCaseColumn"], } # Hash of styles versus the database names # resulting from a foreign key named 'FKey'. 
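# --- Added sketch (not part of the original test module) ------------------
# The `styles` dict above records how each Style subclass rewrites Python
# attribute names into database column names.  A minimal illustration of
# the same mapping, assuming Style exposes pythonAttrToDBColumn() (the
# method used to derive dbName when none is given):


def _sketch_style_column_names():
    style = MixedCaseUnderscoreStyle()
    # The expected value comes straight from the `styles` dict above.
    assert style.pythonAttrToDBColumn('ABCamelCaseColumn') == \
        'ab_camel_case_column'

# The fkey/fkeys pair below plays the same role for a ForeignKey
# named 'FKey'.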
fkey = ForeignKey("DefaultStyleTest", name="FKey") fkeys = { Style: "FKeyID", MixedCaseUnderscoreStyle: "f_key_id", MixedCaseStyle: "FKeyID", } def make_columns(): global columns columns = [] for col_name in columns: columns.append(StringCol(name=col_name, length=10)) def do_col_test(DefaultStyleTest, style, dbnames): DefaultStyleTest.sqlmeta.style = style() for col, old_dbname in zip(columns, dbnames): DefaultStyleTest.sqlmeta.addColumn(col) try: new_dbname = DefaultStyleTest.sqlmeta.columns[col.name].dbName assert new_dbname == old_dbname finally: if col.name in DefaultStyleTest.sqlmeta.columns: DefaultStyleTest.sqlmeta.delColumn(col) def do_fkey_test(DefaultStyleTest, style, dbname): DefaultStyleTest.sqlmeta.style = style() DefaultStyleTest.sqlmeta.addColumn(fkey) try: assert list(DefaultStyleTest.sqlmeta.columns.keys())[0] == "FKeyID" assert list(DefaultStyleTest.sqlmeta.columns.values())[0].dbName == \ dbname finally: DefaultStyleTest.sqlmeta.delColumn(fkey) class DefaultStyleTest(SQLObject): pass def test_default_styles(): make_columns() for style in styles: do_col_test(DefaultStyleTest, style, styles[style]) do_fkey_test(DefaultStyleTest, style, fkeys[style]) SQLObject-3.4.0/sqlobject/tests/test_inheritance.py0000644000175000017500000000120012747416060021706 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import setupClass ######################################## # Inheritance ######################################## class Super(SQLObject): name = StringCol(length=10) class Sub(Super): name2 = StringCol(length=10) def test_super(): setupClass(Super) setupClass(Sub) s1 = Super(name='one') Super(name='two') # s2 s3 = Super.get(s1.id) assert s1 == s3 def test_sub(): setupClass(Super) setupClass(Sub) s1 = Sub(name='one', name2='1') Sub(name='two', name2='2') # s2 s3 = Sub.get(s1.id) assert s1 == s3 SQLObject-3.4.0/sqlobject/tests/test_pickle.py0000644000175000017500000000235112775273363020704 0ustar phdphd00000000000000import pickle import pytest from sqlobject import IntCol, SQLObject, StringCol from sqlobject.tests.dbtest import getConnection, raises, setupClass ######################################## # Pickle instances ######################################## class SOTestPickle(SQLObject): question = StringCol() answer = IntCol() test_question = 'The Ulimate Question of Life, the Universe and Everything' test_answer = 42 def test_pickleCol(): setupClass(SOTestPickle) connection = SOTestPickle._connection test = SOTestPickle(question=test_question, answer=test_answer) pickle_data = pickle.dumps(test, pickle.HIGHEST_PROTOCOL) connection.cache.clear() test = pickle.loads(pickle_data) test2 = connection.cache.tryGet(test.id, SOTestPickle) assert test2 is test assert test.question == test_question assert test.answer == test_answer if (connection.dbName == 'sqlite') and connection._memory: pytest.skip("The following test requires a different connection") test = SOTestPickle.get( test.id, # make a different DB URI and open another connection connection=getConnection(registry='')) raises(pickle.PicklingError, pickle.dumps, test, pickle.HIGHEST_PROTOCOL) SQLObject-3.4.0/sqlobject/tests/test_perConnection.py0000644000175000017500000000120112770603634022225 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import getConnection ######################################## # Per-instance connection ######################################## class SOTestPerConnection(SQLObject): test = StringCol() def 
test_perConnection(): connection = getConnection() SOTestPerConnection.dropTable(connection=connection, ifExists=True) SOTestPerConnection.createTable(connection=connection) SOTestPerConnection(test='test', connection=connection) assert len(list(SOTestPerConnection.select( SOTestPerConnection.q.test == 'test', connection=connection))) == 1 SQLObject-3.4.0/sqlobject/tests/test_complex_sorting.py0000644000175000017500000000524512747416060022646 0ustar phdphd00000000000000from sqlobject import IntCol, SQLObject, StringCol, sqlmeta from sqlobject.tests.dbtest import inserts, setupClass # Test more complex orderBy clauses class ComplexNames(SQLObject): class sqlmeta(sqlmeta): table = 'names_table' defaultOrder = ['lastName', 'firstName', 'phone', 'age'] firstName = StringCol(length=30) lastName = StringCol(length=30) phone = StringCol(length=11) age = IntCol() def setupComplexNames(): setupClass(ComplexNames) inserts(ComplexNames, [('aj', 'baker', '555-444-333', 34), ('joe', 'robbins', '444-555-333', 34), ('tim', 'jackson', '555-444-222', 32), ('joe', 'baker', '222-111-000', 24), ('zoe', 'robbins', '444-555-333', 46)], schema='firstName lastName phone age') def nameList(names): result = [] for name in names: result.append('%s %s' % (name.firstName, name.lastName)) return result def firstList(names): return [n.firstName for n in names] def test_defaultComplexOrder(): setupComplexNames() assert nameList(ComplexNames.select()) == \ ['aj baker', 'joe baker', 'tim jackson', 'joe robbins', 'zoe robbins'] def test_complexOrders(): setupComplexNames() assert nameList(ComplexNames.select().orderBy(['age', 'phone', 'firstName', 'lastName'])) == \ ['joe baker', 'tim jackson', 'joe robbins', 'aj baker', 'zoe robbins'] assert nameList(ComplexNames.select().orderBy(['-age', 'phone', 'firstName', 'lastName'])) == \ ['zoe robbins', 'joe robbins', 'aj baker', 'tim jackson', 'joe baker'] assert nameList(ComplexNames.select().orderBy(['age', '-phone', 'firstName', 'lastName'])) == \ ['joe baker', 'tim jackson', 'aj baker', 'joe robbins', 'zoe robbins'] assert nameList(ComplexNames.select().orderBy(['-firstName', 'phone', 'lastName', 'age'])) == \ ['zoe robbins', 'tim jackson', 'joe baker', 'joe robbins', 'aj baker'] assert nameList(ComplexNames.select().orderBy(['-firstName', '-phone', 'lastName', 'age'])) == \ ['zoe robbins', 'tim jackson', 'joe robbins', 'joe baker', 'aj baker'] SQLObject-3.4.0/sqlobject/tests/test_auto.py0000644000175000017500000001537713037500320020373 0ustar phdphd00000000000000from datetime import datetime from pytest import raises from sqlobject import KeyCol, MultipleJoin, SQLObject, StringCol, \ classregistry, sqlmeta from sqlobject.col import use_microseconds from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # Dynamic column tests ######################################## now = datetime.now class Person(SQLObject): class sqlmeta: defaultOrder = 'name' name = StringCol(length=100, dbName='name_col') class Phone(SQLObject): class sqlmeta: defaultOrder = 'phone' phone = StringCol(length=12) class TestPeople: def setup_method(self, meth): setupClass(Person) setupClass(Phone) for n in ['jane', 'tim', 'bob', 'jake']: Person(name=n) for p in ['555-555-5555', '555-394-2930', '444-382-4854']: Phone(phone=p) def test_defaultOrder(self): assert list(Person.select('all')) == list( Person.select('all', orderBy=Person.sqlmeta.defaultOrder)) def test_dynamicColumn(self): nickname = StringCol('nickname', length=10) Person.sqlmeta.addColumn(nickname, 
changeSchema=True) Person(name='robert', nickname='bob') assert ([p.name for p in Person.select('all')] == ['bob', 'jake', 'jane', 'robert', 'tim']) Person.sqlmeta.delColumn(nickname, changeSchema=True) def test_dynamicJoin(self): col = KeyCol('person', foreignKey='Person') Phone.sqlmeta.addColumn(col, changeSchema=True) join = MultipleJoin('Phone') Person.sqlmeta.addJoin(join) for phone in Phone.select('all'): if phone.phone.startswith('555'): phone.person = Person.selectBy(name='tim')[0] else: phone.person = Person.selectBy(name='bob')[0] l = [p.phone for p in Person.selectBy(name='tim')[0].phones] l.sort() assert l == ['555-394-2930', '555-555-5555'] Phone.sqlmeta.delColumn(col, changeSchema=True) Person.sqlmeta.delJoin(join) def _test_collidingName(self): class CollidingName(SQLObject): expire = StringCol() def test_collidingName(self): raises(AssertionError, Person.sqlmeta.addColumn, StringCol(name="name")) raises(AssertionError, Person.sqlmeta.addColumn, StringCol(name="_init")) raises(AssertionError, Person.sqlmeta.addColumn, StringCol(name="expire")) raises(AssertionError, Person.sqlmeta.addColumn, StringCol(name="set")) raises(AssertionError, self._test_collidingName) ######################################## # Auto class generation ######################################## class TestAuto: mysqlCreate = """ CREATE TABLE IF NOT EXISTS auto_test ( auto_id INT AUTO_INCREMENT PRIMARY KEY, first_name VARCHAR(100), last_name VARCHAR(200) NOT NULL, age INT DEFAULT NULL, created DATETIME NOT NULL, happy char(1) DEFAULT 'Y' NOT NULL, long_field TEXT, wannahavefun TINYINT DEFAULT 0 NOT NULL ) """ postgresCreate = """ CREATE TABLE auto_test ( auto_id SERIAL PRIMARY KEY, first_name VARCHAR(100), last_name VARCHAR(200) NOT NULL, age INT DEFAULT 0, created TIMESTAMP NOT NULL, happy char(1) DEFAULT 'Y' NOT NULL, long_field TEXT, wannahavefun BOOL DEFAULT FALSE NOT NULL ) """ rdbhostCreate = """ CREATE TABLE auto_test ( auto_id SERIAL PRIMARY KEY, first_name VARCHAR(100), last_name VARCHAR(200) NOT NULL, age INT DEFAULT 0, created VARCHAR(40) NOT NULL, happy char(1) DEFAULT 'Y' NOT NULL, long_field TEXT, wannahavefun BOOL DEFAULT FALSE NOT NULL ) """ sqliteCreate = """ CREATE TABLE auto_test ( auto_id INTEGER PRIMARY KEY AUTOINCREMENT , first_name VARCHAR(100), last_name VARCHAR(200) NOT NULL, age INT DEFAULT NULL, created DATETIME NOT NULL, happy char(1) DEFAULT 'Y' NOT NULL, long_field TEXT, wannahavefun INT DEFAULT 0 NOT NULL ) """ sybaseCreate = """ CREATE TABLE auto_test ( auto_id integer, first_name VARCHAR(100), last_name VARCHAR(200) NOT NULL, age INT DEFAULT 0, created DATETIME NOT NULL, happy char(1) DEFAULT 'Y' NOT NULL, long_field TEXT, wannahavefun BIT default(0) NOT NULL ) """ mssqlCreate = """ CREATE TABLE auto_test ( auto_id int identity(1,1), first_name VARCHAR(100), last_name VARCHAR(200) NOT NULL, age INT DEFAULT 0, created DATETIME NOT NULL, happy char(1) DEFAULT 'Y' NOT NULL, long_field TEXT, wannahavefun BIT default(0) NOT NULL ) """ mysqlDrop = """ DROP TABLE IF EXISTS auto_test """ postgresDrop = """ DROP TABLE auto_test """ sqliteDrop = sybaseDrop = mssqlDrop = rdbhostDrop = postgresDrop def setup_method(self, meth): conn = getConnection() dbName = conn.dbName creator = getattr(self, dbName + 'Create', None) if creator: conn.query(creator) def teardown_method(self, meth): conn = getConnection() dbName = conn.dbName dropper = getattr(self, dbName + 'Drop', None) if dropper: try: conn.query(dropper) except: # Perhaps we don't have DROP permission pass def 
test_classCreate(self): class AutoTest(SQLObject): _connection = getConnection() class sqlmeta(sqlmeta): idName = 'auto_id' fromDatabase = True if AutoTest._connection.dbName == 'mssql': use_microseconds(False) john = AutoTest(firstName='john', lastName='doe', age=10, created=now(), wannahavefun=False, longField='x' * 1000) jane = AutoTest(firstName='jane', lastName='doe', happy='N', created=now(), wannahavefun=True, longField='x' * 1000) assert not john.wannahavefun assert jane.wannahavefun assert john.longField == 'x' * 1000 assert jane.longField == 'x' * 1000 del classregistry.registry( AutoTest.sqlmeta.registry).classes['AutoTest'] columns = AutoTest.sqlmeta.columns assert columns["lastName"].dbName == "last_name" assert columns["wannahavefun"].dbName == "wannahavefun" if AutoTest._connection.dbName == 'mssql': use_microseconds(True) SQLObject-3.4.0/sqlobject/tests/test_sqlobject_admin.py0000644000175000017500000000206012770603634022561 0ustar phdphd00000000000000# These tests are not enabled yet, but here are some working examples # of using createSQL so far. from sqlobject import SQLObject, StringCol class SOTest1(SQLObject): class sqlmeta: createSQL = "CREATE SEQUENCE db_test1_seq;" test1 = StringCol() class SOTest2(SQLObject): class sqlmeta: createSQL = ["CREATE SEQUENCE db_test2_seq;", "ALTER TABLE test2 ADD CHECK(test2 != '');"] test2 = StringCol() class SOTest3(SQLObject): class sqlmeta: createSQL = {'postgres': 'CREATE SEQUENCE db_test3_seq;', 'mysql': 'CREATE SEQUENCE db_test3_seq;'} test3 = StringCol() class SOTest4(SQLObject): class sqlmeta: createSQL = {'postgres': ['CREATE SEQUENCE db_test4_seq;', "ALTER TABLE test4 ADD CHECK(test4 != '');"], 'mysql': 'CREATE SEQUENCE db_test4_seq;'} test4 = StringCol() class SOTest5(SQLObject): class sqlmeta: createSQL = {'mysql': 'CREATE SEQUENCE db_test5_seq;'} test5 = StringCol() SQLObject-3.4.0/sqlobject/tests/test_sqlbuilder_importproxy.py0000644000175000017500000000256112747416060024272 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.sqlbuilder import Alias, ImportProxy, tablesUsedSet from sqlobject.views import ViewSQLObject def testSimple(): nyi = ImportProxy('NotYetImported') x = nyi.q.name class NotYetImported(SQLObject): name = StringCol(dbName='a_name') y = nyi.q.name assert str(x) == 'not_yet_imported.a_name' assert str(y) == 'not_yet_imported.a_name' def testAddition(): nyi = ImportProxy('NotYetImported2') x = nyi.q.name + nyi.q.name class NotYetImported2(SQLObject): name = StringCol(dbName='a_name') assert str(x) == \ '((not_yet_imported2.a_name) + (not_yet_imported2.a_name))' def testOnView(): nyi = ImportProxy('NotYetImportedV') x = nyi.q.name class NotYetImported3(SQLObject): name = StringCol(dbName='a_name') class NotYetImportedV(ViewSQLObject): class sqlmeta: idName = NotYetImported3.q.id name = StringCol(dbName=NotYetImported3.q.name) assert str(x) == 'not_yet_imported_v.name' def testAlias(): nyi = ImportProxy('NotYetImported4') y = Alias(nyi, 'y') x = y.q.name class NotYetImported4(SQLObject): name = StringCol(dbName='a_name') assert str(y) == 'not_yet_imported4 y' assert tablesUsedSet(x, None) == set(['not_yet_imported4 y']) assert str(x) == 'y.a_name' SQLObject-3.4.0/sqlobject/tests/test_csvexport.py0000644000175000017500000000535612747416060021472 0ustar phdphd00000000000000from __future__ import print_function try: from StringIO import StringIO except ImportError: from io import StringIO from sqlobject import IntCol, SQLObject, StringCol from sqlobject.util.csvexport import 
export_csv, export_csv_zip from .dbtest import setupClass def assert_export(result, *args, **kw): f = StringIO() kw['writer'] = f export_csv(*args, **kw) s = f.getvalue().replace('\r\n', '\n') if result.strip() != s.strip(): print('**Expected:') print(result) print('**Got:') print(s) assert result.strip() == s.strip() class SimpleCSV(SQLObject): name = StringCol() address = StringCol() address.csvTitle = 'Street Address' hidden = StringCol() hidden.noCSV = True class ComplexCSV(SQLObject): fname = StringCol() lname = StringCol() age = IntCol() extraCSVColumns = [('name', 'Full Name'), 'initials'] # initials should end up at the end then: csvColumnOrder = ['name', 'fname', 'lname', 'age'] def _get_name(self): return self.fname + ' ' + self.lname def _get_initials(self): return self.fname[0] + self.lname[0] def test_simple(): setupClass(SimpleCSV) SimpleCSV(name='Bob', address='432W', hidden='boo') SimpleCSV(name='Joe', address='123W', hidden='arg') assert_export("""\ name,Street Address Bob,432W Joe,123W """, SimpleCSV, orderBy='name') assert_export("""\ name,Street Address Joe,123W Bob,432W """, SimpleCSV, orderBy='address') assert_export("""\ name,Street Address Joe,123W """, SimpleCSV.selectBy(name='Joe')) def test_complex(): setupClass(ComplexCSV) ComplexCSV(fname='John', lname='Doe', age=40) ComplexCSV(fname='Bob', lname='Dylan', age=60) ComplexCSV(fname='Harriet', lname='Tubman', age=160) assert_export("""\ Full Name,fname,lname,age,initials John Doe,John,Doe,40,JD Bob Dylan,Bob,Dylan,60,BD Harriet Tubman,Harriet,Tubman,160,HT """, ComplexCSV, orderBy='lname') assert_export("""\ Full Name,fname,lname,age,initials Bob Dylan,Bob,Dylan,60,BD John Doe,John,Doe,40,JD """, ComplexCSV.select(ComplexCSV.q.lname.startswith('D'), orderBy='fname')) def test_zip(): # Just exercise tests, doesn't actually test results setupClass(SimpleCSV) SimpleCSV(name='Bob', address='432W', hidden='boo') SimpleCSV(name='Joe', address='123W', hidden='arg') setupClass(ComplexCSV) ComplexCSV(fname='John', lname='Doe', age=40) ComplexCSV(fname='Bob', lname='Dylan', age=60) ComplexCSV(fname='Harriet', lname='Tubman', age=160) s = export_csv_zip([SimpleCSV, ComplexCSV]) assert isinstance(s, bytes) and s s = export_csv_zip([SimpleCSV.selectBy(name='Bob'), (ComplexCSV, list(ComplexCSV.selectBy(fname='John')))]) SQLObject-3.4.0/sqlobject/tests/test_create_drop.py0000644000175000017500000000203313036106435021703 0ustar phdphd00000000000000from sqlobject import BLOBCol, DateTimeCol, IntCol, SQLObject, StringCol, \ sqlmeta from sqlobject.tests.dbtest import getConnection class SOTestCreateDrop(SQLObject): class sqlmeta(sqlmeta): idName = 'test_id_here' table = 'test_create_drop_table' name = StringCol() number = IntCol() so_time = DateTimeCol() short = StringCol(length=10) blobcol = BLOBCol() def test_create_drop(): conn = getConnection() SOTestCreateDrop.setConnection(conn) SOTestCreateDrop.dropTable(ifExists=True) assert not conn.tableExists(SOTestCreateDrop.sqlmeta.table) SOTestCreateDrop.createTable(ifNotExists=True) assert conn.tableExists(SOTestCreateDrop.sqlmeta.table) SOTestCreateDrop.createTable(ifNotExists=True) assert conn.tableExists(SOTestCreateDrop.sqlmeta.table) SOTestCreateDrop.dropTable(ifExists=True) assert not conn.tableExists(SOTestCreateDrop.sqlmeta.table) SOTestCreateDrop.dropTable(ifExists=True) assert not conn.tableExists(SOTestCreateDrop.sqlmeta.table) SQLObject-3.4.0/sqlobject/tests/test_SQLRelatedJoin.py0000644000175000017500000000354512775273363022223 0ustar phdphd00000000000000import pytest 
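# This test module exercises the difference between RelatedJoin and
# SQLRelatedJoin: a RelatedJoin attribute materialises the join as a plain
# Python list, while a SQLRelatedJoin attribute returns a lazy SelectResults
# that can be counted, sliced and filtered by the database.  A minimal sketch
# of the idea, assuming the tables have been created (see createAllTables
# below) and using the Fighter and Tourtment classes defined in this module:
#
#     t = Tourtment(name='Tourtment #1')
#     t.addFighter(Fighter(name='gokou'))
#     t.fightersAsList               # plain list of Fighter instances
#     t.fightersAsSResult.count()    # count computed with a SQL query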
from sqlobject import RelatedJoin, SQLObject, SQLRelatedJoin, StringCol from sqlobject.tests.dbtest import setupClass, supports class Fighter(SQLObject): class sqlmeta: idName = 'fighter_id' # test in a non-standard way name = StringCol() tourtments = RelatedJoin('Tourtment') class Tourtment(SQLObject): class sqlmeta: table = 'competition' # test in a non-standard way name = StringCol() fightersAsList = RelatedJoin('Fighter') fightersAsSResult = SQLRelatedJoin('Fighter') def createAllTables(): setupClass(Fighter) setupClass(Tourtment) def test_1(): createAllTables() # create some tourtments t1 = Tourtment(name='Tourtment #1') t2 = Tourtment(name='Tourtment #2') t3 = Tourtment(name='Tourtment #3') # create some fighters gokou = Fighter(name='gokou') vegeta = Fighter(name='vegeta') gohan = Fighter(name='gohan') trunks = Fighter(name='trunks') # relating them t1.addFighter(gokou) t1.addFighter(vegeta) t1.addFighter(gohan) t2.addFighter(gokou) t2.addFighter(vegeta) t2.addFighter(trunks) t3.addFighter(gohan) t3.addFighter(trunks) # do some selects for i, j in zip(t1.fightersAsList, t1.fightersAsSResult): assert i is j assert len(t2.fightersAsList) == t2.fightersAsSResult.count() def test_related_join_transaction(): if not supports('transactions'): pytest.skip("Transactions aren't supported") createAllTables() trans = Tourtment._connection.transaction() try: t1 = Tourtment(name='Tourtment #1', connection=trans) t1.addFighter(Fighter(name='Jim', connection=trans)) assert t1.fightersAsSResult.count() == 1 assert t1.fightersAsSResult[0]._connection == trans finally: trans.commit(True) Tourtment._connection.autoCommit = True SQLObject-3.4.0/sqlobject/tests/test_unicode.py0000644000175000017500000000475713036106435021055 0ustar phdphd00000000000000from sqlobject import IntCol, SQLObject, UnicodeCol from sqlobject.compat import PY2 from sqlobject.tests.dbtest import setupClass ######################################## # Unicode columns ######################################## class SOTestUnicode(SQLObject): so_count = IntCol(alternateID=True) col = UnicodeCol(alternateID=True, length=100) data = [u'\u00f0', u'test', 'ascii test'] items = [] def setup(): global items items = [] setupClass(SOTestUnicode) for i, s in enumerate(data): items.append(SOTestUnicode(so_count=i, col=s)) def test_create(): setup() for s, item in zip(data, items): assert item.col == s conn = SOTestUnicode._connection if PY2: rows = conn.queryAll(""" SELECT so_count, col FROM so_test_unicode ORDER BY so_count """) for so_count, col in rows: if not isinstance(col, bytes): col = col.encode('utf-8') assert data[so_count].encode('utf-8') == col else: rows = conn.queryAll(""" SELECT so_count, col FROM so_test_unicode ORDER BY so_count """) # On Python 3, everything is already decoded to unicode for so_count, col in rows: assert data[so_count] == col def test_select(): setup() for i, value in enumerate(data): rows = list(SOTestUnicode.select(SOTestUnicode.q.col == value)) assert len(rows) == 1 if PY2: rows = list(SOTestUnicode.select(SOTestUnicode.q.col == value)) assert len(rows) == 1 rows = list(SOTestUnicode.selectBy(col=value)) assert len(rows) == 1 if PY2: rows = list(SOTestUnicode.selectBy(col=value)) assert len(rows) == 1 row = SOTestUnicode.byCol(value) assert row.so_count == i # starts/endswith/contains rows = list(SOTestUnicode.select(SOTestUnicode.q.col.startswith("test"))) assert len(rows) == 1 rows = list(SOTestUnicode.select(SOTestUnicode.q.col.endswith("test"))) assert len(rows) == 2 rows =
list(SOTestUnicode.select(SOTestUnicode.q.col.contains("test"))) assert len(rows) == 2 rows = list(SOTestUnicode.select( SOTestUnicode.q.col.startswith(u"\u00f0"))) assert len(rows) == 1 rows = list(SOTestUnicode.select(SOTestUnicode.q.col.endswith(u"\u00f0"))) assert len(rows) == 1 rows = list(SOTestUnicode.select(SOTestUnicode.q.col.contains(u"\u00f0"))) assert len(rows) == 1 SQLObject-3.4.0/sqlobject/tests/test_csvimport.py0000644000175000017500000000076613125024462021453 0ustar phdphd00000000000000import csv from datetime import datetime from sqlobject.util.csvimport import load_csv csv_data = """\ name:str,age:datetime,value:int Test,2000-01-01 21:44:33,42""" def test_load_csv(): loaded = load_csv(csv.reader(csv_data.split('\n')), default_class='SQLObject') assert loaded == { 'SQLObject': [ { 'age': datetime(2000, 1, 1, 21, 44, 33), 'name': 'Test', 'value': 42 } ] } SQLObject-3.4.0/sqlobject/tests/test_converters.py0000644000175000017500000001747313036460624021627 0ustar phdphd00000000000000from datetime import timedelta from sqlobject.col import SODateTimeCol, SOTimeCol from sqlobject.converters import registerConverter, sqlrepr, \ quote_str, unquote_str from sqlobject.sqlbuilder import SQLExpression, SQLObjectField, \ Select, Insert, Update, Delete, Replace, \ SQLTrueClauseClass, SQLConstant, SQLPrefix, SQLCall, SQLOp, \ _LikeQuoted class SOTestClass: def __repr__(self): return '' def SOTestClassConverter(value, db): return repr(value) registerConverter(SOTestClass, SOTestClassConverter) class NewSOTestClass: __metaclass__ = type def __repr__(self): return '' def NewSOTestClassConverter(value, db): return repr(value) registerConverter(NewSOTestClass, NewSOTestClassConverter) def _sqlrepr(self, db): return '<%s>' % self.__class__.__name__ SQLExpression.__sqlrepr__ = _sqlrepr ############################################################ # Tests ############################################################ def test_simple_string(): assert sqlrepr('A String', 'firebird') == "'A String'" def test_string_newline(): assert sqlrepr('A String\nAnother', 'postgres') == "E'A String\\nAnother'" assert sqlrepr('A String\nAnother', 'sqlite') == "'A String\nAnother'" def test_string_tab(): assert sqlrepr('A String\tAnother', 'postgres') == "E'A String\\tAnother'" def test_string_r(): assert sqlrepr('A String\rAnother', 'postgres') == "E'A String\\rAnother'" def test_string_b(): assert sqlrepr('A String\bAnother', 'postgres') == "E'A String\\bAnother'" def test_string_000(): assert sqlrepr('A String\000Another', 'postgres') == \ "E'A String\\0Another'" def test_string_(): assert sqlrepr('A String\tAnother', 'postgres') == "E'A String\\tAnother'" assert sqlrepr('A String\'Another', 'firebird') == "'A String''Another'" def test_simple_unicode(): assert sqlrepr(u'A String', 'postgres') == "'A String'" def test_integer(): assert sqlrepr(10) == "10" def test_float(): assert sqlrepr(10.01) == "10.01" def test_none(): assert sqlrepr(None) == "NULL" def test_list(): assert sqlrepr(['one', 'two', 'three'], 'postgres') == \ "('one', 'two', 'three')" def test_tuple(): assert sqlrepr(('one', 'two', 'three'), 'postgres') == \ "('one', 'two', 'three')" def test_bool(): assert sqlrepr(True, 'postgres') == "'t'" assert sqlrepr(False, 'postgres') == "'f'" assert sqlrepr(True, 'mysql') == "1" assert sqlrepr(False, 'mysql') == "0" def test_datetime(): from datetime import datetime, date, time if SODateTimeCol.datetimeFormat.find('.%f') > 0: assert sqlrepr(datetime(2005, 7, 14, 13, 31, 2)) == \ "'2005-07-14 
13:31:02.000000'" else: assert sqlrepr(datetime(2005, 7, 14, 13, 31, 2)) == \ "'2005-07-14 13:31:02'" assert sqlrepr(date(2005, 7, 14)) == "'2005-07-14'" if SOTimeCol.timeFormat.find('.%f') > 0: assert sqlrepr(time(13, 31, 2)) == "'13:31:02.000000'" else: assert sqlrepr(time(13, 31, 2)) == "'13:31:02'" # now dates before 1900 if SODateTimeCol.datetimeFormat.find('.%f') > 0: assert sqlrepr(datetime(1428, 7, 14, 13, 31, 2)) == \ "'1428-07-14 13:31:02.000000'" else: assert sqlrepr(datetime(1428, 7, 14, 13, 31, 2)) == \ "'1428-07-14 13:31:02'" assert sqlrepr(date(1428, 7, 14)) == "'1428-07-14'" def test_instance(): instance = SOTestClass() assert sqlrepr(instance) == repr(instance) def test_newstyle(): instance = NewSOTestClass() assert sqlrepr(instance) == repr(instance) def test_sqlexpr(): instance = SQLExpression() assert sqlrepr(instance) == repr(instance) def test_sqlobjectfield(): instance = SQLObjectField('test', 'test', 'test', None, None) assert sqlrepr(instance) == repr(instance) def test_select(): instance = Select('test') assert sqlrepr(instance, 'mysql') == "SELECT test" def test_insert(): # Single column, no keyword arguments. instance = Insert('test', [('test',)]) assert sqlrepr(instance, 'mysql') == "INSERT INTO test VALUES ('test')" # Multiple columns, no keyword arguments. instance2 = Insert('test', [('1st', '2nd', '3th', '4th')]) assert sqlrepr(instance2, 'postgres') == \ "INSERT INTO test VALUES ('1st', '2nd', '3th', '4th')" # Multiple rows, multiple columns, "valueList" keyword argument. instance3 = Insert('test', valueList=[('a1', 'b1'), ('a2', 'b2'), ('a3', 'b3')]) assert sqlrepr(instance3, 'sqlite') == \ "INSERT INTO test VALUES ('a1', 'b1'), ('a2', 'b2'), ('a3', 'b3')" # Multiple columns, "values" keyword argument. instance4 = Insert('test', values=('v1', 'v2', 'v3')) assert sqlrepr(instance4, 'mysql') == \ "INSERT INTO test VALUES ('v1', 'v2', 'v3')" # Single column, "valueList" keyword argument. instance5 = Insert('test', valueList=[('v1',)]) assert sqlrepr(instance5, 'mysql') == "INSERT INTO test VALUES ('v1')" # Multiple rows, Multiple columns, template. instance6 = Insert('test', valueList=[('a1', 'b1'), ('a2', 'b2')], template=['col1', 'col2']) assert sqlrepr(instance6, 'mysql') == \ "INSERT INTO test (col1, col2) VALUES ('a1', 'b1'), ('a2', 'b2')" # Multiple columns, implicit template (dictionary value). instance7 = Insert('test', valueList=[{'col1': 'a1', 'col2': 'b1'}]) assert sqlrepr(instance7, 'mysql') == \ "INSERT INTO test (col1, col2) VALUES ('a1', 'b1')" # Multiple rows, Multiple columns, implicit template. 
instance8 = Insert('test', valueList=[{'col1': 'a1', 'col2': 'b1'}, {'col1': 'a2', 'col2': 'b2'}]) assert sqlrepr(instance8, 'mysql') == \ "INSERT INTO test (col1, col2) VALUES ('a1', 'b1'), ('a2', 'b2')" def test_update(): instance = Update('test', {'test': 'test'}) assert sqlrepr(instance, 'mysql') == "UPDATE test SET test='test'" def test_delete(): instance = Delete('test', None) assert sqlrepr(instance, 'mysql') == "DELETE FROM test" def test_replace(): instance = Replace('test', {'test': 'test'}) assert sqlrepr(instance, 'mysql') == "REPLACE test SET test='test'" def test_trueclause(): instance = SQLTrueClauseClass() assert sqlrepr(instance) == repr(instance) def test_op(): instance = SQLOp('and', 'this', 'that') assert sqlrepr(instance, 'mysql') == "(('this') AND ('that'))" def test_call(): instance = SQLCall('test', ('test',)) assert sqlrepr(instance, 'mysql') == "'test'('test')" def test_constant(): instance = SQLConstant('test') assert sqlrepr(instance) == repr(instance) def test_prefix(): instance = SQLPrefix('test', 'test') assert sqlrepr(instance, 'mysql') == "test 'test'" def test_dict(): assert sqlrepr({"key": "value"}, "sqlite") == "('key')" def test_sets(): try: set except NameError: pass else: assert sqlrepr(set([1])) == "(1)" def test_timedelta(): assert sqlrepr(timedelta(seconds=30 * 60)) == \ "INTERVAL '0 days 1800 seconds'" def test_quote_unquote_str(): assert quote_str('test%', 'postgres') == "'test%'" assert quote_str('test%', 'sqlite') == "'test%'" assert quote_str('test\%', 'postgres') == "E'test\\%'" assert quote_str('test\\%', 'sqlite') == "'test\%'" assert unquote_str("'test%'") == 'test%' assert unquote_str("'test\\%'") == 'test\\%' assert unquote_str("E'test\\%'") == 'test\\%' def test_like_quoted(): assert sqlrepr(_LikeQuoted('test'), 'postgres') == "'test'" assert sqlrepr(_LikeQuoted('test'), 'sqlite') == "'test'" assert sqlrepr(_LikeQuoted('test%'), 'postgres') == r"E'test\\%'" assert sqlrepr(_LikeQuoted('test%'), 'sqlite') == r"'test\%'" SQLObject-3.4.0/sqlobject/tests/test_new_joins.py0000644000175000017500000001120013036405001021372 0ustar phdphd00000000000000from __future__ import print_function from sqlobject import ForeignKey, ManyToMany, OneToMany, SQLObject, StringCol from sqlobject.tests.dbtest import setupClass ######################################## # Joins ######################################## class PersonJNew(SQLObject): name = StringCol(length=40, alternateID=True) addressJs = ManyToMany('AddressJNew') class AddressJNew(SQLObject): zip = StringCol(length=5, alternateID=True) personJs = ManyToMany('PersonJNew') class ImplicitJoiningSONew(SQLObject): foo = ManyToMany('Bar') class ExplicitJoiningSONew(SQLObject): foo = OneToMany('Bar') class TestJoin: def setup_method(self, meth): setupClass(PersonJNew) setupClass(AddressJNew) for n in ['bob', 'tim', 'jane', 'joe', 'fred', 'barb']: PersonJNew(name=n) for z in ['11111', '22222', '33333', '44444']: AddressJNew(zip=z) def test_join(self): b = PersonJNew.byName('bob') assert list(b.addressJs) == [] z = AddressJNew.byZip('11111') b.addressJs.add(z) self.assertZipsEqual(b.addressJs, ['11111']) print(str(z.personJs), repr(z.personJs)) self.assertNamesEqual(z.personJs, ['bob']) z2 = AddressJNew.byZip('22222') b.addressJs.add(z2) print(str(b.addressJs)) self.assertZipsEqual(b.addressJs, ['11111', '22222']) self.assertNamesEqual(z2.personJs, ['bob']) b.addressJs.remove(z) self.assertZipsEqual(b.addressJs, ['22222']) self.assertNamesEqual(z.personJs, []) def assertZipsEqual(self, zips, dest): assert 
[a.zip for a in zips] == dest def assertNamesEqual(self, people, dest): assert [p.name for p in people] == dest def test_joinAttributeWithUnderscores(self): # Make sure that the implicit setting of joinMethodName works assert hasattr(ImplicitJoiningSONew, 'foo') assert not hasattr(ImplicitJoiningSONew, 'bars') # And make sure explicit setting also works assert hasattr(ExplicitJoiningSONew, 'foo') assert not hasattr(ExplicitJoiningSONew, 'bars') class PersonJNew2(SQLObject): name = StringCol('name', length=40, alternateID=True) addressJ2s = OneToMany('AddressJNew2') class AddressJNew2(SQLObject): class sqlmeta: defaultOrder = ['-zip', 'plus4'] zip = StringCol(length=5) plus4 = StringCol(length=4, default=None) personJNew2 = ForeignKey('PersonJNew2') class TestJoin2: def setup_method(self, meth): setupClass([PersonJNew2, AddressJNew2]) p1 = PersonJNew2(name='bob') p2 = PersonJNew2(name='sally') for z in ['11111', '22222', '33333']: AddressJNew2(zip=z, personJNew2=p1) AddressJNew2(zip='00000', personJNew2=p2) def test_basic(self): bob = PersonJNew2.byName('bob') sally = PersonJNew2.byName('sally') print(bob.addressJ2s) print(bob) assert len(list(bob.addressJ2s)) == 3 assert len(list(sally.addressJ2s)) == 1 bob.addressJ2s[0].destroySelf() assert len(list(bob.addressJ2s)) == 2 z = bob.addressJ2s[0] z.zip = 'xxxxx' id = z.id del z z = AddressJNew2.get(id) assert z.zip == 'xxxxx' def test_defaultOrder(self): p1 = PersonJNew2.byName('bob') assert ([i.zip for i in p1.addressJ2s] == ['33333', '22222', '11111']) _personJ3_getters = [] _personJ3_setters = [] class PersonJNew3(SQLObject): name = StringCol('name', length=40, alternateID=True) addressJNew3s = OneToMany('AddressJNew3') class AddressJNew3(SQLObject): zip = StringCol(length=5) personJNew3 = ForeignKey('PersonJNew3') def _get_personJNew3(self): value = self._SO_get_personJNew3() _personJ3_getters.append((self, value)) return value def _set_personJNew3(self, value): self._SO_set_personJNew3(value) _personJ3_setters.append((self, value)) class TestJoin3: def setup_method(self, meth): setupClass([PersonJNew3, AddressJNew3]) p1 = PersonJNew3(name='bob') p2 = PersonJNew3(name='sally') for z in ['11111', '22222', '33333']: AddressJNew3(zip=z, personJNew3=p1) AddressJNew3(zip='00000', personJNew3=p2) def test_accessors(self): assert len(list(_personJ3_getters)) == 0 assert len(list(_personJ3_setters)) == 4 bob = PersonJNew3.byName('bob') for addressJ3 in bob.addressJNew3s: addressJ3.personJNew3 assert len(list(_personJ3_getters)) == 3 assert len(list(_personJ3_setters)) == 4 SQLObject-3.4.0/sqlobject/tests/test_setters.py0000644000175000017500000000123613036405001021100 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import setupClass class SOTestSetters(SQLObject): firstName = StringCol(length=50, dbName='fname_col', default=None) lastName = StringCol(length=50, dbName='lname_col', default=None) def _set_name(self, v): firstName, lastName = v.split() self.firstName = firstName self.lastName = lastName def _get_name(self): return "%s %s" % (self.firstName, self.lastName) def test_create(): setupClass(SOTestSetters) t = SOTestSetters(name='John Doe') assert t.firstName == 'John' assert t.lastName == 'Doe' assert t.name == 'John Doe' SQLObject-3.4.0/sqlobject/tests/test_lazy.py0000644000175000017500000001265012747416060020407 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import setupClass ######################################## # Lazy updates 
######################################## class Lazy(SQLObject): class sqlmeta: lazyUpdate = True name = StringCol() other = StringCol(default='nothing') third = StringCol(default='third') class TestLazyTest: def setup_method(self, meth): # All this stuff is so that we can track when the connection # does an actual update; we put in a new _SO_update method # that calls the original and sets an instance variable that # we can later check. setupClass(Lazy) self.conn = Lazy._connection self.conn.didUpdate = False self._oldUpdate = self.conn._SO_update newUpdate = ( lambda so, values, s=self, c=self.conn, o=self._oldUpdate: self._alternateUpdate(so, values, c, o)) self.conn._SO_update = newUpdate def teardown_method(self, meth): self.conn._SO_update = self._oldUpdate del self._oldUpdate def _alternateUpdate(self, so, values, conn, oldUpdate): conn.didUpdate = True return oldUpdate(so, values) def test_lazy(self): assert not self.conn.didUpdate obj = Lazy(name='tim') # We just did an insert, but not an update: assert not self.conn.didUpdate obj.set(name='joe') assert obj.sqlmeta.dirty assert obj.name == 'joe' assert not self.conn.didUpdate obj.syncUpdate() assert obj.name == 'joe' assert self.conn.didUpdate assert not obj.sqlmeta.dirty assert obj.name == 'joe' self.conn.didUpdate = False obj = Lazy(name='frank') obj.name = 'joe' assert not self.conn.didUpdate assert obj.sqlmeta.dirty assert obj.name == 'joe' obj.name = 'joe2' assert not self.conn.didUpdate assert obj.sqlmeta.dirty assert obj.name == 'joe2' obj.syncUpdate() assert obj.name == 'joe2' assert not obj.sqlmeta.dirty assert self.conn.didUpdate self.conn.didUpdate = False obj = Lazy(name='loaded') assert not obj.sqlmeta.dirty assert not self.conn.didUpdate assert obj.name == 'loaded' obj.name = 'unloaded' assert obj.sqlmeta.dirty assert obj.name == 'unloaded' assert not self.conn.didUpdate obj.sync() assert not obj.sqlmeta.dirty assert obj.name == 'unloaded' assert self.conn.didUpdate self.conn.didUpdate = False obj.name = 'whatever' assert obj.sqlmeta.dirty assert obj.name == 'whatever' assert not self.conn.didUpdate obj._SO_loadValue('name') assert obj.sqlmeta.dirty assert obj.name == 'whatever' assert not self.conn.didUpdate obj._SO_loadValue('other') assert obj.name == 'whatever' assert not self.conn.didUpdate obj.syncUpdate() assert self.conn.didUpdate self.conn.didUpdate = False # Now, check that get() doesn't screw # cached objects' validator state. obj_id = obj.id old_state = obj._SO_validatorState obj = Lazy.get(obj_id) assert not obj.sqlmeta.dirty assert not self.conn.didUpdate assert obj._SO_validatorState is old_state assert obj.name == 'whatever' obj.name = 'unloaded' assert obj.name == 'unloaded' assert obj.sqlmeta.dirty assert not self.conn.didUpdate # Fetch the object again with get() and # make sure sqlmeta.dirty is still set, as the # object should come from the cache. obj = Lazy.get(obj_id) assert obj.sqlmeta.dirty assert not self.conn.didUpdate assert obj.name == 'unloaded' obj.syncUpdate() assert self.conn.didUpdate assert not obj.sqlmeta.dirty self.conn.didUpdate = False # Then clear the cache, and try a get() # again, to make sure stuf like _SO_createdValues # is properly initialized. 
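        # (With sqlmeta.lazyUpdate enabled, attribute assignments only mark
        # the instance as dirty; nothing is written to the database until
        # syncUpdate() or sync() is called.  The assertions below rely on
        # exactly that behaviour.)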
self.conn.cache.clear() obj = Lazy.get(obj_id) assert not obj.sqlmeta.dirty assert not self.conn.didUpdate assert obj.name == 'unloaded' obj.name = 'spongebob' assert obj.name == 'spongebob' assert obj.sqlmeta.dirty assert not self.conn.didUpdate obj.syncUpdate() assert self.conn.didUpdate assert not obj.sqlmeta.dirty self.conn.didUpdate = False obj = Lazy(name='last') assert not obj.sqlmeta.dirty obj.syncUpdate() assert not self.conn.didUpdate assert not obj.sqlmeta.dirty # Check that setting multiple values # actually works. This was broken # and just worked because we were testing # only one value at a time, so 'name' # had the right value after the for loop *wink* # Also, check that passing a name that is not # a valid column doesn't break, but instead # just does a plain setattr. obj.set(name='first', other='who', third='yes') assert obj.name == 'first' assert obj.other == 'who' assert obj.third == 'yes' assert obj.sqlmeta.dirty assert not self.conn.didUpdate obj.syncUpdate() assert self.conn.didUpdate assert not obj.sqlmeta.dirty SQLObject-3.4.0/sqlobject/tests/test_select_through.py0000644000175000017500000000340212775273363022452 0ustar phdphd00000000000000import pytest from sqlobject import ForeignKey, SQLMultipleJoin, SQLObject, SQLRelatedJoin, \ StringCol from sqlobject.tests.dbtest import inserts, setupClass # Tests retrieving objects through a join/fk on a selectResults class SRThrough1(SQLObject): three = ForeignKey('SRThrough3') twos = SQLMultipleJoin('SRThrough2', joinColumn='oneID') class SRThrough2(SQLObject): one = ForeignKey('SRThrough1') threes = SQLRelatedJoin('SRThrough3', addRemoveName='Three') class SRThrough3(SQLObject): name = StringCol() ones = SQLMultipleJoin('SRThrough1', joinColumn='threeID') twos = SQLRelatedJoin('SRThrough2') def setup_module(mod): global ones, twos, threes setupClass([SRThrough3, SRThrough1, SRThrough2]) threes = inserts(SRThrough3, [('a',), ('b',), ('c',)], 'name') ones = inserts(SRThrough1, [(threes[0].id,), (threes[0].id,), (threes[2].id,)], 'threeID') twos = inserts(SRThrough2, [(ones[0].id,), (ones[1].id,), (ones[2].id,)], 'oneID') twos[0].addThree(threes[0]) twos[0].addThree(threes[1]) def testBadRef(): pytest.raises(AttributeError, 'threes[0].throughTo.four') def testThroughFK(): assert list(threes[0].ones.throughTo.three) == [threes[0]] def testThroughMultipleJoin(): assert list(threes[0].ones.throughTo.twos) == [twos[0], twos[1]] def testThroughRelatedJoin(): assert list(threes[0].twos.throughTo.threes) == [threes[0], threes[1]] assert list( SRThrough3.select(SRThrough3.q.id == threes[0].id).throughTo.twos) == \ list(threes[0].twos) def testThroughFKAndJoin(): assert list(threes[0].ones.throughTo.three.throughTo.twos) == [twos[0]] SQLObject-3.4.0/sqlobject/tests/test_NoneValuedResultItem.py0000644000175000017500000000164212770603634023506 0ustar phdphd00000000000000# Test that selectResults handle NULL values from, for example, outer joins. from sqlobject import ForeignKey, SQLObject, StringCol, sqlbuilder from sqlobject.tests.dbtest import setupClass class SOTestComposer(SQLObject): name = StringCol() class SOTestWork(SQLObject): class sqlmeta: idName = "work_id" composer = ForeignKey('SOTestComposer') title = StringCol() def test1(): setupClass([SOTestComposer, SOTestWork]) c = SOTestComposer(name='Mahler, Gustav') w = SOTestWork(composer=c, title='Symphony No. 
9') SOTestComposer(name='Bruckner, Anton') # but don't add any works for Bruckner # do a left join, a common use case that often involves NULL results s = SOTestWork.select( join=sqlbuilder.LEFTJOINOn( SOTestComposer, SOTestWork, SOTestComposer.q.id == SOTestWork.q.composerID)) assert tuple(s) == (w, None) SQLObject-3.4.0/sqlobject/tests/test_sorting.py0000644000175000017500000000506412747416060021116 0ustar phdphd00000000000000from sqlobject import DESC, SQLObject, StringCol, sqlmeta from sqlobject.tests.dbtest import inserts, setupClass class Names(SQLObject): class sqlmeta(sqlmeta): table = 'names_table' defaultOrder = ['lastName', 'firstName'] firstName = StringCol(length=30) lastName = StringCol(length=30) def setupNames(): setupClass(Names) inserts(Names, [('aj', 'baker'), ('joe', 'robbins'), ('tim', 'jackson'), ('joe', 'baker'), ('zoe', 'robbins')], schema='firstName lastName') def nameList(names): result = [] for name in names: result.append('%s %s' % (name.firstName, name.lastName)) return result def firstList(names): return [n.firstName for n in names] def test_defaultOrder(): setupNames() assert nameList(Names.select()) == \ ['aj baker', 'joe baker', 'tim jackson', 'joe robbins', 'zoe robbins'] def test_otherOrder(): setupNames() assert nameList(Names.select().orderBy(['firstName', 'lastName'])) == \ ['aj baker', 'joe baker', 'joe robbins', 'tim jackson', 'zoe robbins'] def test_untranslatedColumnOrder(): setupNames() assert nameList(Names.select().orderBy(['first_name', 'last_name'])) == \ ['aj baker', 'joe baker', 'joe robbins', 'tim jackson', 'zoe robbins'] def test_singleUntranslatedColumnOrder(): setupNames() assert firstList(Names.select().orderBy('firstName')) == \ ['aj', 'joe', 'joe', 'tim', 'zoe'] assert firstList(Names.select().orderBy('first_name')) == \ ['aj', 'joe', 'joe', 'tim', 'zoe'] assert firstList(Names.select().orderBy('-firstName')) == \ ['zoe', 'tim', 'joe', 'joe', 'aj'] assert firstList(Names.select().orderBy(u'-first_name')) == \ ['zoe', 'tim', 'joe', 'joe', 'aj'] assert firstList(Names.select().orderBy(Names.q.firstName)) == \ ['aj', 'joe', 'joe', 'tim', 'zoe'] assert firstList(Names.select().orderBy('firstName').reversed()) == \ ['zoe', 'tim', 'joe', 'joe', 'aj'] assert firstList(Names.select().orderBy('-firstName').reversed()) == \ ['aj', 'joe', 'joe', 'tim', 'zoe'] assert firstList(Names.select().orderBy(DESC(Names.q.firstName))) == \ ['zoe', 'tim', 'joe', 'joe', 'aj'] assert firstList(Names.select().orderBy(Names.q.firstName).reversed()) == \ ['zoe', 'tim', 'joe', 'joe', 'aj'] assert firstList( Names.select().orderBy(DESC(Names.q.firstName)).reversed()) == \ ['aj', 'joe', 'joe', 'tim', 'zoe'] SQLObject-3.4.0/sqlobject/tests/test_declarative.py0000644000175000017500000000363612747416060021717 0ustar phdphd00000000000000from sqlobject.declarative import Declarative class A1(Declarative): a = 1 b = [] class A2(A1): a = 5 A3 = A2(b=5) def test_a_classes(): assert A1.a == 1 assert A1.singleton().a == 1 assert A1.b is A2.b assert A3.b == 5 assert A1.declarative_count == A1.singleton().declarative_count assert A1.declarative_count < A2.declarative_count assert A2.singleton() is not A1.singleton() assert A3.singleton().b == A3.b class B1(Declarative): attrs = [] def __classinit__(cls, new_attrs): Declarative.__classinit__(cls, new_attrs) cls.attrs = cls.add_attrs(cls.attrs, new_attrs) def __instanceinit__(self, new_attrs): Declarative.__instanceinit__(self, new_attrs) self.attrs = self.add_attrs(self.attrs, new_attrs) @staticmethod def add_attrs(old_attrs, 
new_attrs): old_attrs = old_attrs[:] for name in new_attrs.keys(): if (name in old_attrs or name.startswith('_') or name in ('add_attrs', 'declarative_count', 'attrs')): continue old_attrs.append(name) old_attrs.sort() return old_attrs c = 1 class B2(B1): g = 3 def __classinit__(cls, new_attrs): new_attrs['test'] = 'whatever' B1.__classinit__(cls, new_attrs) B3 = B2(c=5, d=3) B4 = B3(d=5) B5 = B1(a=1) def test_b_classes(): assert B1.attrs == ['c'] assert B1.c == 1 assert B2.attrs == ['c', 'g', 'test'] assert B3.d == 3 assert B4.d == 5 assert B5.a == 1 assert B5.attrs == ['a', 'c'] assert B3.attrs == ['c', 'd', 'g', 'test'] assert B4.attrs == ['c', 'd', 'g', 'test'] order = [B1, B1.singleton(), B2, B2.singleton(), B3, B3.singleton(), B4, B4.singleton(), B5, B5.singleton()] last = 0 for obj in order: assert obj.declarative_count >= last last = obj.declarative_count SQLObject-3.4.0/sqlobject/tests/test_indexes.py0000644000175000017500000000602012775273363021071 0ustar phdphd00000000000000import pytest from sqlobject import DatabaseIndex, ForeignKey, IntCol, MultipleJoin, \ SQLObject, StringCol from sqlobject.dberrors import DatabaseError, IntegrityError, \ OperationalError, ProgrammingError from sqlobject.tests.dbtest import raises, setupClass, supports ######################################## # Indexes ######################################## class SOIndex1(SQLObject): name = StringCol(length=100) number = IntCol() nameIndex = DatabaseIndex('name', unique=True) nameIndex2 = DatabaseIndex(name, number) nameIndex3 = DatabaseIndex({'column': name, 'length': 3}) class SOIndex2(SQLObject): name = StringCol(length=100) nameIndex = DatabaseIndex({'expression': 'lower(name)'}) def test_indexes_1(): setupClass(SOIndex1) n = 0 for name in 'blah blech boring yep yort snort'.split(): n += 1 SOIndex1(name=name, number=n) mod = SOIndex1._connection.module raises( (mod.ProgrammingError, mod.IntegrityError, mod.OperationalError, mod.DatabaseError, ProgrammingError, IntegrityError, OperationalError, DatabaseError), SOIndex1, name='blah', number=0) def test_indexes_2(): if not supports('expressionIndex'): pytest.skip("expressionIndex isn't supported") setupClass(SOIndex2) SOIndex2(name='') class PersonIndexGet(SQLObject): firstName = StringCol(length=100) lastName = StringCol(length=100) age = IntCol(alternateID=True) nameIndex = DatabaseIndex(firstName, lastName, unique=True) def test_index_get_1(): setupClass(PersonIndexGet, force=True) PersonIndexGet(firstName='Eric', lastName='Idle', age=62) PersonIndexGet(firstName='Terry', lastName='Gilliam', age=65) PersonIndexGet(firstName='John', lastName='Cleese', age=66) PersonIndexGet.get(1) PersonIndexGet.nameIndex.get('Terry', 'Gilliam') PersonIndexGet.nameIndex.get(firstName='John', lastName='Cleese') raises(Exception, PersonIndexGet.nameIndex.get, firstName='Graham', lastName='Chapman') raises(Exception, PersonIndexGet.nameIndex.get, 'Terry', lastName='Gilliam') raises(Exception, PersonIndexGet.nameIndex.get, 'Terry', 'Gilliam', 65) raises(Exception, PersonIndexGet.nameIndex.get, 'Terry') class PersonIndexGet2(SQLObject): name = StringCol(alternateID=True, length=100) age = IntCol() addresses = MultipleJoin('AddressIndexGet2') class AddressIndexGet2(SQLObject): person = ForeignKey('PersonIndexGet2', notNone=True) type = StringCol(notNone=True, length=100) street = StringCol(notNone=True) pk = DatabaseIndex(person, type, unique=True) def test_index_get_2(): setupClass([PersonIndexGet2, AddressIndexGet2]) p = PersonIndexGet2(name='Terry Guilliam', age=64) 
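    # The unique index declared above as
    #     pk = DatabaseIndex(person, type, unique=True)
    # also acts as a lookup helper: once the address rows below exist, a
    # single row can be fetched through the index either positionally or by
    # keyword, for example (using the objects from this test):
    #     AddressIndexGet2.pk.get(p, 'home')
    #     AddressIndexGet2.pk.get(person=p, type='home')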
AddressIndexGet2(person=p, type='home', street='Terry Street 234') AddressIndexGet2(person=p, type='work', street='Guilliam Street 234') AddressIndexGet2.pk.get(p, 'work') AddressIndexGet2.pk.get(person=p, type='work') SQLObject-3.4.0/sqlobject/tests/test_jsonbcol.py0000644000175000017500000000267412775273363021256 0ustar phdphd00000000000000import pytest from sqlobject import SQLObject, JsonbCol from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # Jsonb columns ######################################## class JsonbContainer(SQLObject): jsondata = JsonbCol(default=None) jsondata_none = JsonbCol(default=None) jsondata_list = JsonbCol(default=None) dictdata = dict( ship=u'USS Voyager', number='NCC-74656', crew=['Cpt. Janeway', 'Doctor', 'Lt. Torries', 'Seven of Nine'], races=('Borg', 'Kason',), distance2earth=70000, coming_home=True) nonedata = None listofdictsdata = [dict(Erath=7390000000), dict(Vulcan=6000000000), dict(Kronos='unknown')] def test_jsonbCol(): connection = getConnection() if connection.dbName != "postgres": pytest.skip("These tests require PostgreSQL") setupClass([JsonbContainer], force=True) my_jsonb = JsonbContainer( jsondata=dictdata, jsondata_none=nonedata, jsondata_list=listofdictsdata) iid = my_jsonb.id JsonbContainer._connection.cache.clear() my_jsonb_2 = JsonbContainer.get(iid) assert my_jsonb_2.jsondata['coming_home'] == dictdata['coming_home'] assert my_jsonb_2.jsondata['crew'] == dictdata['crew'] assert my_jsonb_2.jsondata['races'] == ['Borg', 'Kason'] assert my_jsonb_2.jsondata_none is None assert my_jsonb_2.jsondata_list == listofdictsdata SQLObject-3.4.0/sqlobject/tests/dbtest.py0000644000175000017500000002610513134764127017657 0ustar phdphd00000000000000""" The framework for making database tests. """ from __future__ import print_function import logging import os import sys from pytest import raises, skip import sqlobject from sqlobject.col import use_microseconds import sqlobject.conftest as conftest if sys.platform[:3] == "win": def getcwd(): return os.getcwd().replace('\\', '/') else: getcwd = os.getcwd """ supportsMatrix defines what database backends support what features. Each feature has a name, if you see a key like '+featureName' then only the databases listed support the feature. Conversely, '-featureName' means all databases *except* the ones listed support the feature. The databases are given by their SQLObject string name, separated by spaces. The function supports(featureName) returns True or False based on this, and you can use it like:: def test_featureX(): if not supports('featureX'): pytest.skip("Doesn't support featureX") """ supportsMatrix = { '-blobData': 'mssql rdbhost', '-decimalColumn': 'mssql', '-dropTableCascade': 'sybase mssql mysql', '-emptyTable': 'mssql', '+exceptions': 'mysql postgres sqlite', '-expressionIndex': 'mysql sqlite firebird mssql', '-limitSelect': 'mssql', '+memorydb': 'sqlite', '+rlike': 'mysql postgres sqlite', '+schema': 'postgres', '-transactions': 'mysql rdbhost', } def setupClass(soClasses, force=False): """ Makes sure the classes have a corresponding and correct table. This won't recreate the table if it already exists. It will check that the table is properly defined (in case you change your table definition). You can provide a single class or a list of classes; if a list then classes will be created in the order you provide, and destroyed in the opposite order. 
So if class A depends on class B, then do setupClass([B, A]) and B won't be destroyed or cleared until after A is destroyed or cleared. If force is true, then the database will be recreated no matter what. """ global hub if not isinstance(soClasses, (list, tuple)): soClasses = [soClasses] connection = getConnection() for soClass in soClasses: # This would be an alternate way to register connections: # try: # hub # except NameError: # hub = sqlobject.dbconnection.ConnectionHub() # soClass._connection = hub # hub.threadConnection = connection # hub.processConnection = connection soClass._connection = connection installOrClear(soClasses, force=force) return soClasses def speedupSQLiteConnection(connection): connection.query("PRAGMA synchronous=OFF") connection.query("PRAGMA count_changes=OFF") connection.query("PRAGMA journal_mode=MEMORY") connection.query("PRAGMA temp_store=MEMORY") installedDBFilename = os.path.join(getcwd(), 'dbs_data.tmp') installedDBTracker = sqlobject.connectionForURI( 'sqlite:///' + installedDBFilename) speedupSQLiteConnection(installedDBTracker) def getConnection(**kw): name = getConnectionURI() conn = sqlobject.connectionForURI(name, **kw) if conftest.option.show_sql: conn.debug = True if conftest.option.show_sql_output: conn.debugOutput = True if (conn.dbName == 'sqlite') and not conn._memory: speedupSQLiteConnection(conn) return conn def getConnectionURI(): name = conftest.option.Database if name in conftest.connectionShortcuts: name = conftest.connectionShortcuts[name] return name try: connection = getConnection() except Exception as e: # At least this module should be importable... # The module was imported during documentation building if 'sphinx' not in sys.modules: print("Could not open database: %s" % e, file=sys.stderr) else: if (connection.dbName == 'firebird') \ or ( (connection.dbName == 'mysql') and ( (os.environ.get('APPVEYOR')) or (os.environ.get('TRAVIS')) ) ): use_microseconds(False) class InstalledTestDatabase(sqlobject.SQLObject): """ This table is set up in SQLite (always, regardless of --Database) and tracks what tables have been set up in the 'real' database. This way we don't keep recreating the tables over and over when there are multiple tests that use a table. """ _connection = installedDBTracker table_name = sqlobject.StringCol(notNull=True) createSQL = sqlobject.StringCol(notNull=True) connectionURI = sqlobject.StringCol(notNull=True) @classmethod def installOrClear(cls, soClasses, force=False): cls.setup() reversed = list(soClasses)[:] reversed.reverse() # If anything needs to be dropped, they all must be dropped # But if we're forcing it, then we'll always drop if force: any_drops = True else: any_drops = False for soClass in reversed: table = soClass.sqlmeta.table if not soClass._connection.tableExists(table): continue items = list(cls.selectBy( table_name=table, connectionURI=soClass._connection.uri())) if items: instance = items[0] sql = instance.createSQL else: sql = None newSQL, constraints = soClass.createTableSQL() if sql != newSQL: if sql is not None: instance.destroySelf() any_drops = True break for soClass in reversed: if soClass._connection.tableExists(soClass.sqlmeta.table): if any_drops: cls.drop(soClass) else: cls.clear(soClass) for soClass in soClasses: table = soClass.sqlmeta.table if not soClass._connection.tableExists(table): cls.install(soClass) @classmethod def install(cls, soClass): """ Creates the given table in its database. 
""" sql = getattr(soClass, soClass._connection.dbName + 'Create', None) all_extra = [] if sql: soClass._connection.query(sql) else: sql, extra_sql = soClass.createTableSQL() soClass.createTable(applyConstraints=False) all_extra.extend(extra_sql) cls(table_name=soClass.sqlmeta.table, createSQL=sql, connectionURI=soClass._connection.uri()) for extra_sql in all_extra: soClass._connection.query(extra_sql) @classmethod def drop(cls, soClass): """ Drops a the given table from its database """ sql = getattr(soClass, soClass._connection.dbName + 'Drop', None) if sql: soClass._connection.query(sql) else: soClass.dropTable() @classmethod def clear(cls, soClass): """ Removes all the rows from a table. """ soClass.clearTable() @classmethod def setup(cls): """ This sets up *this* table. """ if not cls._connection.tableExists(cls.sqlmeta.table): cls.createTable() installOrClear = InstalledTestDatabase.installOrClear class Dummy(object): """ Used for creating fake objects; a really poor 'mock object'. """ def __init__(self, **kw): for name, value in kw.items(): setattr(self, name, value) def inserts(cls, data, schema=None): """ Creates a bunch of rows. You can use it like:: inserts(Person, [{'fname': 'blah', 'lname': 'doe'}, ...]) Or:: inserts(Person, [('blah', 'doe')], schema= ['fname', 'lname']) If you give a single string for the `schema` then it'll split that string to get the list of column names. """ if schema: if isinstance(schema, str): schema = schema.split() keywordData = [] for item in data: itemDict = {} for name, value in zip(schema, item): itemDict[name] = value keywordData.append(itemDict) data = keywordData results = [] for args in data: results.append(cls(**args)) return results def supports(feature): dbName = connection.dbName support = supportsMatrix.get('+' + feature, None) notSupport = supportsMatrix.get('-' + feature, None) if support is not None and dbName in support.split(): return True elif support: return False if notSupport is not None and dbName in notSupport.split(): return False elif notSupport: return True assert notSupport is not None or support is not None, ( "The supportMatrix does not list this feature: %r" % feature) # To avoid name clashes: _inserts = inserts def setSQLiteConnectionFactory(TableClass, factory): from sqlobject.sqlite.sqliteconnection import SQLiteConnection conn = TableClass._connection TableClass._connection = SQLiteConnection( filename=conn.filename, name=conn.name, debug=conn.debug, debugOutput=conn.debugOutput, cache=conn.cache, style=conn.style, autoCommit=conn.autoCommit, debugThreading=conn.debugThreading, registry=conn.registry, factory=factory ) speedupSQLiteConnection(TableClass._connection) installOrClear([TableClass]) def deprecated_module(): sqlobject.main.warnings_level = None sqlobject.main.exception_level = None def setup_module(mod): # modules with '_old' test backward compatible methods, so they # don't get warnings or errors. 
mod_name = str(mod.__name__) if mod_name.endswith('/py'): mod_name = mod_name[:-3] if mod_name.endswith('_old'): sqlobject.main.warnings_level = None sqlobject.main.exception_level = None else: sqlobject.main.warnings_level = None sqlobject.main.exception_level = 0 def teardown_module(mod=None): sqlobject.main.warnings_level = None sqlobject.main.exception_level = 0 def setupLogging(): fmt = '[%(asctime)s] %(name)s %(levelname)s: %(message)s' formatter = logging.Formatter(fmt) hdlr = logging.StreamHandler(sys.stderr) hdlr.setFormatter(formatter) hdlr.setLevel(logging.NOTSET) logger = logging.getLogger() logger.addHandler(hdlr) def setupCyclicClasses(*classes): if not supports('dropTableCascade'): skip("dropTableCascade isn't supported") conn = getConnection() for soClass in classes: soClass.setConnection(conn) soClass.dropTable(ifExists=True, cascade=True) constraints = [] for soClass in classes: constraints += soClass.createTable(ifNotExists=True, applyConstraints=False) for constraint in constraints: conn.query(constraint) __all__ = ['Dummy', 'deprecated_module', 'getConnection', 'getConnectionURI', 'inserts', 'raises', 'setupClass', 'setupCyclicClasses', 'setupLogging', 'setup_module', 'supports', 'teardown_module', ] SQLObject-3.4.0/sqlobject/tests/test_style.py0000644000175000017500000000175012747416060020567 0ustar phdphd00000000000000from sqlobject import ForeignKey, SQLObject, StringCol, sqlmeta, styles from sqlobject.tests.dbtest import setupClass class AnotherStyle(styles.MixedCaseUnderscoreStyle): def pythonAttrToDBColumn(self, attr): if attr.lower().endswith('id'): return 'id' + styles.MixedCaseUnderscoreStyle.\ pythonAttrToDBColumn(self, attr[:-2]) else: return styles.MixedCaseUnderscoreStyle.\ pythonAttrToDBColumn(self, attr) class SOStyleTest1(SQLObject): a = StringCol() st2 = ForeignKey('SOStyleTest2') class sqlmeta(sqlmeta): style = AnotherStyle() class SOStyleTest2(SQLObject): b = StringCol() class sqlmeta(sqlmeta): style = AnotherStyle() def test_style(): setupClass([SOStyleTest2, SOStyleTest1]) st1 = SOStyleTest1(a='something', st2=None) st2 = SOStyleTest2(b='whatever') st1.st2 = st2 assert st1.sqlmeta.columns['st2ID'].dbName == 'idst2' assert st1.st2 == st2 SQLObject-3.4.0/sqlobject/tests/test_picklecol.py0000644000175000017500000000215613036106435021367 0ustar phdphd00000000000000import pytest from sqlobject import PickleCol, SQLObject from sqlobject.tests.dbtest import setupClass, supports ######################################## # Pickle columns ######################################## class PickleData: pi = 3.14156 def __init__(self): self.question = \ 'The Ulimate Question of Life, the Universe and Everything' self.answer = 42 class PickleContainer(SQLObject): pickledata = PickleCol(default=None, length=256) def test_pickleCol(): if not supports('blobData'): pytest.skip("blobData isn't supported") setupClass([PickleContainer], force=True) mypickledata = PickleData() ctnr = PickleContainer(pickledata=mypickledata) iid = ctnr.id PickleContainer._connection.cache.clear() ctnr2 = PickleContainer.get(iid) s2 = ctnr2.pickledata assert isinstance(s2, PickleData) assert isinstance(s2.pi, float) assert isinstance(s2.question, str) assert isinstance(s2.answer, int) assert s2.pi == mypickledata.pi assert s2.question == mypickledata.question assert s2.answer == mypickledata.answer SQLObject-3.4.0/sqlobject/tests/test_joins.py0000644000175000017500000001125612747416060020553 0ustar phdphd00000000000000from sqlobject import ForeignKey, MultipleJoin, RelatedJoin, SQLObject, \ 
StringCol from sqlobject.tests.dbtest import setupClass ######################################## # Joins ######################################## class PersonJoiner(SQLObject): name = StringCol(length=40, alternateID=True) addressJoiners = RelatedJoin('AddressJoiner') class AddressJoiner(SQLObject): zip = StringCol(length=5, alternateID=True) personJoiners = RelatedJoin('PersonJoiner') class ImplicitJoiningSO(SQLObject): foo = RelatedJoin('Bar') class ExplicitJoiningSO(SQLObject): foo = MultipleJoin('Bar', joinMethodName='foo') class TestJoin: def setup_method(self, meth): setupClass(PersonJoiner) setupClass(AddressJoiner) for n in ['bob', 'tim', 'jane', 'joe', 'fred', 'barb']: PersonJoiner(name=n) for z in ['11111', '22222', '33333', '44444']: AddressJoiner(zip=z) def test_join(self): b = PersonJoiner.byName('bob') assert b.addressJoiners == [] z = AddressJoiner.byZip('11111') b.addAddressJoiner(z) self.assertZipsEqual(b.addressJoiners, ['11111']) self.assertNamesEqual(z.personJoiners, ['bob']) z2 = AddressJoiner.byZip('22222') b.addAddressJoiner(z2) self.assertZipsEqual(b.addressJoiners, ['11111', '22222']) self.assertNamesEqual(z2.personJoiners, ['bob']) b.removeAddressJoiner(z) self.assertZipsEqual(b.addressJoiners, ['22222']) self.assertNamesEqual(z.personJoiners, []) def assertZipsEqual(self, zips, dest): assert [a.zip for a in zips] == dest def assertNamesEqual(self, people, dest): assert [p.name for p in people] == dest def test_joinAttributeWithUnderscores(self): # Make sure that the implicit setting of joinMethodName works assert hasattr(ImplicitJoiningSO, 'foo') assert not hasattr(ImplicitJoiningSO, 'bars') # And make sure explicit setting also works assert hasattr(ExplicitJoiningSO, 'foo') assert not hasattr(ExplicitJoiningSO, 'bars') class PersonJoiner2(SQLObject): name = StringCol('name', length=40, alternateID=True) addressJoiner2s = MultipleJoin('AddressJoiner2') class AddressJoiner2(SQLObject): class sqlmeta: defaultOrder = ['-zip', 'plus4'] zip = StringCol(length=5) plus4 = StringCol(length=4, default=None) personJoiner2 = ForeignKey('PersonJoiner2') class TestJoin2: def setup_method(self, meth): setupClass([PersonJoiner2, AddressJoiner2]) p1 = PersonJoiner2(name='bob') p2 = PersonJoiner2(name='sally') for z in ['11111', '22222', '33333']: AddressJoiner2(zip=z, personJoiner2=p1) AddressJoiner2(zip='00000', personJoiner2=p2) def test_basic(self): bob = PersonJoiner2.byName('bob') sally = PersonJoiner2.byName('sally') assert len(bob.addressJoiner2s) == 3 assert len(sally.addressJoiner2s) == 1 bob.addressJoiner2s[0].destroySelf() assert len(bob.addressJoiner2s) == 2 z = bob.addressJoiner2s[0] z.zip = 'xxxxx' id = z.id del z z = AddressJoiner2.get(id) assert z.zip == 'xxxxx' def test_defaultOrder(self): p1 = PersonJoiner2.byName('bob') assert ([i.zip for i in p1.addressJoiner2s] == ['33333', '22222', '11111']) _personJoiner3_getters = [] _personJoiner3_setters = [] class PersonJoiner3(SQLObject): name = StringCol('name', length=40, alternateID=True) addressJoiner3s = MultipleJoin('AddressJoiner3') class AddressJoiner3(SQLObject): zip = StringCol(length=5) personJoiner3 = ForeignKey('PersonJoiner3') def _get_personJoiner3(self): value = self._SO_get_personJoiner3() _personJoiner3_getters.append((self, value)) return value def _set_personJoiner3(self, value): self._SO_set_personJoiner3(value) _personJoiner3_setters.append((self, value)) class TestJoin3: def setup_method(self, meth): setupClass([PersonJoiner3, AddressJoiner3]) p1 = PersonJoiner3(name='bob') p2 = 
PersonJoiner3(name='sally') for z in ['11111', '22222', '33333']: AddressJoiner3(zip=z, personJoiner3=p1) AddressJoiner3(zip='00000', personJoiner3=p2) def test_accessors(self): assert len(_personJoiner3_getters) == 0 assert len(_personJoiner3_setters) == 4 bob = PersonJoiner3.byName('bob') for addressJoiner3 in bob.addressJoiner3s: addressJoiner3.personJoiner3 assert len(_personJoiner3_getters) == 3 assert len(_personJoiner3_setters) == 4 SQLObject-3.4.0/sqlobject/tests/test_mysql.py0000644000175000017500000000112613040447174020566 0ustar phdphd00000000000000import pytest from sqlobject import SQLObject from sqlobject.tests.dbtest import getConnection, setupClass try: connection = getConnection() except (AttributeError, NameError): # The module was imported during documentation building pass else: if connection.dbName != "mysql": pytestmark = pytest.mark.skip('') class SOTestSOListMySQL(SQLObject): pass def test_list_databases(): assert connection.db in connection.listDatabases() def test_list_tables(): setupClass(SOTestSOListMySQL) assert SOTestSOListMySQL.sqlmeta.table in connection.listTables() SQLObject-3.4.0/sqlobject/tests/test_subqueries.py0000644000175000017500000001071112770603634021614 0ustar phdphd00000000000000from sqlobject import ForeignKey, SQLObject, StringCol from sqlobject.sqlbuilder import EXISTS, IN, LEFTOUTERJOINOn, NOTEXISTS, \ Outer, Select from sqlobject.tests.dbtest import setupClass ######################################## # Subqueries (subselects) ######################################## class SOTestIn1(SQLObject): col1 = StringCol() class SOTestIn2(SQLObject): col2 = StringCol() class SOTestOuter(SQLObject): fk = ForeignKey('SOTestIn1') def setup(): setupClass(SOTestIn1) setupClass(SOTestIn2) def insert(): setup() SOTestIn1(col1=None) SOTestIn1(col1='') SOTestIn1(col1="test") SOTestIn2(col2=None) SOTestIn2(col2='') SOTestIn2(col2="test") def test_1syntax_in(): setup() select = SOTestIn1.select(IN(SOTestIn1.q.col1, Select(SOTestIn2.q.col2))) assert str(select) == \ "SELECT so_test_in1.id, so_test_in1.col1 " \ "FROM so_test_in1 WHERE so_test_in1.col1 IN " \ "(SELECT so_test_in2.col2 FROM so_test_in2)" select = SOTestIn1.select(IN(SOTestIn1.q.col1, SOTestIn2.select())) assert str(select) == \ "SELECT so_test_in1.id, so_test_in1.col1 " \ "FROM so_test_in1 WHERE so_test_in1.col1 IN " \ "(SELECT so_test_in2.id FROM so_test_in2 WHERE 1 = 1)" def test_2perform_in(): insert() select = SOTestIn1.select(IN(SOTestIn1.q.col1, Select(SOTestIn2.q.col2))) assert select.count() == 2 def test_3syntax_exists(): setup() select = SOTestIn1.select(NOTEXISTS( Select(SOTestIn2.q.col2, where=(Outer(SOTestIn1).q.col1 == SOTestIn2.q.col2)))) assert str(select) == \ "SELECT so_test_in1.id, so_test_in1.col1 " \ "FROM so_test_in1 WHERE NOT EXISTS " \ "(SELECT so_test_in2.col2 FROM so_test_in2 " \ "WHERE ((so_test_in1.col1) = (so_test_in2.col2)))" setupClass(SOTestOuter) select = SOTestOuter.select(NOTEXISTS( Select(SOTestIn1.q.col1, where=(Outer(SOTestOuter).q.fk == SOTestIn1.q.id)))) assert str(select) == \ "SELECT so_test_outer.id, so_test_outer.fk_id " \ "FROM so_test_outer WHERE NOT EXISTS " \ "(SELECT so_test_in1.col1 FROM so_test_in1 " \ "WHERE ((so_test_outer.fk_id) = (so_test_in1.id)))" def test_4perform_exists(): insert() select = SOTestIn1.select(EXISTS( Select(SOTestIn2.q.col2, where=(Outer(SOTestIn1).q.col1 == SOTestIn2.q.col2)))) assert len(list(select)) == 2 setupClass(SOTestOuter) select = SOTestOuter.select(NOTEXISTS( Select(SOTestIn1.q.col1, where=(Outer(SOTestOuter).q.fkID 
== SOTestIn1.q.id)))) assert len(list(select)) == 0 def test_4syntax_direct(): setup() select = SOTestIn1.select(SOTestIn1.q.col1 == Select(SOTestIn2.q.col2, where=(SOTestIn2.q.col2 == "test"))) assert str(select) == \ "SELECT so_test_in1.id, so_test_in1.col1 " \ "FROM so_test_in1 WHERE ((so_test_in1.col1) = " \ "(SELECT so_test_in2.col2 FROM so_test_in2 " \ "WHERE ((so_test_in2.col2) = ('test'))))" def test_4perform_direct(): insert() select = SOTestIn1.select(SOTestIn1.q.col1 == Select(SOTestIn2.q.col2, where=(SOTestIn2.q.col2 == "test"))) assert select.count() == 1 def test_5perform_direct(): insert() select = SOTestIn1.select(SOTestIn1.q.col1 == Select(SOTestIn2.q.col2, where=(SOTestIn2.q.col2 == "test"))) assert select.count() == 1 def test_6syntax_join(): insert() j = LEFTOUTERJOINOn(SOTestIn2, SOTestIn1, SOTestIn1.q.col1 == SOTestIn2.q.col2) select = SOTestIn1.select(SOTestIn1.q.col1 == Select(SOTestIn2.q.col2, where=(SOTestIn2.q.col2 == "test"), join=j)) assert str(select) == \ "SELECT so_test_in1.id, so_test_in1.col1 " \ "FROM so_test_in1 WHERE ((so_test_in1.col1) = " \ "(SELECT so_test_in2.col2 FROM so_test_in2 " \ "LEFT OUTER JOIN so_test_in1 ON " \ "((so_test_in1.col1) = (so_test_in2.col2)) " \ "WHERE ((so_test_in2.col2) = ('test'))))" def test_6perform_join(): insert() j = LEFTOUTERJOINOn(SOTestIn2, SOTestIn1, SOTestIn1.q.col1 == SOTestIn2.q.col2) select = SOTestIn1.select(SOTestIn1.q.col1 == Select(SOTestIn2.q.col2, where=(SOTestIn2.q.col2 == "test"), join=j)) assert select.count() == 1 SQLObject-3.4.0/sqlobject/tests/test_uuidcol.py0000644000175000017500000000113012747416060021063 0ustar phdphd00000000000000from uuid import UUID from sqlobject import SQLObject, UuidCol from sqlobject.tests.dbtest import setupClass ######################################## # Uuid columns ######################################## testuuid = UUID('7e3b5c1e-3402-4b10-a3c6-8ee6dbac7d1a') class UuidContainer(SQLObject): uuiddata = UuidCol(default=None) def test_uuidCol(): setupClass([UuidContainer], force=True) my_uuid = UuidContainer(uuiddata=testuuid) iid = my_uuid.id UuidContainer._connection.cache.clear() my_uuid_2 = UuidContainer.get(iid) assert my_uuid_2.uuiddata == testuuid SQLObject-3.4.0/sqlobject/tests/test_asdict.py0000644000175000017500000000072112770603634020674 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import setupClass ######################################## # sqlmeta.asDict() ######################################## class SOTestAsDict(SQLObject): name = StringCol(length=10) name2 = StringCol(length=10) def test_asDict(): setupClass(SOTestAsDict, force=True) t1 = SOTestAsDict(name='one', name2='1') assert t1.sqlmeta.asDict() == dict(name='one', name2='1', id=1) SQLObject-3.4.0/sqlobject/tests/test_sqlmeta_idName.py0000644000175000017500000000241312770603634022350 0ustar phdphd00000000000000from sqlobject import MixedCaseStyle, SQLObject, sqlmeta from sqlobject.tests.dbtest import setupClass class myid_sqlmeta(sqlmeta): idName = "my_id" class SOTestSqlmeta1(SQLObject): class sqlmeta(myid_sqlmeta): pass class SOTestSqlmeta2(SQLObject): class sqlmeta(sqlmeta): style = MixedCaseStyle(longID=True) class SOTestSqlmeta3(SQLObject): class sqlmeta(myid_sqlmeta): style = MixedCaseStyle(longID=True) class SOTestSqlmeta4(SQLObject): class sqlmeta(myid_sqlmeta): idName = None style = MixedCaseStyle(longID=True) class longid_sqlmeta(sqlmeta): idName = "my_id" style = MixedCaseStyle(longID=True) class SOTestSqlmeta5(SQLObject): class 
sqlmeta(longid_sqlmeta): pass class SOTestSqlmeta6(SQLObject): class sqlmeta(longid_sqlmeta): idName = None def test_sqlmeta_inherited_idName(): setupClass([SOTestSqlmeta1, SOTestSqlmeta2]) assert SOTestSqlmeta1.sqlmeta.idName == "my_id" assert SOTestSqlmeta2.sqlmeta.idName == "SOTestSqlmeta2ID" assert SOTestSqlmeta3.sqlmeta.idName == "my_id" assert SOTestSqlmeta4.sqlmeta.idName == "SOTestSqlmeta4ID" assert SOTestSqlmeta5.sqlmeta.idName == "my_id" assert SOTestSqlmeta6.sqlmeta.idName == "SOTestSqlmeta6ID" SQLObject-3.4.0/sqlobject/tests/test_joins_conditional.py0000644000175000017500000000773612770603634023147 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.sqlbuilder import JOIN, LEFTJOIN, LEFTJOINConditional, \ LEFTJOINOn, LEFTJOINUsing from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # Condiotional joins ######################################## class SOTestJoin1(SQLObject): col1 = StringCol() class SOTestJoin2(SQLObject): col2 = StringCol() class SOTestJoin3(SQLObject): col3 = StringCol() class SOTestJoin4(SQLObject): col4 = StringCol() class SOTestJoin5(SQLObject): col5 = StringCol() def setup(): setupClass(SOTestJoin1) setupClass(SOTestJoin2) def test_1syntax(): setup() join = JOIN("table1", "table2") assert str(join) == "table1 JOIN table2" join = LEFTJOIN("table1", "table2") assert str(join) == "table1 LEFT JOIN table2" join = LEFTJOINOn("table1", "table2", "tabl1.col1 = table2.col2") assert getConnection().sqlrepr(join) == \ "table1 LEFT JOIN table2 ON tabl1.col1 = table2.col2" def test_2select_syntax(): setup() select = SOTestJoin1.select( join=LEFTJOINConditional(SOTestJoin1, SOTestJoin2, on_condition=( SOTestJoin1.q.col1 == SOTestJoin2.q.col2)) ) assert str(select) == \ "SELECT so_test_join1.id, so_test_join1.col1 " \ "FROM so_test_join1 " \ "LEFT JOIN so_test_join2 " \ "ON ((so_test_join1.col1) = (so_test_join2.col2)) WHERE 1 = 1" def test_3perform_join(): setup() SOTestJoin1(col1="test1") SOTestJoin1(col1="test2") SOTestJoin1(col1="test3") SOTestJoin2(col2="test1") SOTestJoin2(col2="test2") select = SOTestJoin1.select( join=LEFTJOINOn(SOTestJoin1, SOTestJoin2, SOTestJoin1.q.col1 == SOTestJoin2.q.col2) ) assert select.count() == 3 def test_4join_3tables_syntax(): setup() setupClass(SOTestJoin3) select = SOTestJoin1.select( join=LEFTJOIN(SOTestJoin2, SOTestJoin3) ) assert str(select) == \ "SELECT so_test_join1.id, so_test_join1.col1 " \ "FROM so_test_join1, so_test_join2 LEFT JOIN so_test_join3 WHERE 1 = 1" def test_5join_3tables_syntax2(): setup() setupClass(SOTestJoin3) select = SOTestJoin1.select( join=(LEFTJOIN(None, SOTestJoin2), LEFTJOIN(None, SOTestJoin3)) ) assert str(select) == \ "SELECT so_test_join1.id, so_test_join1.col1 " \ "FROM so_test_join1 " \ "LEFT JOIN so_test_join2 LEFT JOIN so_test_join3 WHERE 1 = 1" select = SOTestJoin1.select( join=(LEFTJOIN(SOTestJoin1, SOTestJoin2), LEFTJOIN(SOTestJoin1, SOTestJoin3)) ) assert str(select) == \ "SELECT so_test_join1.id, so_test_join1.col1 " \ "FROM so_test_join1 " \ "LEFT JOIN so_test_join2, so_test_join1 " \ "LEFT JOIN so_test_join3 WHERE 1 = 1" def test_6join_using(): setup() setupClass(SOTestJoin3) select = SOTestJoin1.select( join=LEFTJOINUsing(None, SOTestJoin2, [SOTestJoin2.q.id]) ) assert str(select) == \ "SELECT so_test_join1.id, so_test_join1.col1 " \ "FROM so_test_join1 " \ "LEFT JOIN so_test_join2 USING (so_test_join2.id) WHERE 1 = 1" def test_7join_on(): setup() setupClass(SOTestJoin3) setupClass(SOTestJoin4) 
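    # This test combines two independent LEFTJOINOn pairs in a single
    # select(join=(...)) call; each pair renders as its own
    # "table LEFT JOIN table ON (...)" item in the FROM clause, as the
    # expected SQL in the assertion below shows.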
setupClass(SOTestJoin5) select = SOTestJoin1.select(join=( LEFTJOINOn(SOTestJoin2, SOTestJoin3, SOTestJoin2.q.col2 == SOTestJoin3.q.col3), LEFTJOINOn(SOTestJoin4, SOTestJoin5, SOTestJoin4.q.col4 == SOTestJoin5.q.col5) )) assert str(select) == \ "SELECT so_test_join1.id, so_test_join1.col1 " \ "FROM so_test_join1, so_test_join2 " \ "LEFT JOIN so_test_join3 " \ "ON ((so_test_join2.col2) = (so_test_join3.col3)), so_test_join4 " \ "LEFT JOIN so_test_join5 " \ "ON ((so_test_join4.col4) = (so_test_join5.col5)) WHERE 1 = 1" SQLObject-3.4.0/sqlobject/tests/test_delete.py0000644000175000017500000000345412770603634020675 0ustar phdphd00000000000000from sqlobject import OR, RelatedJoin, SQLObject, StringCol, cache from sqlobject.tests.dbtest import setupClass from .test_basic import SOTestSO1, setupGetters ######################################## # Delete during select ######################################## def testSelect(): setupGetters(SOTestSO1) for obj in SOTestSO1.select('all'): obj.destroySelf() assert list(SOTestSO1.select('all')) == [] ######################################## # Delete many rows at once ######################################## def testDeleteMany(): setupGetters(SOTestSO1) SOTestSO1.deleteMany(OR(SOTestSO1.q.name == "bob", SOTestSO1.q.name == "fred")) assert len(list(SOTestSO1.select('all'))) == 2 def testDeleteBy(): setupGetters(SOTestSO1) SOTestSO1.deleteBy(name="dave") assert len(list(SOTestSO1.select())) == 3 ######################################## # Delete without caching ######################################## class NoCache(SQLObject): name = StringCol() def testDestroySelf(): setupClass(NoCache) old = NoCache._connection.cache NoCache._connection.cache = cache.CacheSet(cache=False) value = NoCache(name='test') value.destroySelf() NoCache._connection.cache = old ######################################## # Delete from related joins ######################################## class Service(SQLObject): groups = RelatedJoin("ServiceGroup") class ServiceGroup(SQLObject): services = RelatedJoin("Service") def testDeleteRelatedJoins(): setupClass([Service, ServiceGroup]) service = Service() service_group = ServiceGroup() service.addServiceGroup(service_group) service.destroySelf() service_group = ServiceGroup.get(service_group.id) assert len(service_group.services) == 0 SQLObject-3.4.0/sqlobject/tests/test_postgres.py0000644000175000017500000000271213040447174021271 0ustar phdphd00000000000000import os import pytest from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # Test PosgreSQL sslmode ######################################## try: connection = getConnection() except (AttributeError, NameError): # The module was imported during documentation building pass else: if connection.dbName != "postgres": pytestmark = pytest.mark.skip('') class SOTestSSLMode(SQLObject): test = StringCol() def test_sslmode(): setupClass(SOTestSSLMode) connection = SOTestSSLMode._connection if (not connection.module.__name__.startswith('psycopg')) or \ (os.name == 'nt'): pytest.skip("The test requires PostgreSQL, psycopg and ssl mode; " "also it doesn't work on w32") connection = getConnection(sslmode='require') SOTestSSLMode._connection = connection test = SOTestSSLMode(test='test') # Connect to the DB to test sslmode connection.cache.clear() test = SOTestSSLMode.select()[0] assert test.test == 'test' ######################################## # Test PosgreSQL list{Database,Tables} 
######################################## class SOTestSOList(SQLObject): pass def test_list_databases(): assert connection.db in connection.listDatabases() def test_list_tables(): setupClass(SOTestSOList) assert SOTestSOList.sqlmeta.table in connection.listTables() SQLObject-3.4.0/sqlobject/tests/test_boundattributes.py0000644000175000017500000000306612775273363022657 0ustar phdphd00000000000000import pytest from sqlobject import boundattributes from sqlobject import declarative pytestmark = pytest.mark.skipif('True') class SOTestMe(object): pass class AttrReplace(boundattributes.BoundAttribute): __unpackargs__ = ('replace',) replace = None @declarative.classinstancemethod def make_object(self, cls, added_class, attr_name, **attrs): if not self: return cls.singleton().make_object( added_class, attr_name, **attrs) self.replace.added_class = added_class self.replace.name = attr_name assert attrs['replace'] is self.replace del attrs['replace'] self.replace.attrs = attrs return self.replace class Holder: def __init__(self, name): self.holder_name = name def __repr__(self): return '' % self.holder_name def test_1(): v1 = Holder('v1') v2 = Holder('v2') v3 = Holder('v3') class V2Class(AttrReplace): arg1 = 'nothing' arg2 = ['something'] class A1(SOTestMe): a = AttrReplace(v1) v = V2Class(v2) class inline(AttrReplace): replace = v3 arg3 = 'again' arg4 = 'so there' for n in ('a', 'v', 'inline'): assert getattr(A1, n).name == n assert getattr(A1, n).added_class is A1 assert A1.a is v1 assert A1.a.attrs == {} assert A1.v is v2 assert A1.v.attrs == {'arg1': 'nothing', 'arg2': ['something']} assert A1.inline is v3 assert A1.inline.attrs == {'arg3': 'again', 'arg4': 'so there'} SQLObject-3.4.0/sqlobject/tests/test_sqlite.py0000644000175000017500000001010113040447174020713 0ustar phdphd00000000000000import threading import pytest from sqlobject import SQLObject, StringCol from sqlobject.compat import string_type from sqlobject.tests.dbtest import getConnection, setupClass, supports from sqlobject.tests.dbtest import setSQLiteConnectionFactory from .test_basic import SOTestSO1 try: connection = getConnection() except (AttributeError, NameError): # The module was imported during documentation building pass else: if connection.dbName != "sqlite": pytestmark = pytest.mark.skip('') class SQLiteFactoryTest(SQLObject): name = StringCol() def test_sqlite_factory(): setupClass(SQLiteFactoryTest) if not SQLiteFactoryTest._connection.using_sqlite2: pytest.skip("These tests require SQLite v2+") factory = [None] def SQLiteConnectionFactory(sqlite): class MyConnection(sqlite.Connection): pass factory[0] = MyConnection return MyConnection setSQLiteConnectionFactory(SQLiteFactoryTest, SQLiteConnectionFactory) conn = SQLiteFactoryTest._connection.makeConnection() assert factory[0] assert isinstance(conn, factory[0]) def test_sqlite_factory_str(): setupClass(SQLiteFactoryTest) if not SQLiteFactoryTest._connection.using_sqlite2: pytest.skip("These tests require SQLite v2+") factory = [None] def SQLiteConnectionFactory(sqlite): class MyConnection(sqlite.Connection): pass factory[0] = MyConnection return MyConnection from sqlobject.sqlite import sqliteconnection sqliteconnection.SQLiteConnectionFactory = SQLiteConnectionFactory setSQLiteConnectionFactory(SQLiteFactoryTest, "SQLiteConnectionFactory") conn = SQLiteFactoryTest._connection.makeConnection() assert factory[0] assert isinstance(conn, factory[0]) del sqliteconnection.SQLiteConnectionFactory def test_sqlite_aggregate(): setupClass(SQLiteFactoryTest) if not 
SQLiteFactoryTest._connection.using_sqlite2: pytest.skip("These tests require SQLite v2+") def SQLiteConnectionFactory(sqlite): class MyConnection(sqlite.Connection): def __init__(self, *args, **kwargs): super(MyConnection, self).__init__(*args, **kwargs) self.create_aggregate("group_concat", 1, self.group_concat) class group_concat: def __init__(self): self.acc = [] def step(self, value): if isinstance(value, string_type): self.acc.append(value) else: self.acc.append(str(value)) def finalize(self): self.acc.sort() return ", ".join(self.acc) return MyConnection setSQLiteConnectionFactory(SQLiteFactoryTest, SQLiteConnectionFactory) SQLiteFactoryTest(name='sqlobject') SQLiteFactoryTest(name='sqlbuilder') assert SQLiteFactoryTest.select(orderBy="name").\ accumulateOne("group_concat", "name") == \ "sqlbuilder, sqlobject" def do_select(): list(SOTestSO1.select()) def test_sqlite_threaded(): setupClass(SOTestSO1) t = threading.Thread(target=do_select) t.start() t.join() # This should reuse the same connection as the connection # made above (at least will with most database drivers, but # this will cause an error in SQLite): do_select() def test_empty_string(): setupClass(SOTestSO1) test = SOTestSO1(name=None, passwd='') assert test.name is None assert test.passwd == '' def test_memorydb(): if not supports("memorydb"): pytest.skip("memorydb isn't supported") if not connection._memory: pytest.skip("The connection isn't memorydb") setupClass(SOTestSO1) connection.close() # create a new connection to an in-memory database SOTestSO1.setConnection(connection) SOTestSO1.createTable() def test_list_databases(): assert connection.listDatabases() == ['main'] def test_list_tables(): setupClass(SOTestSO1) assert SOTestSO1.sqlmeta.table in connection.listTables() SQLObject-3.4.0/sqlobject/tests/test_distinct.py0000644000175000017500000000151312747416060021245 0ustar phdphd00000000000000from sqlobject import ForeignKey, IntCol, SQLObject from sqlobject.tests.dbtest import setupClass ######################################## # Distinct ######################################## class Distinct1(SQLObject): n = IntCol() class Distinct2(SQLObject): other = ForeignKey('Distinct1') def count(select): result = {} for ob in select: result[int(ob.n)] = result.get(int(ob.n), 0) + 1 return result def test_distinct(): setupClass([Distinct1, Distinct2]) obs = [Distinct1(n=i) for i in range(3)] Distinct2(other=obs[0]) Distinct2(other=obs[0]) Distinct2(other=obs[1]) query = (Distinct2.q.otherID == Distinct1.q.id) sel = Distinct1.select(query) assert count(sel) == {0: 2, 1: 1} sel = Distinct1.select(query, distinct=True) assert count(sel) == {0: 1, 1: 1} SQLObject-3.4.0/sqlobject/tests/test_aliases.py0000644000175000017500000000305512747416060021050 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.sqlbuilder import Alias, LEFTJOINOn from sqlobject.tests.dbtest import setupClass ######################################## # Table aliases and self-joins ######################################## class JoinAlias(SQLObject): name = StringCol() parent = StringCol() def test_1syntax(): setupClass(JoinAlias) alias = Alias(JoinAlias) select = JoinAlias.select(JoinAlias.q.parent == alias.q.name) assert str(select) == \ "SELECT join_alias.id, join_alias.name, join_alias.parent " \ "FROM join_alias, join_alias join_alias_alias1 " \ "WHERE ((join_alias.parent) = (join_alias_alias1.name))" def test_2perform_join(): setupClass(JoinAlias) JoinAlias(name="grandparent", parent=None) JoinAlias(name="parent", 
parent="grandparent") JoinAlias(name="child", parent="parent") alias = Alias(JoinAlias) select = JoinAlias.select(JoinAlias.q.parent == alias.q.name) assert select.count() == 2 def test_3joins(): setupClass(JoinAlias) alias = Alias(JoinAlias) select = JoinAlias.select( (JoinAlias.q.name == 'a') & (alias.q.name == 'b'), join=LEFTJOINOn(None, alias, alias.q.name == 'c') ) assert str(select) == \ "SELECT join_alias.id, join_alias.name, join_alias.parent " \ "FROM join_alias " \ "LEFT JOIN join_alias join_alias_alias3 " \ "ON ((join_alias_alias3.name) = ('c')) " \ "WHERE (((join_alias.name) = ('a')) AND " \ "((join_alias_alias3.name) = ('b')))" SQLObject-3.4.0/sqlobject/tests/test_sqlbuilder_joins_instances.py0000644000175000017500000000572612747416060025055 0ustar phdphd00000000000000from sqlobject import AND, ForeignKey, SQLMultipleJoin, SQLObject, \ SQLRelatedJoin, StringCol from sqlobject.tests.dbtest import inserts, setupClass # Testing for expressing join, foreign keys, # and instance identity in SQLBuilder expressions. class SBPerson(SQLObject): name = StringCol() addresses = SQLMultipleJoin('SBAddress', joinColumn='personID') sharedAddresses = SQLRelatedJoin('SBAddress', addRemoveName='SharedAddress') class SBAddress(SQLObject): city = StringCol() person = ForeignKey('SBPerson') sharedPeople = SQLRelatedJoin('SBPerson') def setup_module(mod): global ppl setupClass([SBPerson, SBAddress]) ppl = inserts(SBPerson, [('James',), ('Julia',)], 'name') adds = inserts(SBAddress, [('London', ppl[0].id), ('Chicago', ppl[1].id), ('Abu Dhabi', ppl[1].id)], 'city personID') ppl[0].addSharedAddress(adds[0]) ppl[0].addSharedAddress(adds[1]) ppl[1].addSharedAddress(adds[0]) def testJoin(): assert list(SBPerson.select( AND(SBPerson.q.id == SBAddress.q.personID, SBAddress.q.city == 'London'))) == \ list(SBAddress.selectBy(city='London').throughTo.person) assert list(SBAddress.select( AND(SBPerson.q.id == SBAddress.q.personID, SBPerson.q.name == 'Julia')).orderBy(SBAddress.q.city)) == \ list(SBPerson.selectBy(name='Julia'). throughTo.addresses.orderBy(SBAddress.q.city)) def testRelatedJoin(): assert list(SBPerson.selectBy(name='Julia').throughTo.sharedAddresses) == \ list(ppl[1].sharedAddresses) def testInstance(): assert list(SBAddress.select( AND(SBPerson.q.id == SBAddress.q.personID, SBPerson.q.id == ppl[0].id) )) == list(ppl[0].addresses) def testFK(): assert list(SBPerson.select( AND(SBAddress.j.person, SBAddress.q.city == 'London'))) == \ list(SBPerson.select( AND(SBPerson.q.id == SBAddress.q.personID, SBAddress.q.city == 'London'))) def testRelatedJoin2(): assert list(SBAddress.select( AND(SBAddress.j.sharedPeople, SBPerson.q.name == 'Julia'))) == \ list(SBPerson.select( SBPerson.q.name == 'Julia').throughTo.sharedAddresses) def testJoin2(): assert list(SBAddress.select( AND(SBPerson.j.addresses, SBPerson.q.name == 'Julia')).orderBy(SBAddress.q.city)) == \ list(SBAddress.select( AND(SBPerson.q.id == SBAddress.q.personID, SBPerson.q.name == 'Julia')).orderBy(SBAddress.q.city)) == \ list(SBPerson.selectBy(name='Julia').throughTo. 
addresses.orderBy(SBAddress.q.city)) def testFK2(): assert list(SBAddress.select( AND(SBAddress.j.person, SBPerson.q.name == 'Julia'))) == \ list(SBAddress.select( AND(SBPerson.q.id == SBAddress.q.personID, SBPerson.q.name == 'Julia'))) SQLObject-3.4.0/sqlobject/tests/test_combining_joins.py0000644000175000017500000000250412747416060022574 0ustar phdphd00000000000000from sqlobject import ForeignKey, ManyToMany, OneToMany, SQLObject, StringCol from .dbtest import setupClass class ComplexGroup(SQLObject): name = StringCol() complexes = OneToMany('Complex') def _get_unit_models(self): q = self.complexes.clause & Complex.unit_models.clause return UnitModel.select(q) class Complex(SQLObject): name = StringCol() unit_models = ManyToMany('UnitModel') complex_group = ForeignKey('ComplexGroup') class UnitModel(SQLObject): class sqlmeta: defaultOrderBy = 'name' name = StringCol() complexes = ManyToMany('Complex') def test_join_sqlrepr(): setupClass([ComplexGroup, UnitModel, Complex]) cg1 = ComplexGroup(name='cg1') cg2 = ComplexGroup(name='cg2') c1 = Complex(name='c1', complex_group=cg1) c2 = Complex(name='c2', complex_group=cg2) c3 = Complex(name='c3', complex_group=cg2) u1 = UnitModel(name='u1') u2 = UnitModel(name='u2') u1.complexes.add(c1) u1.complexes.add(c2) u2.complexes.add(c2) u2.complexes.add(c3) assert list(Complex.selectBy(name='c1')) == [c1] assert list(cg1.unit_models) == [u1] assert list(cg2.unit_models) == [u1, u2, u2] assert list(cg2.unit_models.distinct()) == [u1, u2] assert list( cg2.unit_models.filter(UnitModel.q.name == 'u1')) == [u1] SQLObject-3.4.0/sqlobject/tests/test_paste.py0000644000175000017500000000453512775273363020557 0ustar phdphd00000000000000from __future__ import print_function import pytest from sqlobject import sqlhub, SQLObject, StringCol try: from sqlobject.wsgi_middleware import make_middleware except ImportError: pytestmark = pytest.mark.skipif('True') from .dbtest import getConnection, getConnectionURI, setupClass class NameOnly(SQLObject): name = StringCol() def makeapp(abort=False, begin=False, fail=False): def app(environ, start_response): NameOnly(name='app1') if fail == 'early': assert 0 start_response('200 OK', [('content-type', 'text/plain')]) if begin: environ['sqlobject.begin']() NameOnly(name='app2') if abort: environ['sqlobject.abort']() if fail: assert 0 return ['ok'] return app def makestack(abort=False, begin=False, fail=False, **kw): app = makeapp(abort=abort, begin=begin, fail=fail) app = make_middleware(app, {}, database=getConnectionURI(), **kw) return app def runapp(**kw): print('-' * 8) app = makestack(**kw) env = {} def start_response(*args): pass try: list(app(env, start_response)) return True except AssertionError: return False def setup(): setupClass(NameOnly) getConnection().query('DELETE FROM name_only') NameOnly._connection = sqlhub def names(): names = [n.name for n in NameOnly.select(connection=getConnection())] names.sort() return names def test_fail(): setup() assert not runapp(fail=True, use_transaction=True) assert names() == [] setup() assert not runapp(fail=True, use_transaction=False) assert names() == ['app1', 'app2'] setup() assert not runapp(fail=True, begin=True, use_transaction=True) assert names() == ['app1'] def test_other(): setup() assert runapp(fail=False, begin=True, use_transaction=True) assert names() == ['app1', 'app2'] setup() # @@: Dammit, I can't get these to pass because I can't get the # stupid table to clear itself. setupClass() sucks. 
When I # fix it I'll take this disabling out: pytest.skip("Oops...") assert names() == [] assert runapp(fail=False, begin=True, abort=True, use_transaction=True) assert names() == ['app1'] setup() assert runapp(use_transaction=True) assert names() == ['app1', 'app2'] SQLObject-3.4.0/sqlobject/tests/test_events.py0000644000175000017500000000650612747416060020737 0ustar phdphd00000000000000from __future__ import print_function from sqlobject import IntCol, SQLObject, StringCol, events from sqlobject.inheritance import InheritableSQLObject from sqlobject.tests.dbtest import setupClass class EventTester(SQLObject): name = StringCol() def make_watcher(): log = [] def watch(*args): log.append(args) watch.log = log return watch def make_listen(signal, cls=None): if cls is None: cls = EventTester watcher = make_watcher() events.listen(watcher, cls, signal) return watcher def test_create(): watcher = make_listen(events.ClassCreateSignal) class EventTesterSub1(EventTester): pass class EventTesterSub2(EventTesterSub1): pass assert len(watcher.log) == 2 assert len(watcher.log[0]) == 5 assert watcher.log[0][0] == 'EventTesterSub1' assert watcher.log[0][1] == (EventTester,) assert isinstance(watcher.log[0][2], dict) assert isinstance(watcher.log[0][3], list) def test_row_create(): setupClass(EventTester) watcher = make_listen(events.RowCreateSignal) row1 = EventTester(name='foo') row2 = EventTester(name='bar') assert len(watcher.log) == 2 assert watcher.log == [ (row1, {'name': 'foo'}, []), (row2, {'name': 'bar'}, [])] def test_row_destroy(): setupClass(EventTester) watcher = make_listen(events.RowDestroySignal) f = EventTester(name='foo') assert not watcher.log f.destroySelf() assert watcher.log == [(f, [])] def test_row_destroyed(): setupClass(EventTester) watcher = make_listen(events.RowDestroyedSignal) f = EventTester(name='foo') assert not watcher.log f.destroySelf() assert watcher.log == [(f, [])] def test_row_update(): setupClass(EventTester) watcher = make_listen(events.RowUpdateSignal) f = EventTester(name='bar') assert not watcher.log f.name = 'bar2' f.set(name='bar3') assert watcher.log == [ (f, {'name': 'bar2'}), (f, {'name': 'bar3'})] def test_row_updated(): setupClass(EventTester) watcher = make_listen(events.RowUpdatedSignal) f = EventTester(name='bar') assert not watcher.log f.name = 'bar2' f.set(name='bar3') assert watcher.log == [(f, []), (f, [])] def test_add_column(): setupClass(EventTester) watcher = make_listen(events.AddColumnSignal) events.summarize_events_by_sender() class NewEventTester(EventTester): name2 = StringCol() expect = ( NewEventTester, None, 'name2', NewEventTester.sqlmeta.columnDefinitions['name2'], False, []) print(zip(watcher.log[1], expect)) assert watcher.log[1] == expect class InheritableEventTestA(InheritableSQLObject): a = IntCol() class InheritableEventTestB(InheritableEventTestA): b = IntCol() class InheritableEventTestC(InheritableEventTestB): c = IntCol() def _query(instance): row = InheritableEventTestA.get(instance.id) assert isinstance(row, InheritableEventTestC) assert row.c == 3 def _signal(instance, kwargs, postfuncs): postfuncs.append(_query) def test_inheritance_row_created(): setupClass([InheritableEventTestA, InheritableEventTestB, InheritableEventTestC]) events.listen(_signal, InheritableEventTestA, events.RowCreatedSignal) InheritableEventTestC(a=1, b=2, c=3) SQLObject-3.4.0/sqlobject/tests/test_SQLMultipleJoin.py0000644000175000017500000000443712775273363022437 0ustar phdphd00000000000000import pytest from sqlobject import ForeignKey, IntCol, 
MultipleJoin, SQLMultipleJoin, \ SQLObject, StringCol from sqlobject.tests.dbtest import setupClass, supports class Race(SQLObject): name = StringCol() fightersAsList = MultipleJoin('RFighter', joinColumn="rf_id") fightersAsSResult = SQLMultipleJoin('RFighter', joinColumn="rf_id") class RFighter(SQLObject): name = StringCol() race = ForeignKey('Race', dbName="rf_id") power = IntCol() def createAllTables(): setupClass([Race, RFighter]) def test_1(): createAllTables() # create some races human = Race(name='human') saiyajin = Race(name='saiyajin') hibrid = Race(name='hibrid (human with sayajin)') namek = Race(name='namekuseijin') # create some fighters RFighter(name='Gokou (Kakaruto)', race=saiyajin, power=10) RFighter(name='Vegeta', race=saiyajin, power=9) RFighter(name='Krilim', race=human, power=3) RFighter(name='Yancha', race=human, power=2) RFighter(name='Jackie Chan', race=human, power=2) RFighter(name='Gohan', race=hibrid, power=8) RFighter(name='Goten', race=hibrid, power=7) trunks = RFighter(name='Trunks', race=hibrid, power=8) picollo = RFighter(name='Picollo', race=namek, power=6) RFighter(name='Neil', race=namek, power=5) # testing the SQLMultipleJoin stuff for i, j in zip(human.fightersAsList, human.fightersAsSResult): assert i is j # the 2 ways should give the same result assert namek.fightersAsSResult.count() == len(namek.fightersAsList) assert saiyajin.fightersAsSResult.max('power') == 10 assert trunks in hibrid.fightersAsSResult assert picollo not in hibrid.fightersAsSResult assert str(hibrid.fightersAsSResult.sum('power')) == '23' def test_multiple_join_transaction(): if not supports('transactions'): pytest.skip("Transactions aren't supported") createAllTables() trans = Race._connection.transaction() try: namek = Race(name='namekuseijin', connection=trans) RFighter(name='Gokou (Kakaruto)', race=namek, power=10, connection=trans) assert namek.fightersAsSResult.count() == 1 assert namek.fightersAsSResult[0]._connection == trans finally: trans.commit(True) Race._connection.autoCommit = True SQLObject-3.4.0/sqlobject/tests/test_SingleJoin.py0000644000175000017500000000202112747416060021460 0ustar phdphd00000000000000from sqlobject import ForeignKey, SQLObject, SingleJoin, StringCol from sqlobject.tests.dbtest import setupClass class PersonWithAlbum(SQLObject): name = StringCol() # albumNone returns the album or none albumNone = SingleJoin('PhotoAlbum', joinColumn='test_person_id') # albumInstance returns the album or an default album instance albumInstance = SingleJoin('PhotoAlbum', makeDefault=True, joinColumn='test_person_id') class PhotoAlbum(SQLObject): color = StringCol(default='red') person = ForeignKey('PersonWithAlbum', dbName='test_person_id') def test_1(): setupClass([PersonWithAlbum, PhotoAlbum]) person = PersonWithAlbum(name='Gokou (Kakarouto)') # We didn't create an album, this way it returns None assert not person.albumNone assert isinstance(person.albumInstance, PhotoAlbum) PhotoAlbum(person=person) assert person.albumNone assert isinstance(person.albumNone, PhotoAlbum) assert isinstance(person.albumInstance, PhotoAlbum) SQLObject-3.4.0/sqlobject/tests/test_select.py0000644000175000017500000001417113036106435020701 0ustar phdphd00000000000000import pytest from sqlobject import IntCol, LIKE, RLIKE, SQLObject, \ SQLObjectIntegrityError, SQLObjectNotFound, StringCol from sqlobject.sqlbuilder import func from .dbtest import raises, setupClass, supports, setSQLiteConnectionFactory class IterTest(SQLObject): name = StringCol(dbName='name_col', length=200) names = ('a', 
'b', 'c') def setupIter(): setupClass(IterTest) for n in names: IterTest(name=n) def test_00_normal(): setupIter() count = 0 for test in IterTest.select(): count += 1 assert count == len(names) def test_00b_lazy(): setupIter() count = 0 for test in IterTest.select(lazyColumns=True): count += 1 assert count == len(names) def test_01_turn_to_list(): count = 0 for test in list(IterTest.select()): count += 1 assert count == len(names) def test_02_generator(): all = IterTest.select() count = 0 for i, test in enumerate(all): count += 1 assert count == len(names) def test_03_ranged_indexed(): all = IterTest.select() count = 0 for i in range(all.count()): all[i] # test it's there count += 1 assert count == len(names) def test_04_indexed_ended_by_exception(): if not supports('limitSelect'): pytest.skip("limitSelect isn't supported") all = IterTest.select() count = 0 try: while 1: all[count] count = count + 1 # Stop the test if it's gone on too long if count > len(names): break except IndexError: pass assert count == len(names) def test_05_select_limit(): setupIter() assert len(list(IterTest.select(limit=2))) == 2 raises(AssertionError, IterTest.select(limit=2).count) def test_06_contains(): setupIter() assert len(list(IterTest.select(IterTest.q.name.startswith('a')))) == 1 assert len(list(IterTest.select(IterTest.q.name.contains('a')))) == 1 assert len(list(IterTest.select( IterTest.q.name.contains(func.lower('A'))))) == 1 assert len(list(IterTest.select(IterTest.q.name.contains("a'b")))) == 0 assert len(list(IterTest.select(IterTest.q.name.endswith('a')))) == 1 def test_07_contains_special(): setupClass(IterTest) a = IterTest(name='\\test') b = IterTest(name='100%') c = IterTest(name='test_') assert list(IterTest.select(IterTest.q.name.startswith('\\'))) == [a] assert list(IterTest.select(IterTest.q.name.contains('%'))) == [b] assert list(IterTest.select(IterTest.q.name.endswith('_'))) == [c] def test_select_getOne(): setupClass(IterTest) a = IterTest(name='a') b = IterTest(name='b') assert IterTest.selectBy(name='a').getOne() == a assert IterTest.select(IterTest.q.name == 'b').getOne() == b assert IterTest.selectBy(name='c').getOne(None) is None raises(SQLObjectNotFound, 'IterTest.selectBy(name="c").getOne()') IterTest(name='b') raises(SQLObjectIntegrityError, 'IterTest.selectBy(name="b").getOne()') raises(SQLObjectIntegrityError, 'IterTest.selectBy(name="b").getOne(None)') def test_selectBy(): setupClass(IterTest) IterTest(name='a') IterTest(name='b') assert IterTest.selectBy().count() == 2 def test_selectBy_kwargs(): setupClass(IterTest) raises(TypeError, IterTest, nonexistant='b') class UniqTest(SQLObject): name = StringCol(dbName='name_col', unique=True, length=100) def test_by_uniq(): setupClass(UniqTest) a = UniqTest(name='a') b = UniqTest(name='b') assert UniqTest.byName('a') is a assert UniqTest.byName('b') is b class Counter2(SQLObject): n1 = IntCol(notNull=True) n2 = IntCol(notNull=True) class TestSelect: def setup_method(self, meth): setupClass(Counter2) for i in range(10): for j in range(10): Counter2(n1=i, n2=j) def counterEqual(self, counters, value): assert [(c.n1, c.n2) for c in counters] == value def accumulateEqual(self, func, counters, value): assert func([c.n1 for c in counters]) == value def test_1(self): self.accumulateEqual(sum, Counter2.select(orderBy='n1'), sum(range(10)) * 10) def test_2(self): self.accumulateEqual(len, Counter2.select('all'), 100) def test_select_LIKE(): setupClass(IterTest) IterTest(name='sqlobject') IterTest(name='sqlbuilder') assert 
IterTest.select(LIKE(IterTest.q.name, "sql%")).count() == 2 assert IterTest.select(LIKE(IterTest.q.name, "sqlb%")).count() == 1 assert IterTest.select(LIKE(IterTest.q.name, "sqlb%")).count() == 1 assert IterTest.select(LIKE(IterTest.q.name, "sqlx%")).count() == 0 def test_select_RLIKE(): if not supports('rlike'): pytest.skip("rlike isn't supported") setupClass(IterTest) if IterTest._connection.dbName == "sqlite": if not IterTest._connection.using_sqlite2: pytest.skip("These tests require SQLite v2+") # Implement regexp() function for SQLite; only works with PySQLite2 import re def regexp(regexp, test): return bool(re.search(regexp, test)) def SQLiteConnectionFactory(sqlite): class MyConnection(sqlite.Connection): def __init__(self, *args, **kwargs): super(MyConnection, self).__init__(*args, **kwargs) self.create_function("regexp", 2, regexp) return MyConnection setSQLiteConnectionFactory(IterTest, SQLiteConnectionFactory) IterTest(name='sqlobject') IterTest(name='sqlbuilder') assert IterTest.select(RLIKE(IterTest.q.name, "^sql.*$")).count() == 2 assert IterTest.select(RLIKE(IterTest.q.name, "^sqlb.*$")).count() == 1 assert IterTest.select(RLIKE(IterTest.q.name, "^sqlb.*$")).count() == 1 assert IterTest.select(RLIKE(IterTest.q.name, "^sqlx.*$")).count() == 0 def test_select_sqlbuilder(): setupClass(IterTest) IterTest(name='sqlobject') IterTest.select(IterTest.q.name == u'sqlobject') def test_select_perConnection(): setupClass(IterTest) IterTest(name='a') assert not IterTest.select().getOne().sqlmeta._perConnection SQLObject-3.4.0/sqlobject/tests/test_constraints.py0000644000175000017500000000237412747416060022001 0ustar phdphd00000000000000from sqlobject.compat import PY2 from sqlobject.constraints import BadValue, InList, MaxLength, \ isFloat, isInt, isString, notNull from sqlobject.tests.dbtest import Dummy, raises if not PY2: # alias for python 3 compatability long = int def test_constraints(): obj = 'Test object' col = Dummy(name='col') isString(obj, col, 'blah') raises(BadValue, isString, obj, col, 1) if PY2: # @@: Should this really be an error? 
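        # (Under Python 2 isString rejects unicode values; the else branch
        # below shows the mirror case, where bytes are rejected on Python 3.)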
raises(BadValue, isString, obj, col, u'test!') else: raises(BadValue, isString, obj, col, b'test!') # isString(obj, col, u'test!') raises(BadValue, notNull, obj, col, None) raises(BadValue, isInt, obj, col, 1.1) isInt(obj, col, 1) isInt(obj, col, long(1)) isFloat(obj, col, 1) isFloat(obj, col, long(1)) isFloat(obj, col, 1.2) raises(BadValue, isFloat, obj, col, '1.0') # @@: Should test isBool, but I don't think isBool is right lst = InList(('a', 'b', 'c')) lst(obj, col, 'a') raises(BadValue, lst, obj, col, ('a', 'b', 'c')) raises(BadValue, lst, obj, col, 'A') maxlen = MaxLength(2) raises(BadValue, maxlen, obj, col, '123') maxlen(obj, col, '12') maxlen(obj, col, (1,)) raises(BadValue, maxlen, obj, col, 1) SQLObject-3.4.0/sqlobject/tests/test_groupBy.py0000644000175000017500000000254213036106435021050 0ustar phdphd00000000000000from sqlobject import IntCol, SQLObject, StringCol from sqlobject.sqlbuilder import Select, func from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # groupBy ######################################## class GroupbyTest(SQLObject): name = StringCol() so_value = IntCol() def test_groupBy(): setupClass(GroupbyTest) GroupbyTest(name='a', so_value=1) GroupbyTest(name='a', so_value=2) GroupbyTest(name='b', so_value=1) connection = getConnection() select = Select( [GroupbyTest.q.name, func.COUNT(GroupbyTest.q.so_value)], groupBy=GroupbyTest.q.name, orderBy=GroupbyTest.q.name) sql = connection.sqlrepr(select) rows = list(connection.queryAll(sql)) assert [tuple(t) for t in rows] == [('a', 2), ('b', 1)] def test_groupBy_list(): setupClass(GroupbyTest) GroupbyTest(name='a', so_value=1) GroupbyTest(name='a', so_value=2) GroupbyTest(name='b', so_value=1) connection = getConnection() select = Select( [GroupbyTest.q.name, GroupbyTest.q.so_value], groupBy=[GroupbyTest.q.name, GroupbyTest.q.so_value], orderBy=[GroupbyTest.q.name, GroupbyTest.q.so_value]) sql = connection.sqlrepr(select) rows = list(connection.queryAll(sql)) assert [tuple(t) for t in rows] == [('a', 1), ('a', 2), ('b', 1)] SQLObject-3.4.0/sqlobject/tests/test_datetime.py0000644000175000017500000001013413036405001021200 0ustar phdphd00000000000000from datetime import datetime, date, time import pytest from sqlobject import SQLObject from sqlobject import col from sqlobject.col import DATETIME_IMPLEMENTATION, DateCol, DateTimeCol, \ MXDATETIME_IMPLEMENTATION, TimeCol, mxdatetime_available, use_microseconds from sqlobject.tests.dbtest import getConnection, setupClass ######################################## # Date/time columns ######################################## col.default_datetime_implementation = DATETIME_IMPLEMENTATION class DateTime1(SQLObject): col1 = DateTimeCol() col2 = DateCol() col3 = TimeCol() def test_dateTime(): setupClass(DateTime1) _now = datetime.now() dt1 = DateTime1(col1=_now, col2=_now, col3=_now.time()) assert isinstance(dt1.col1, datetime) assert dt1.col1.year == _now.year assert dt1.col1.month == _now.month assert dt1.col1.day == _now.day assert dt1.col1.hour == _now.hour assert dt1.col1.minute == _now.minute assert dt1.col1.second == _now.second assert isinstance(dt1.col2, date) assert not isinstance(dt1.col2, datetime) assert dt1.col2.year == _now.year assert dt1.col2.month == _now.month assert dt1.col2.day == _now.day assert isinstance(dt1.col3, time) assert dt1.col3.hour == _now.hour assert dt1.col3.minute == _now.minute assert dt1.col3.second == _now.second def test_microseconds(): connection = getConnection() if not hasattr(connection, 
'can_use_microseconds') or \ not connection.can_use_microseconds(): pytest.skip( "The database doesn't support microseconds; " "microseconds are supported by MariaDB since version 5.3.0, " "by MySQL since version 5.6.4, " "by MSSQL since MS SQL Server 2008.") setupClass(DateTime1) _now = datetime.now() dt1 = DateTime1(col1=_now, col2=_now, col3=_now.time()) assert dt1.col1.microsecond == _now.microsecond assert dt1.col3.microsecond == _now.microsecond use_microseconds(False) setupClass(DateTime1) _now = datetime.now() dt1 = DateTime1(col1=_now, col2=_now, col3=_now.time()) assert dt1.col1.microsecond == 0 assert dt1.col3.microsecond == 0 use_microseconds(True) setupClass(DateTime1) _now = datetime.now() dt1 = DateTime1(col1=_now, col2=_now, col3=_now.time()) assert dt1.col1.microsecond == _now.microsecond assert dt1.col3.microsecond == _now.microsecond if mxdatetime_available: col.default_datetime_implementation = MXDATETIME_IMPLEMENTATION from mx.DateTime import now, Time dateFormat = None # use default try: connection = getConnection() except AttributeError: # The module was imported during documentation building pass else: if connection.dbName == "sqlite": if connection.using_sqlite2: # mxDateTime sends and PySQLite2 returns # full date/time for dates dateFormat = "%Y-%m-%d %H:%M:%S.%f" class DateTime2(SQLObject): col1 = DateTimeCol() col2 = DateCol(dateFormat=dateFormat) col3 = TimeCol() def test_mxDateTime(): setupClass(DateTime2) _now = now() dt2 = DateTime2(col1=_now, col2=_now.pydate(), col3=Time(_now.hour, _now.minute, _now.second)) assert isinstance(dt2.col1, col.DateTimeType) assert dt2.col1.year == _now.year assert dt2.col1.month == _now.month assert dt2.col1.day == _now.day assert dt2.col1.hour == _now.hour assert dt2.col1.minute == _now.minute assert dt2.col1.second == int(_now.second) assert isinstance(dt2.col2, col.DateTimeType) assert dt2.col2.year == _now.year assert dt2.col2.month == _now.month assert dt2.col2.day == _now.day assert dt2.col2.hour == 0 assert dt2.col2.minute == 0 assert dt2.col2.second == 0 assert isinstance(dt2.col3, (col.DateTimeType, col.TimeType)) assert dt2.col3.hour == _now.hour assert dt2.col3.minute == _now.minute assert dt2.col3.second == int(_now.second) SQLObject-3.4.0/sqlobject/tests/test_blob.py0000644000175000017500000000133413036106435020335 0ustar phdphd00000000000000import pytest from sqlobject import BLOBCol, SQLObject from sqlobject.compat import PY2 from sqlobject.tests.dbtest import setupClass, supports ######################################## # BLOB columns ######################################## class ImageData(SQLObject): image = BLOBCol(default=b'emptydata', length=256) def test_BLOBCol(): if not supports('blobData'): pytest.skip("blobData isn't supported") setupClass(ImageData) if PY2: data = ''.join([chr(x) for x in range(256)]) else: data = bytes(range(256)) prof = ImageData() prof.image = data iid = prof.id ImageData._connection.cache.clear() prof2 = ImageData.get(iid) assert prof2.image == data SQLObject-3.4.0/sqlobject/tests/test_expire.py0000644000175000017500000000136212747416060020722 0ustar phdphd00000000000000from sqlobject import SQLObject, StringCol from sqlobject.tests.dbtest import setupClass ######################################## # Expiring, syncing ######################################## class SyncTest(SQLObject): name = StringCol(length=50, alternateID=True, dbName='name_col') def test_expire(): setupClass(SyncTest) SyncTest(name='bob') SyncTest(name='tim') conn = SyncTest._connection b = 
SyncTest.byName('bob') conn.query("UPDATE sync_test SET name_col = 'robert' WHERE id = %i" % b.id) assert b.name == 'bob' b.expire() assert b.name == 'robert' conn.query("UPDATE sync_test SET name_col = 'bobby' WHERE id = %i" % b.id) b.sync() assert b.name == 'bobby' SQLObject-3.4.0/sqlobject/tests/test_decimal.py0000644000175000017500000000444213036460624021023 0ustar phdphd00000000000000from decimal import Decimal import pytest from sqlobject import DecimalCol, DecimalStringCol, SQLObject, UnicodeCol from sqlobject.tests.dbtest import setupClass, supports ######################################## # Decimal columns ######################################## try: support_decimal_column = supports('decimalColumn') except NameError: # The module was imported during documentation building pass else: if not support_decimal_column: pytestmark = pytest.mark.skip('') class DecimalTable(SQLObject): name = UnicodeCol(length=255) col1 = DecimalCol(size=6, precision=4) col2 = DecimalStringCol(size=6, precision=4) col3 = DecimalStringCol(size=6, precision=4, quantize=True) def test_1_decimal(): setupClass(DecimalTable) d = DecimalTable(name='test', col1=21.12, col2='10.01', col3='10.01') # psycopg2 returns float as Decimal if isinstance(d.col1, Decimal): assert d.col1 == Decimal("21.12") else: assert d.col1 == 21.12 assert d.col2 == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col2'].to_python( '10.01', d._SO_validatorState) == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col2'].from_python( '10.01', d._SO_validatorState) == "10.01" assert d.col3 == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col3'].to_python( '10.01', d._SO_validatorState) == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col3'].from_python( '10.01', d._SO_validatorState) == "10.0100" def test_2_decimal(): setupClass(DecimalTable) d = DecimalTable(name='test', col1=Decimal("21.12"), col2=Decimal('10.01'), col3=Decimal('10.01')) assert d.col1 == Decimal("21.12") assert d.col2 == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col2'].to_python( Decimal('10.01'), d._SO_validatorState) == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col2'].from_python( Decimal('10.01'), d._SO_validatorState) == "10.01" assert d.col3 == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col3'].to_python( Decimal('10.01'), d._SO_validatorState) == Decimal("10.01") assert DecimalTable.sqlmeta.columns['col3'].from_python( Decimal('10.01'), d._SO_validatorState) == "10.0100" SQLObject-3.4.0/sqlobject/sresults.py0000644000175000017500000003537512472652005017120 0ustar phdphd00000000000000from . import dbconnection from . 
import sqlbuilder from .compat import string_type __all__ = ['SelectResults'] class SelectResults(object): IterationClass = dbconnection.Iteration def __init__(self, sourceClass, clause, clauseTables=None, **ops): self.sourceClass = sourceClass if clause is None or isinstance(clause, str) and clause == 'all': clause = sqlbuilder.SQLTrueClause if not isinstance(clause, sqlbuilder.SQLExpression): clause = sqlbuilder.SQLConstant(clause) self.clause = clause self.ops = ops if ops.get('orderBy', sqlbuilder.NoDefault) is sqlbuilder.NoDefault: ops['orderBy'] = sourceClass.sqlmeta.defaultOrder orderBy = ops['orderBy'] if isinstance(orderBy, (tuple, list)): orderBy = list(map(self._mungeOrderBy, orderBy)) else: orderBy = self._mungeOrderBy(orderBy) ops['dbOrderBy'] = orderBy if 'connection' in ops and ops['connection'] is None: del ops['connection'] if ops.get('limit', None): assert not ops.get('start', None) and not ops.get('end', None), \ "'limit' cannot be used with 'start' or 'end'" ops["start"] = 0 ops["end"] = ops.pop("limit") tablesSet = sqlbuilder.tablesUsedSet(self.clause, self._getConnection().dbName) if clauseTables: for table in clauseTables: tablesSet.add(table) self.clauseTables = clauseTables # Explicitly post-adding-in sqlmeta.table, # sqlbuilder.Select will handle sqlrepr'ing and dupes. self.tables = list(tablesSet) + [sourceClass.sqlmeta.table] def queryForSelect(self): columns = [self.sourceClass.q.id] + \ [getattr(self.sourceClass.q, x.name) for x in self.sourceClass.sqlmeta.columnList] query = sqlbuilder.Select(columns, where=self.clause, join=self.ops.get( 'join', sqlbuilder.NoDefault), distinct=self.ops.get('distinct', False), lazyColumns=self.ops.get( 'lazyColumns', False), start=self.ops.get('start', 0), end=self.ops.get('end', None), orderBy=self.ops.get( 'dbOrderBy', sqlbuilder.NoDefault), reversed=self.ops.get('reversed', False), staticTables=self.tables, forUpdate=self.ops.get('forUpdate', False)) return query def __repr__(self): return "<%s at %x>" % (self.__class__.__name__, id(self)) def _getConnection(self): return self.ops.get('connection') or self.sourceClass._connection def __str__(self): conn = self._getConnection() return conn.queryForSelect(self) def _mungeOrderBy(self, orderBy): if isinstance(orderBy, string_type) and orderBy.startswith('-'): orderBy = orderBy[1:] desc = True else: desc = False if isinstance(orderBy, string_type): if orderBy in self.sourceClass.sqlmeta.columns: val = getattr(self.sourceClass.q, self.sourceClass.sqlmeta.columns[orderBy].name) if desc: return sqlbuilder.DESC(val) else: return val else: orderBy = sqlbuilder.SQLConstant(orderBy) if desc: return sqlbuilder.DESC(orderBy) else: return orderBy else: return orderBy def clone(self, **newOps): ops = self.ops.copy() ops.update(newOps) return self.__class__(self.sourceClass, self.clause, self.clauseTables, **ops) def orderBy(self, orderBy): return self.clone(orderBy=orderBy) def connection(self, conn): return self.clone(connection=conn) def limit(self, limit): return self[:limit] def lazyColumns(self, value): return self.clone(lazyColumns=value) def reversed(self): return self.clone(reversed=not self.ops.get('reversed', False)) def distinct(self): return self.clone(distinct=True) def newClause(self, new_clause): return self.__class__(self.sourceClass, new_clause, self.clauseTables, **self.ops) def filter(self, filter_clause): if filter_clause is None: # None doesn't filter anything, it's just a no-op: return self clause = self.clause if isinstance(clause, string_type): clause = 
sqlbuilder.SQLConstant('(%s)' % clause) return self.newClause(sqlbuilder.AND(clause, filter_clause)) def __getitem__(self, value): if isinstance(value, slice): assert not value.step, "Slices do not support steps" if not value.start and not value.stop: # No need to copy, I'm immutable return self # Negative indexes aren't handled (and everything we # don't handle ourselves we just create a list to # handle) if (value.start and value.start < 0) \ or (value.stop and value.stop < 0): if value.start: if value.stop: return list(self)[value.start:value.stop] return list(self)[value.start:] return list(self)[:value.stop] if value.start: assert value.start >= 0 start = self.ops.get('start', 0) + value.start if value.stop is not None: assert value.stop >= 0 if value.stop < value.start: # an empty result: end = start else: end = value.stop + self.ops.get('start', 0) if self.ops.get('end', None) is not None and \ self.ops['end'] < end: # truncated by previous slice: end = self.ops['end'] else: end = self.ops.get('end', None) else: start = self.ops.get('start', 0) end = value.stop + start if self.ops.get('end', None) is not None \ and self.ops['end'] < end: end = self.ops['end'] return self.clone(start=start, end=end) else: if value < 0: return list(iter(self))[value] else: start = self.ops.get('start', 0) + value return list(self.clone(start=start, end=start + 1))[0] def __iter__(self): # @@: This could be optimized, using a simpler algorithm # since we don't have to worry about garbage collection, # etc., like we do with .lazyIter() return iter(list(self.lazyIter())) def lazyIter(self): """ Returns an iterator that will lazily pull rows out of the database and return SQLObject instances """ conn = self._getConnection() return conn.iterSelect(self) def accumulate(self, *expressions): """ Use accumulate expression(s) to select result using another SQL select through current connection. Return the accumulate result """ conn = self._getConnection() exprs = [] for expr in expressions: if not isinstance(expr, sqlbuilder.SQLExpression): expr = sqlbuilder.SQLConstant(expr) exprs.append(expr) return conn.accumulateSelect(self, *exprs) def count(self): """ Counting elements of current select results """ assert not self.ops.get('start') and not self.ops.get('end'), \ "start/end/limit have no meaning with 'count'" assert not (self.ops.get('distinct') and (self.ops.get('start') or self.ops.get('end'))), \ "distinct-counting of sliced objects is not supported" if self.ops.get('distinct'): # Column must be specified, so we are using unique ID column. # COUNT(DISTINCT column) is supported by MySQL and PostgreSQL, # but not by SQLite. Perhaps more portable would be subquery: # SELECT COUNT(*) FROM (SELECT DISTINCT id FROM table) count = self.accumulate( 'COUNT(DISTINCT %s)' % self._getConnection().sqlrepr( self.sourceClass.q.id)) else: count = self.accumulate('COUNT(*)') if self.ops.get('start'): count -= self.ops['start'] if self.ops.get('end'): count = min(self.ops['end'] - self.ops.get('start', 0), count) return count def accumulateMany(self, *attributes): """ Making the expressions for count/sum/min/max/avg of a given select result attributes. 
`attributes` must be a list/tuple of pairs (func_name, attribute); `attribute` can be a column name (like 'a_column') or a dot-q attribute (like Table.q.aColumn) """ expressions = [] conn = self._getConnection() if self.ops.get('distinct'): distinct = 'DISTINCT ' else: distinct = '' for func_name, attribute in attributes: if not isinstance(attribute, str): attribute = conn.sqlrepr(attribute) expression = '%s(%s%s)' % (func_name, distinct, attribute) expressions.append(expression) return self.accumulate(*expressions) def accumulateOne(self, func_name, attribute): """ Making the sum/min/max/avg of a given select result attribute. `attribute` can be a column name (like 'a_column') or a dot-q attribute (like Table.q.aColumn) """ return self.accumulateMany((func_name, attribute)) def sum(self, attribute): return self.accumulateOne("SUM", attribute) def min(self, attribute): return self.accumulateOne("MIN", attribute) def avg(self, attribute): return self.accumulateOne("AVG", attribute) def max(self, attribute): return self.accumulateOne("MAX", attribute) def getOne(self, default=sqlbuilder.NoDefault): """ If a query is expected to only return a single value, using ``.getOne()`` will return just that value. If not results are found, ``SQLObjectNotFound`` will be raised, unless you pass in a default value (like ``.getOne(None)``). If more than one result is returned, ``SQLObjectIntegrityError`` will be raised. """ from . import main results = list(self) if not results: if default is sqlbuilder.NoDefault: raise main.SQLObjectNotFound( "No results matched the query for %s" % self.sourceClass.__name__) return default if len(results) > 1: raise main.SQLObjectIntegrityError( "More than one result returned from query: %s" % results) return results[0] def throughTo(self): class _throughTo_getter(object): def __init__(self, inst): self.sresult = inst def __getattr__(self, attr): return self.sresult._throughTo(attr) return _throughTo_getter(self) throughTo = property(throughTo) def _throughTo(self, attr): otherClass = None orderBy = sqlbuilder.NoDefault ref = self.sourceClass.sqlmeta.columns.get( attr.endswith('ID') and attr or attr + 'ID', None) if ref and ref.foreignKey: otherClass, clause = self._throughToFK(ref) else: join = [x for x in self.sourceClass.sqlmeta.joins if x.joinMethodName == attr] if join: join = join[0] orderBy = join.orderBy if hasattr(join, 'otherColumn'): otherClass, clause = self._throughToRelatedJoin(join) else: otherClass, clause = self._throughToMultipleJoin(join) if not otherClass: raise AttributeError( "throughTo argument (got %s) should be " "name of foreignKey or SQL*Join in %s" % (attr, self.sourceClass)) return otherClass.select(clause, orderBy=orderBy, connection=self._getConnection()) def _throughToFK(self, col): otherClass = getattr(self.sourceClass, "_SO_class_" + col.foreignKey) colName = col.name query = self.queryForSelect().newItems([ sqlbuilder.ColumnAS(getattr(self.sourceClass.q, colName), colName) ]).orderBy(None).distinct() query = sqlbuilder.Alias(query, "%s_%s" % (self.sourceClass.__name__, col.name)) return otherClass, otherClass.q.id == getattr(query.q, colName) def _throughToMultipleJoin(self, join): otherClass = join.otherClass colName = join.soClass.sqlmeta.style.\ dbColumnToPythonAttr(join.joinColumn) query = self.queryForSelect().newItems( [sqlbuilder.ColumnAS(self.sourceClass.q.id, 'id')]).\ orderBy(None).distinct() query = sqlbuilder.Alias(query, "%s_%s" % (self.sourceClass.__name__, join.joinMethodName)) joinColumn = getattr(otherClass.q, colName) 
return otherClass, joinColumn == query.q.id def _throughToRelatedJoin(self, join): otherClass = join.otherClass intTable = sqlbuilder.Table(join.intermediateTable) colName = join.joinColumn query = self.queryForSelect().newItems( [sqlbuilder.ColumnAS(self.sourceClass.q.id, 'id')]).\ orderBy(None).distinct() query = sqlbuilder.Alias(query, "%s_%s" % (self.sourceClass.__name__, join.joinMethodName)) clause = sqlbuilder.AND( otherClass.q.id == getattr(intTable, join.otherColumn), getattr(intTable, colName) == query.q.id) return otherClass, clause SQLObject-3.4.0/sqlobject/rdbhost/0000755000175000017500000000000013141371614016310 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/rdbhost/__init__.py0000644000175000017500000000030412464733165020430 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . import rdbhostconnection return rdbhostconnection.RdbhostConnection registerConnection(['rdbhost'], builder) SQLObject-3.4.0/sqlobject/rdbhost/rdbhostconnection.py0000644000175000017500000000360313012667617022421 0ustar phdphd00000000000000""" This module written by David Keeney, 2009, 2010 Released under the LGPL for use with the SQLObject ORM library. """ from sqlobject.dbconnection import DBAPI from sqlobject.postgres.pgconnection import PostgresConnection class RdbhostConnection(PostgresConnection): supportTransactions = False dbName = 'rdbhost' schemes = [dbName] def __init__(self, dsn=None, host=None, port=None, db=None, user=None, password=None, unicodeCols=False, **kw): from rdbhdb import rdbhdb as rdb # monkey patch % escaping into Cursor._execute old_execute = getattr(rdb.Cursor, '_execute') setattr(rdb.Cursor, '_old_execute', old_execute) def _execute(self, query, *args): assert not any([a for a in args]) query = query.replace('%', '%%') self._old_execute(query, (), (), ()) setattr(rdb.Cursor, '_execute', _execute) self.module = rdb self.user = user self.host = host self.port = port self.db = db self.password = password self.dsn_dict = dsn_dict = {} self.use_dsn = dsn is not None if host: dsn_dict["host"] = host if user: dsn_dict["role"] = user if password: dsn_dict["authcode"] = password if dsn is None: dsn = [] if db: dsn.append('dbname=%s' % db) if user: dsn.append('user=%s' % user) if password: dsn.append('password=%s' % password) if host: dsn.append('host=%s' % host) if port: dsn.append('port=%d' % port) dsn = ' '.join(dsn) self.dsn = dsn self.unicodeCols = unicodeCols self.schema = kw.pop('schema', None) self.dbEncoding = 'utf-8' DBAPI.__init__(self, **kw) SQLObject-3.4.0/sqlobject/versioning/0000755000175000017500000000000013141371614017026 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/versioning/__init__.py0000644000175000017500000000754112470177652021160 0ustar phdphd00000000000000from datetime import datetime from sqlobject import col, events, SQLObject, AND class Version(SQLObject): def restore(self): values = self.sqlmeta.asDict() del values['id'] del values['masterID'] del values['dateArchived'] for _col in self.extraCols: del values[_col] self.masterClass.get(self.masterID).set(**values) def nextVersion(self): version = self.select( AND(self.q.masterID == self.masterID, self.q.id > self.id), orderBy=self.q.id) if version.count(): return version[0] else: return self.master def getChangedFields(self): next = self.nextVersion() columns = self.masterClass.sqlmeta.columns fields = [] for column in columns: if column not in ["dateArchived", "id", "masterID"]: if getattr(self, column) != getattr(next, column): 
fields.append(column.title()) return fields @classmethod def select(cls, clause=None, *args, **kw): if not getattr(cls, '_connection', None): cls._connection = cls.masterClass._connection return super(Version, cls).select(clause, *args, **kw) def __getattr__(self, attr): if attr in self.__dict__: return self.__dict__[attr] else: return getattr(self.master, attr) def getColumns(columns, cls): for column, defi in cls.sqlmeta.columnDefinitions.items(): if column.endswith("ID") and isinstance(defi, col.ForeignKey): column = column[:-2] # remove incompatible constraints kwds = dict(defi._kw) for kw in ["alternateID", "unique"]: if kw in kwds: del kwds[kw] columns[column] = defi.__class__(**kwds) # ascend heirarchy if cls.sqlmeta.parentClass: getColumns(columns, cls.sqlmeta.parentClass) class Versioning(object): def __init__(self, extraCols=None): if extraCols: self.extraCols = extraCols else: self.extraCols = {} pass def __addtoclass__(self, soClass, name): self.name = name self.soClass = soClass attrs = {'dateArchived': col.DateTimeCol(default=datetime.now), 'master': col.ForeignKey(self.soClass.__name__), 'masterClass': self.soClass, 'extraCols': self.extraCols } getColumns(attrs, self.soClass) attrs.update(self.extraCols) self.versionClass = type(self.soClass.__name__ + 'Versions', (Version,), attrs) if '_connection' in self.soClass.__dict__: self.versionClass._connection = \ self.soClass.__dict__['_connection'] events.listen(self.createTable, soClass, events.CreateTableSignal) events.listen(self.rowUpdate, soClass, events.RowUpdateSignal) def createVersionTable(self, cls, conn): self.versionClass.createTable(ifNotExists=True, connection=conn) def createTable(self, soClass, connection, extra_sql, post_funcs): assert soClass is self.soClass post_funcs.append(self.createVersionTable) def rowUpdate(self, instance, kwargs): if instance.childName and instance.childName != self.soClass.__name__: return # if you want your child class versioned, version it values = instance.sqlmeta.asDict() del values['id'] values['masterID'] = instance.id self.versionClass(connection=instance._connection, **values) def __get__(self, obj, type=None): if obj is None: return self return self.versionClass.select( self.versionClass.q.masterID == obj.id, connection=obj._connection) SQLObject-3.4.0/sqlobject/versioning/test/0000755000175000017500000000000013141371614020005 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/versioning/test/__init__.py0000644000175000017500000000000010545027231022102 0ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/versioning/test/test_version.py0000644000175000017500000001152013036106435023102 0ustar phdphd00000000000000from sqlobject import ForeignKey, IntCol, SQLObject, StringCol from sqlobject.inheritance import InheritableSQLObject from sqlobject.versioning import Versioning from sqlobject.tests.dbtest import setupClass class MyClass(SQLObject): name = StringCol() versions = Versioning() class Base(InheritableSQLObject): name = StringCol() so_value = IntCol(default=0) versions = Versioning() class Child(Base): toy = StringCol() class Government(InheritableSQLObject): name = StringCol() class Monarchy(Government): monarch = StringCol() versions = Versioning() class VChild(Base): weapon = StringCol() versions = Versioning() class HasForeign(SQLObject): foreign = ForeignKey("Base") versions = Versioning() def _set_extra(): return "read all about it" class Extra(SQLObject): name = StringCol() versions = Versioning( extraCols={'extra': StringCol(default=_set_extra())}) class 
HasAltId(SQLObject): name = StringCol() altid = IntCol(alternateID=True) versions = Versioning() def setup(): classes = [MyClass, Base, Child, Government, Monarchy, VChild, Extra, HasAltId] if hasattr(HasForeign, "_connection"): classes.insert(0, HasForeign) else: classes.append(HasForeign) for cls in classes: if hasattr(cls, 'versions') and getattr(cls, "_connection", None) and \ cls._connection.tableExists(cls.sqlmeta.table): setupClass(cls.versions.versionClass) setupClass(cls) if hasattr(cls, 'versions'): setupClass(cls.versions.versionClass) for version in cls.versions.versionClass.select(): version.destroySelf() def test_versioning(): # the simple case setup() mc = MyClass(name='fleem') mc.set(name='morx') assert len(list(mc.versions)) == 1 assert mc.versions[0].name == "fleem" assert len(list(MyClass.select())) == 1 def test_inheritable_versioning(): setup() # base versioned, child unversioned base = Base(name='fleem') base.set(name='morx') assert len(list(base.versions)) == 1 assert base.versions[0].name == "fleem" assert len(list(Base.select())) == 1 child = Child(name='child', toy='nintendo') child.set(name='teenager', toy='guitar') assert len(list(child.versions)) == 0 # child versioned, base unversioned government = Government(name='canada') assert not hasattr(government, 'versions') monarchy = Monarchy(name='UK', monarch='king george iv') assert len(list(monarchy.versions)) == 0 monarchy.set(name='queen elisabeth ii') assert len(list(monarchy.versions)) == 1 assert monarchy.versions[0].name == "UK" assert len(list(Monarchy.select())) == 1 # both parent and child versioned num_base_versions = len(list(base.versions)) vchild = VChild(name='kid', weapon='slingshot') vchild.set(name='toon', weapon='dynamite') assert len(list(base.versions)) == num_base_versions assert len(list(vchild.versions)) == 1 # test setting using setattr directly rather than .set vchild.name = "newname" assert len(list(vchild.versions)) == 2 def test_restore(): setup() base = Base(name='fleem') base.set(name='morx') assert base.name == "morx" base.versions[0].restore() assert base.name == "fleem" monarchy = Monarchy(name='USA', monarch='Emperor Norton I') monarchy.set(name='morx') assert monarchy.name == "morx" monarchy.versions[0].restore() assert monarchy.name == "USA" assert monarchy.monarch == "Emperor Norton I" extra = Extra(name='fleem') extra.set(name='morx') assert extra.name == "morx" extra.versions[0].restore() assert extra.name == "fleem" def test_next(): setup() base = Base(name='first', so_value=1) base.set(name='second') base.set(name='third', so_value=2) version = base.versions[0] assert version.nextVersion() == base.versions[1] assert version.nextVersion().nextVersion() == base def test_get_changed(): setup() base = Base(name='first', so_value=1) base.set(name='second') base.set(name='third', so_value=2) assert base.versions[0].getChangedFields() == ['Name'] assert sorted(base.versions[1].getChangedFields()) == ['Name', 'So_Value'] def test_foreign_keys(): setup() base1 = Base(name='first', so_value=1) base2 = Base(name='first', so_value=1) has_foreign = HasForeign(foreign=base1) has_foreign.foreign = base2 assert has_foreign.versions[0].foreign == base1 def test_extra(): setup() extra = Extra(name='title') extra.name = 'new' assert extra.versions[0].extra == 'read all about it' assert sorted(extra.versions[0].getChangedFields()) == ['Name'] def test_altid(): setup() extra = HasAltId(name="fleem", altid=5) extra.name = "morx" 
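# --- Illustrative sketch (not part of the original test suite) ---
# The tests above exercise the Versioning extension through the shared test
# harness.  The function below is a minimal, hedged usage example that reuses
# the MyClass table and the setup() helper defined in this module, and assumes
# the same database connection the other tests run against.  It is not named
# test_* on purpose, so pytest will not collect it.
def example_restore_previous_version():
    setup()
    mc = MyClass(name='draft')
    mc.set(name='final')            # the old row is archived automatically
    assert mc.versions.count() == 1
    assert mc.versions[0].name == 'draft'
    mc.versions[0].restore()        # copy the archived values back onto mc
    assert mc.name == 'draft'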
SQLObject-3.4.0/sqlobject/util/0000755000175000017500000000000013141371614015620 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/util/threadinglocal.py0000644000175000017500000000023011537673704021161 0ustar phdphd00000000000000try: from threading import local except ImportError: # No threads, so "thread local" means process-global class local(object): pass SQLObject-3.4.0/sqlobject/util/__init__.py0000644000175000017500000000000210223346763017726 0ustar phdphd00000000000000# SQLObject-3.4.0/sqlobject/util/csvimport.py0000644000175000017500000002572612470215762020241 0ustar phdphd00000000000000""" Import from a CSV file or directory of files. CSV files should have a header line that lists columns. Headers can also be appended with ``:type`` to indicate the type of the field. ``escaped`` is the default, though it can be overridden by the importer. Supported types: ``:python``: A python expression, run through ``eval()``. This can be a security risk, pass in ``allow_python=False`` if you don't want to allow it. ``:int``: Integer ``:float``: Float ``:str``: String ``:escaped``: A string with backslash escapes (note that you don't put quotation marks around the value) ``:base64``: A base64-encoded string ``:date``: ISO date, like YYYY-MM-DD; this can also be ``NOW+days`` or ``NOW-days`` ``:datetime``: ISO date/time like YYYY-MM-DDTHH:MM:SS (either T or a space can be used to separate the time, and seconds are optional). This can also be ``NOW+seconds`` or ``NOW-seconds`` ``:bool``: Converts true/false/yes/no/on/off/1/0 to boolean value ``:ref``: This will be resolved to the ID of the object named in this column (None if the column is empty). @@: Since there's no ordering, there's no way to promise the object already exists. You can also get back references to the objects if you have a special ``[name]`` column. Any column named ``[comment]`` or with no name will be ignored. In any column you can put ``[default]`` to exclude the value and use whatever default the class wants. ``[null]`` will use NULL. Lines that begin with ``[comment]`` are ignored. """ import csv from datetime import datetime, date, timedelta import os import time import types __all__ = ['load_csv_from_directory', 'load_csv', 'create_data'] DEFAULT_TYPE = 'escaped' def create_data(data, class_getter, keyorder=None): """ Create the ``data``, which is the return value from ``load_csv()``. Classes will be resolved with the callable ``class_getter``; or if ``class_getter`` is a module then the class names will be attributes of that. Returns a dictionary of ``{object_name: object(s)}``, using the names from the ``[name]`` columns (if there are any). If a name is used multiple times, you get a list of objects, not a single object. If ``keyorder`` is given, then the keys will be retrieved in that order. It can be a list/tuple of names, or a sorting function. If not given and ``class_getter`` is a module and has a ``soClasses`` function, then that will be used for the order. 
""" objects = {} classnames = data.keys() if (not keyorder and isinstance(class_getter, types.ModuleType) and hasattr(class_getter, 'soClasses')): keyorder = [c.__name__ for c in class_getter.soClasses] if not keyorder: classnames.sort() elif isinstance(keyorder, (list, tuple)): all = classnames classnames = [name for name in keyorder if name in classnames] for name in all: if name not in classnames: classnames.append(name) else: classnames.sort(keyorder) for classname in classnames: items = data[classname] if not items: continue if isinstance(class_getter, types.ModuleType): soClass = getattr(class_getter, classname) else: soClass = class_getter(classname) for item in items: for key, value in item.items(): if isinstance(value, Reference): resolved = objects.get(value.name) if not resolved: raise ValueError( "Object reference to %r does not have target" % value.name) elif (isinstance(resolved, list) and len(resolved) > 1): raise ValueError( "Object reference to %r is ambiguous (got %r)" % (value.name, resolved)) item[key] = resolved.id if '[name]' in item: name = item.pop('[name]').strip() else: name = None inst = soClass(**item) if name: if name in objects: if isinstance(objects[name], list): objects[name].append(inst) else: objects[name] = [objects[name], inst] else: objects[name] = inst return objects def load_csv_from_directory(directory, allow_python=True, default_type=DEFAULT_TYPE, allow_multiple_classes=True): """ Load the data from all the files in a directory. Filenames indicate the class, with ``general.csv`` for data not associated with a class. Return data just like ``load_csv`` does. This might cause problems on case-insensitive filesystems. """ results = {} for filename in os.listdir(directory): base, ext = os.path.splitext(filename) if ext.lower() != '.csv': continue f = open(os.path.join(directory, filename), 'rb') csvreader = csv.reader(f) data = load_csv(csvreader, allow_python=allow_python, default_type=default_type, default_class=base, allow_multiple_classes=allow_multiple_classes) f.close() for classname, items in data.items(): results.setdefault(classname, []).extend(items) return results def load_csv(csvreader, allow_python=True, default_type=DEFAULT_TYPE, default_class=None, allow_multiple_classes=True): """ Loads the CSV file, returning a list of dictionaries with types coerced. 
""" current_class = default_class current_headers = None results = {} for row in csvreader: if not [cell for cell in row if cell.strip()]: # empty row continue if row and row[0].strip() == 'CLASS:': if not allow_multiple_classes: raise ValueError( "CLASS: line in CSV file, but multiple classes " "are not allowed in this file (line: %r)" % row) if not row[1:]: raise ValueError( "CLASS: in line in CSV file, with no class name " "in next column (line: %r)" % row) current_class = row[1] current_headers = None continue if not current_class: raise ValueError( "No CLASS: line given, and there is no default class " "for this file (line: %r)" % row) if current_headers is None: current_headers = _parse_headers(row, default_type, allow_python=allow_python) continue if row[0] == '[comment]': continue # Pad row with empty strings: row += [''] * (len(current_headers) - len(row)) row_converted = {} for value, (name, coercer, args) in zip(row, current_headers): if name is None: # Comment continue if value == '[default]': continue if value == '[null]': row_converted[name] = None continue args = (value,) + args row_converted[name] = coercer(*args) results.setdefault(current_class, []).append(row_converted) return results def _parse_headers(header_row, default_type, allow_python=True): headers = [] for name in header_row: original_name = name if ':' in name: name, type = name.split(':', 1) else: type = default_type if type == 'python' and not allow_python: raise ValueError( ":python header given when python headers are not allowed " "(with header %r)" % original_name) name = name.strip() if name == '[comment]' or not name: headers.append((None, None, None)) continue type = type.strip().lower() if '(' in type: type, arg = type.split('(', 1) if not arg.endswith(')'): raise ValueError( "Arguments (in ()'s) do not end with ): %r" % original_name) args = (arg[:-1],) else: args = () if name == '[name]': type = 'str' coercer, args = get_coercer(type) headers.append((name, coercer, args)) return headers _coercers = {} def get_coercer(type): if type not in _coercers: raise ValueError( "Coercion type %r not known (I know: %s)" % (type, ', '.join(_coercers.keys()))) return _coercers[type] def register_coercer(type, coercer, *args): _coercers[type] = (coercer, args) def identity(v): return v register_coercer('str', identity) register_coercer('string', identity) def decode_string(v, encoding): return v.decode(encoding) register_coercer('escaped', decode_string, 'string_escape') register_coercer('strescaped', decode_string, 'string_escape') register_coercer('base64', decode_string, 'base64') register_coercer('int', int) register_coercer('float', float) def parse_python(v): return eval(v, {}, {}) register_coercer('python', parse_python) def parse_date(v): v = v.strip() if not v: return None if v.startswith('NOW-') or v.startswith('NOW+'): days = int(v[3:]) now = date.today() return now + timedelta(days) else: parsed = time.strptime(v, '%Y-%m-%d') return date.fromtimestamp(time.mktime(parsed)) register_coercer('date', parse_date) def parse_datetime(v): v = v.strip() if not v: return None if v.startswith('NOW-') or v.startswith('NOW+'): seconds = int(v[3:]) now = datetime.now() return now + timedelta(0, seconds) else: fmts = ['%Y-%m-%dT%H:%M:%S', '%Y-%m-%d %H:%M:%S', '%Y-%m-%dT%H:%M', '%Y-%m-%d %H:%M'] for fmt in fmts[:-1]: try: parsed = time.strptime(v, fmt) break except ValueError: pass else: parsed = time.strptime(v, fmts[-1]) return datetime.fromtimestamp(time.mktime(parsed)) register_coercer('datetime', parse_datetime) 
class Reference(object): def __init__(self, name): self.name = name def parse_ref(v): if not v.strip(): return None else: return Reference(v) register_coercer('ref', parse_ref) def parse_bool(v): v = v.strip().lower() if v in ('y', 'yes', 't', 'true', 'on', '1'): return True elif v in ('n', 'no', 'f', 'false', 'off', '0'): return False raise ValueError( "Value is not boolean-like: %r" % v) register_coercer('bool', parse_bool) register_coercer('boolean', parse_bool) SQLObject-3.4.0/sqlobject/util/csvexport.py0000644000175000017500000001561113043410220020217 0ustar phdphd00000000000000""" Exports a SQLObject class (possibly annotated) to a CSV file. """ import csv import os try: from cStringIO import StringIO except ImportError: try: from StringIO import StringIO except ImportError: from io import StringIO, BytesIO import sqlobject from sqlobject.compat import PY2, string_type __all__ = ['export_csv', 'export_csv_zip'] def export_csv(soClass, select=None, writer=None, connection=None, orderBy=None): """ Export the SQLObject class ``soClass`` to a CSV file. ``soClass`` can also be a SelectResults object, as returned by ``.select()``. If it is a class, all objects will be retrieved, ordered by ``orderBy`` if given, or the ``.csvOrderBy`` attribute if present (but csvOrderBy will only be applied when no select result is given). You can also pass in select results (or simply a list of instances) in ``select`` -- if you have a list of objects (not a SelectResults instance, as produced by ``.select()``) then you must pass it in with ``select`` and pass the class in as the first argument. ``writer`` is a ``csv.writer()`` object, or a file-like object. If not given, the string of the file will be returned. Uses ``connection`` as the data source, if given, otherwise the default connection. Columns can be annotated with ``.csvTitle`` attributes, which will form the attributes of the columns, or 'title' (secondarily), or if nothing then the column attribute name. If a column has a ``.noCSV`` attribute which is true, then the column will be suppressed. Additionally a class can have an ``.extraCSVColumns`` attribute, which should be a list of strings/tuples. If a tuple, it should be like ``(attribute, title)``, otherwise it is the attribute, which will also be the title. These will be appended to the end of the CSV file; the attribute will be retrieved from instances. Also a ``.csvColumnOrder`` attribute can be on the class, which is the string names of attributes in the order they should be presented. 
""" return_fileobj = None if not writer: return_fileobj = StringIO() writer = csv.writer(return_fileobj) elif not hasattr(writer, 'writerow'): writer = csv.writer(writer) if isinstance(soClass, sqlobject.SQLObject.SelectResultsClass): assert select is None, ( "You cannot pass in a select argument (%r) " "and a SelectResults argument (%r) for soClass" % (select, soClass)) select = soClass soClass = select.sourceClass elif select is None: select = soClass.select() if getattr(soClass, 'csvOrderBy', None): select = select.orderBy(soClass.csvOrderBy) if orderBy: select = select.orderBy(orderBy) if connection: select = select.connection(connection) _actually_export_csv(soClass, select, writer) if return_fileobj: # They didn't pass any writer or file object in, so we return # the string result: return return_fileobj.getvalue() def _actually_export_csv(soClass, select, writer): attributes, titles = _find_columns(soClass) writer.writerow(titles) for soInstance in select: row = [getattr(soInstance, attr) for attr in attributes] writer.writerow(row) def _find_columns(soClass): order = [] attrs = {} for col in soClass.sqlmeta.columnList: if getattr(col, 'noCSV', False): continue order.append(col.name) title = col.name if hasattr(col, 'csvTitle'): title = col.csvTitle elif getattr(col, 'title', None) is not None: title = col.title attrs[col.name] = title for attrDesc in getattr(soClass, 'extraCSVColumns', []): if isinstance(attrDesc, (list, tuple)): attr, title = attrDesc else: attr = title = attrDesc order.append(attr) attrs[attr] = title if hasattr(soClass, 'csvColumnOrder'): oldOrder = order order = soClass.csvColumnOrder for attr in order: if attr not in oldOrder: raise KeyError( "Attribute %r in csvColumnOrder (on class %r) " "does not exist as a column or in .extraCSVColumns " "(I have: %r)" % (attr, soClass, oldOrder)) oldOrder.remove(attr) order.extend(oldOrder) titles = [attrs[attr] for attr in order] return order, titles def export_csv_zip(soClasses, file=None, zip=None, filename_prefix='', connection=None): """ Export several SQLObject classes into a .zip file. Each item in the ``soClasses`` list may be a SQLObject class, select result, or ``(soClass, select)`` tuple. Each file in the zip will be named after the class name (with ``.csv`` appended), or using the filename in the ``.csvFilename`` attribute. If ``file`` is given, the zip will be written to that. ``file`` may be a string (a filename) or a file-like object. If not given, a string will be returnd. If ``zip`` is given, then the files will be written to that zip file. All filenames will be prefixed with ``filename_prefix`` (which may be a directory name, for instance). 
""" import zipfile close_file_when_finished = False close_zip_when_finished = True return_when_finished = False if file: if isinstance(file, string_type): close_file_when_finished = True file = open(file, 'wb') elif zip: close_zip_when_finished = False else: return_when_finished = True if PY2: file = StringIO() else: # zipfile on python3 requires BytesIO file = BytesIO() if not zip: zip = zipfile.ZipFile(file, mode='w') try: _actually_export_classes(soClasses, zip, filename_prefix, connection) finally: if close_zip_when_finished: zip.close() if close_file_when_finished: file.close() if return_when_finished: return file.getvalue() def _actually_export_classes(soClasses, zip, filename_prefix, connection): for classDesc in soClasses: if isinstance(classDesc, (tuple, list)): soClass, select = classDesc elif isinstance(classDesc, sqlobject.SQLObject.SelectResultsClass): select = classDesc soClass = select.sourceClass else: soClass = classDesc select = None filename = getattr(soClass, 'csvFilename', soClass.__name__) if not os.path.splitext(filename)[1]: filename += '.csv' filename = filename_prefix + filename zip.writestr(filename, export_csv(soClass, select, connection=connection)) SQLObject-3.4.0/sqlobject/util/moduleloader.py0000644000175000017500000000251012465143172020650 0ustar phdphd00000000000000import imp import os import sys def load_module(module_name): mod = __import__(module_name) components = module_name.split('.') for comp in components[1:]: mod = getattr(mod, comp) return mod def load_module_from_name(filename, module_name): if module_name in sys.modules: return sys.modules[module_name] init_filename = os.path.join(os.path.dirname(filename), '__init__.py') if not os.path.exists(init_filename): try: f = open(init_filename, 'w') except (OSError, IOError) as e: raise IOError( 'Cannot write __init__.py file into directory %s (%s)\n' % (os.path.dirname(filename), e)) f.write('#\n') f.close() fp = None if module_name in sys.modules: return sys.modules[module_name] if '.' in module_name: parent_name = '.'.join(module_name.split('.')[:-1]) base_name = module_name.split('.')[-1] load_module_from_name(os.path.dirname(filename), parent_name) else: base_name = module_name fp = None try: fp, pathname, stuff = imp.find_module( base_name, [os.path.dirname(filename)]) module = imp.load_module(module_name, fp, pathname, stuff) finally: if fp is not None: fp.close() return module SQLObject-3.4.0/sqlobject/maxdb/0000755000175000017500000000000013141371614015736 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/maxdb/maxdbconnection.py0000644000175000017500000002514712466177021021501 0ustar phdphd00000000000000""" Contributed by Edigram SAS, Paris France Tel:01 44 77 94 00 Ahmed MOHAMED ALI 27 April 2004 This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. 
connection creation sample:: __connection__ = DBConnection.maxdbConnection( host=hostname, database=dbname, user=user_name, password=user_password, autoCommit=1, debug=1) """ import os from sqlobject.dbconnection import DBAPI from sqlobject import col class maxdbException(Exception): def __init__(self, value): self.value = value def __str__(self): return repr(self.value) class LowerBoundOfSliceIsNotSupported(maxdbException): def __init__(self, value): maxdbException.__init__(self, '') class IncorrectIDStyleError(maxdbException): def __init__(self, value): maxdbException.__init__( self, 'This primary key name is not in the expected style, ' 'please rename the column to %r or switch to another style' % value) class StyleMismatchError(maxdbException): def __init__(self, value): maxdbException.__init__( self, 'The name %r is only permitted for primary key, change the ' 'column name or switch to another style' % value) class PrimaryKeyNotFounded(maxdbException): def __init__(self, value): maxdbException.__init__( self, "No primary key was defined on table %r" % value) SAPDBMAX_ID_LENGTH = 32 class MaxdbConnection(DBAPI): supportTransactions = True dbName = 'maxdb' schemes = [dbName] def __init__(self, host='', port=None, user=None, password=None, database=None, autoCommit=1, sqlmode='internal', isolation=None, timeout=None, **kw): from sapdb import dbapi self.module = dbapi self.host = host self.port = port self.user = user self.password = password self.db = database self.autoCommit = autoCommit self.sqlmode = sqlmode self.isolation = isolation self.timeout = timeout DBAPI.__init__(self, **kw) @classmethod def _connectionFromParams(cls, auth, password, host, port, path, args): path = path.replace('/', os.path.sep) return cls(host, port, user=auth, password=password, database=path, **args) def _getConfigParams(self, sqlmode, auto): autocommit = 'off' if auto: autocommit = 'on' opt = {} opt["autocommit"] = autocommit opt["sqlmode"] = sqlmode if self.isolation: opt["isolation"] = self.isolation if self.timeout: opt["timeout"] = self.timeout return opt def _setAutoCommit(self, conn, auto): conn.close() conn.__init__(self.user, self.password, self.db, self.host, **self._getConfigParams(self.sqlmode, auto)) def createSequenceName(self, table): """ sequence name are builded with the concatenation of the table name with '_SEQ' word we truncate the name of the sequence_name because sapdb identifier cannot exceed 32 characters so that the name of the sequence does not exceed 32 characters """ return '%s_SEQ' % (table[:SAPDBMAX_ID_LENGTH - 4]) def makeConnection(self): conn = self.module.Connection( self.user, self.password, self.db, self.host, **self._getConfigParams(self.sqlmode, self.autoCommit)) return conn def _queryInsertID(self, conn, soInstance, id, names, values): table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName c = conn.cursor() if id is None: c.execute( 'SELECT %s.NEXTVAL FROM DUAL' % ( self.createSequenceName(table))) id = c.fetchone()[0] names = [idName] + names values = [id] + values q = self._insertSQL(table, names, values) if self.debug: self.printDebug(conn, q, 'QueryIns') c.execute(q) if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id @classmethod def sqlAddLimit(cls, query, limit): sql = query sql = sql.replace("SELECT", "SELECT ROWNO, ") if sql.find('WHERE') != -1: sql = sql + ' AND ' + limit else: sql = sql + 'WHERE ' + limit return sql @classmethod def _queryAddLimitOffset(cls, query, start, end): if start: raise 
LowerBoundOfSliceIsNotSupported limit = ' ROWNO <= %d ' % (end) return cls.sqlAddLimit(query, limit) def createTable(self, soClass): # We create the table in a transaction because the addition of the # table and the sequence must be atomic. # I tried to use the transaction class # but I get a recursion limit error. # t=self.transaction() # t.query('CREATE TABLE %s (\n%s\n)' % # (soClass.sqlmeta.table, self.createColumns(soClass))) # # t.query("CREATE SEQUENCE %s" % # self.createSequenceName(soClass.sqlmeta.table)) # t.commit() # so use transaction when the problem will be solved self.query('CREATE TABLE %s (\n%s\n)' % (soClass.sqlmeta.table, self.createColumns(soClass))) self.query("CREATE SEQUENCE %s" % self.createSequenceName(soClass.sqlmeta.table)) return [] def createReferenceConstraint(self, soClass, col): return col.maxdbCreateReferenceConstraint() def createColumn(self, soClass, col): return col.maxdbCreateSQL() def createIDColumn(self, soClass): key_type = {int: "INT", str: "TEXT"}[soClass.sqlmeta.idType] return '%s %s PRIMARY KEY' % (soClass.sqlmeta.idName, key_type) def createIndexSQL(self, soClass, index): return index.maxdbCreateIndexSQL(soClass) def dropTable(self, tableName, cascade=False): # We drop the table in a transaction because the removal of the # table and the sequence must be atomic. # I tried to use the transaction class # but I get a recursion limit error. # try: # t=self.transaction() # t.query("DROP TABLE %s" % tableName) # t.query("DROP SEQUENCE %s" % self.createSequenceName(tableName)) # t.commit() # except: # t.rollback() # so use transaction when the problem will be solved self.query("DROP TABLE %s" % tableName) self.query("DROP SEQUENCE %s" % self.createSequenceName(tableName)) def joinSQLType(self, join): return 'INT NOT NULL' def tableExists(self, tableName): for (table,) in self.queryAll( "SELECT OBJECT_NAME FROM ALL_OBJECTS " "WHERE OBJECT_TYPE='TABLE'"): if table.lower() == tableName.lower(): return True return False def addColumn(self, tableName, column): self.query('ALTER TABLE %s ADD %s' % (tableName, column.maxdbCreateSQL())) def delColumn(self, sqlmeta, column): self.query('ALTER TABLE %s DROP COLUMN %s' % (sqlmeta.table, column.dbName)) GET_COLUMNS = """ SELECT COLUMN_NAME, NULLABLE, DATA_DEFAULT, DATA_TYPE, DATA_LENGTH, DATA_SCALE FROM USER_TAB_COLUMNS WHERE TABLE_NAME=UPPER('%s')""" GET_PK_AND_FK = """ SELECT constraint_cols.column_name, constraints.constraint_type, refname,reftablename FROM user_cons_columns constraint_cols INNER JOIN user_constraints constraints ON constraint_cols.constraint_name = constraints.constraint_name LEFT OUTER JOIN show_foreign_key fk ON constraint_cols.column_name = fk.columnname WHERE constraints.table_name =UPPER('%s')""" def columnsFromSchema(self, tableName, soClass): colData = self.queryAll(self.GET_COLUMNS % tableName) results = [] keymap = {} pkmap = {} fkData = self.queryAll(self.GET_PK_AND_FK % tableName) for _col, cons_type, refcol, reftable in fkData: col_name = _col.lower() pkmap[col_name] = False if cons_type == 'R': keymap[col_name] = reftable.lower() elif cons_type == 'P': pkmap[col_name] = True if len(pkmap) == 0: raise PrimaryKeyNotFounded(tableName) for (field, nullAllowed, default, data_type, data_len, data_scale) in colData: # id is defined as primary key --> ok # We let sqlobject raise error if the 'id' is used # for another column. 
field_name = field.lower() if (field_name == soClass.sqlmeta.idName) and pkmap[field_name]: continue colClass, kw = self.guessClass(data_type, data_len, data_scale) kw['name'] = field_name kw['dbName'] = field if nullAllowed == 'Y': nullAllowed = False else: nullAllowed = True kw['notNone'] = nullAllowed if default is not None: kw['default'] = default if field_name in keymap: kw['foreignKey'] = keymap[field_name] results.append(colClass(**kw)) return results _numericTypes = ['INTEGER', 'INT', 'SMALLINT'] _dateTypes = ['DATE', 'TIME', 'TIMESTAMP'] def guessClass(self, t, flength, fscale=None): """ An internal method that tries to figure out what Col subclass is appropriate given whatever introspective information is available -- both very database-specific. """ if t in self._numericTypes: return col.IntCol, {} # The type returned by the sapdb library for LONG is # SapDB_LongReader To get the data call the read member with # desired size (default =-1 means get all) elif t.find('LONG') != -1: return col.StringCol, {'length': flength, 'varchar': False} elif t in self._dateTypes: return col.DateTimeCol, {} elif t == 'FIXED': return col.CurrencyCol, {'size': flength, 'precision': fscale} else: return col.Col, {} SQLObject-3.4.0/sqlobject/maxdb/__init__.py0000644000175000017500000000030512464733165020057 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . import maxdbconnection return maxdbconnection.MaxdbConnection registerConnection(['maxdb', 'sapdb'], builder) SQLObject-3.4.0/sqlobject/maxdb/readme.txt0000644000175000017500000000125610063265525017743 0ustar phdphd00000000000000This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. Author: Edigram SA - Paris France Tel:0144779400 SAPDBAPI installation --------------------- The sapdb module can be downloaded from: Win32 ------- ftp://ftp.sap.com/pub/sapdb/bin/win/sapdb-python-win32-7.4.03.31a.zip Linux ------ ftp://ftp.sap.com/pub/sapdb/bin/linux/sapdb-python-linux-i386-7.4.03.31a.tgz uncompress the archive and add the sapdb directory path to your PYTHONPATH. 
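Connecting from SQLObject
-------------------------

Once the sapdb package is importable, a MaxDB/SAP DB connection can be opened
through the usual URI mechanism.  This is only a sketch -- the user, password,
host and database names below are placeholders::

    from sqlobject import connectionForURI

    connection = connectionForURI('maxdb://user:password@dbhost/TESTDB')

The 'sapdb' scheme is registered as an alias for 'maxdb', so both URI prefixes
reach the same connection class.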
SQLObject-3.4.0/sqlobject/index.py0000644000175000017500000001470012470115365016331 0ustar phdphd00000000000000from itertools import count from .converters import sqlrepr creationOrder = count() class SODatabaseIndex(object): def __init__(self, soClass, name, columns, creationOrder, unique=False): self.soClass = soClass self.name = name self.descriptions = self.convertColumns(columns) self.creationOrder = creationOrder self.unique = unique def get(self, *args, **kw): if not self.unique: raise AttributeError( "'%s' object has no attribute 'get' " "(index is not unique)" % self.name) connection = kw.pop('connection', None) if args and kw: raise TypeError("You cannot mix named and unnamed arguments") columns = [d['column'] for d in self.descriptions if 'column' in d] if kw and len(kw) != len(columns) or \ args and len(args) != len(columns): raise TypeError( "get() takes exactly %d argument and an optional " "named argument 'connection' (%d given)" % ( len(columns), len(args) + len(kw))) if args: kw = {} for i in range(len(args)): if columns[i].foreignName is not None: kw[columns[i].foreignName] = args[i] else: kw[columns[i].name] = args[i] return self.soClass.selectBy(connection=connection, **kw).getOne() def convertColumns(self, columns): """ Converts all the columns to dictionary descriptors; dereferences string column names. """ new = [] for desc in columns: if not isinstance(desc, dict): desc = {'column': desc} if 'expression' in desc: assert 'column' not in desc, ( 'You cannot provide both an expression and a column ' '(for %s in index %s in %s)' % (desc, self.name, self.soClass)) assert 'length' not in desc, ( 'length does not apply to expressions (for %s in ' 'index %s in %s)' % (desc, self.name, self.soClass)) new.append(desc) continue columnName = desc['column'] if not isinstance(columnName, str): columnName = columnName.name colDict = self.soClass.sqlmeta.columns if columnName not in colDict: for possible in colDict.values(): if possible.origName == columnName: column = possible break else: # None found raise ValueError( "The column by the name %r was not found " "in the class %r" % (columnName, self.soClass)) else: column = colDict[columnName] desc['column'] = column new.append(desc) return new def getExpression(self, desc, db): if isinstance(desc['expression'], str): return desc['expression'] else: return sqlrepr(desc['expression'], db) def sqliteCreateIndexSQL(self, soClass): if self.unique: uniqueOrIndex = 'UNIQUE INDEX' else: uniqueOrIndex = 'INDEX' spec = [] for desc in self.descriptions: if 'expression' in desc: spec.append(self.getExpression(desc, 'sqlite')) else: spec.append(desc['column'].dbName) ret = 'CREATE %s %s_%s ON %s (%s)' % \ (uniqueOrIndex, self.soClass.sqlmeta.table, self.name, self.soClass.sqlmeta.table, ', '.join(spec)) return ret postgresCreateIndexSQL = maxdbCreateIndexSQL = mssqlCreateIndexSQL = \ sybaseCreateIndexSQL = firebirdCreateIndexSQL = sqliteCreateIndexSQL def mysqlCreateIndexSQL(self, soClass): if self.unique: uniqueOrIndex = 'UNIQUE' else: uniqueOrIndex = 'INDEX' spec = [] for desc in self.descriptions: if 'expression' in desc: spec.append(self.getExpression(desc, 'mysql')) elif 'length' in desc: spec.append('%s(%d)' % (desc['column'].dbName, desc['length'])) else: spec.append(desc['column'].dbName) return 'ALTER TABLE %s ADD %s %s (%s)' % \ (soClass.sqlmeta.table, uniqueOrIndex, self.name, ', '.join(spec)) class DatabaseIndex(object): """ This takes a variable number of parameters, each of which is a column for indexing. 
Each column may be a column object or the string name of the column (*not* the database name). You may also use dictionaries, to further customize the indexing of the column. The dictionary may have certain keys: 'column': The column object or string identifier. 'length': MySQL will only index the first N characters if this is given. For other databases this is ignored. 'expression': You can create an index based on an expression, e.g., 'lower(column)'. This can either be a string or a sqlbuilder expression. Further keys may be added to the column specs in the future. The class also take the keyword argument `unique`; if true then a UNIQUE index is created. """ baseClass = SODatabaseIndex def __init__(self, *columns, **kw): kw['columns'] = columns self.kw = kw self.creationOrder = next(creationOrder) def setName(self, value): assert self.kw.get('name') is None, \ "You cannot change a name after it has already been set " \ "(from %s to %s)" % (self.kw['name'], value) self.kw['name'] = value def _get_name(self): return self.kw['name'] def _set_name(self, value): self.setName(value) name = property(_get_name, _set_name) def withClass(self, soClass): return self.baseClass(soClass=soClass, creationOrder=self.creationOrder, **self.kw) def __repr__(self): return '<%s %s %s>' % ( self.__class__.__name__, hex(abs(id(self)))[2:], self.kw) __all__ = ['DatabaseIndex'] SQLObject-3.4.0/sqlobject/classregistry.py0000644000175000017500000001146112470415717020125 0ustar phdphd00000000000000""" classresolver.py 2 February 2004, Ian Bicking Resolves strings to classes, and runs callbacks when referenced classes are created. Classes are referred to only by name, not by module. So that identically-named classes can coexist, classes are put into individual registries, which are keyed on strings (names). These registries are created on demand. Use like:: >>> import classregistry >>> registry = classregistry.registry('MyModules') >>> def afterMyClassExists(cls): ... print('Class finally exists: %s' % cls) >>> registry.addClassCallback('MyClass', afterMyClassExists) >>> class MyClass: ... pass >>> registry.addClass(MyClass) Class finally exists: MyClass """ class ClassRegistry(object): """ We'll be dealing with classes that reference each other, so class C1 may reference C2 (in a join), while C2 references C1 right back. Since classes are created in an order, there will be a point when C1 exists but C2 doesn't. So we deal with classes by name, and after each class is created we try to fix up any references by replacing the names with actual classes. Here we keep a dictionaries of class names to classes -- note that the classes might be spread among different modules, so since we pile them together names need to be globally unique, to just module unique. Like needSet below, the container dictionary is keyed by the class registry. """ def __init__(self, name): self.name = name self.classes = {} self.callbacks = {} self.genericCallbacks = [] def addClassCallback(self, className, callback, *args, **kw): """ Whenever a name is substituted for the class, you can register a callback that will be called when the needed class is created. If it's already been created, the callback will be called immediately. """ if className in self.classes: callback(self.classes[className], *args, **kw) else: self.callbacks.setdefault(className, []).append( (callback, args, kw)) def addCallback(self, callback, *args, **kw): """ This callback is called for all classes, not just specific ones (like addClassCallback). 
""" self.genericCallbacks.append((callback, args, kw)) for cls in self.classes.values(): callback(cls, *args, **kw) def addClass(self, cls): """ Everytime a class is created, we add it to the registry, so that other classes can find it by name. We also call any callbacks that are waiting for the class. """ if cls.__name__ in self.classes: import sys other = self.classes[cls.__name__] raise ValueError( "class %s is already in the registry (other class is " "%r, from the module %s in %s; attempted new class is " "%r, from the module %s in %s)" % (cls.__name__, other, other.__module__, getattr(sys.modules.get(other.__module__), '__file__', '(unknown)'), cls, cls.__module__, getattr(sys.modules.get(cls.__module__), '__file__', '(unknown)'))) self.classes[cls.__name__] = cls if cls.__name__ in self.callbacks: for callback, args, kw in self.callbacks[cls.__name__]: callback(cls, *args, **kw) del self.callbacks[cls.__name__] for callback, args, kw in self.genericCallbacks: callback(cls, *args, **kw) def getClass(self, className): try: return self.classes[className] except KeyError: all = sorted(self.classes.keys()) raise KeyError( "No class %s found in the registry %s (these classes " "exist: %s)" % (className, self.name or '[default]', ', '.join(all))) def allClasses(self): return self.classes.values() class _MasterRegistry(object): """ This singleton holds all the class registries. There can be multiple registries to hold different unrelated sets of classes that reside in the same process. These registries are named with strings, and are created on demand. The MasterRegistry module global holds the singleton. """ def __init__(self): self.registries = {} def registry(self, item): if item not in self.registries: self.registries[item] = ClassRegistry(item) return self.registries[item] MasterRegistry = _MasterRegistry() registry = MasterRegistry.registry def findClass(name, class_registry=None): return registry(class_registry).getClass(name) SQLObject-3.4.0/sqlobject/compat.py0000644000175000017500000000163012735240476016511 0ustar phdphd00000000000000import sys import types # Credit to six authors: https://pypi.python.org/pypi/six # License: MIT def with_metaclass(meta, *bases): """Create a base class with a metaclass.""" # This requires a bit of explanation: the basic idea is to make a dummy # metaclass for one level of class instantiation that replaces itself with # the actual metaclass. class metaclass(meta): def __new__(cls, name, this_bases, d): return meta(name, bases, d) return type.__new__(metaclass, 'temporary_class', (), {}) # Compatability definitions (inspired by six) PY2 = sys.version_info[0] < 3 if PY2: # disable flake8 checks on python 3 string_type = basestring # noqa unicode_type = unicode # noqa class_types = (type, types.ClassType) buffer_type = buffer # noqa else: string_type = str unicode_type = str class_types = (type,) buffer_type = memoryview SQLObject-3.4.0/sqlobject/joins.py0000644000175000017500000004432013103631236016337 0ustar phdphd00000000000000from itertools import count from . import boundattributes from . import classregistry from . import events from . import styles from . 
import sqlbuilder from .styles import capword __all__ = ['MultipleJoin', 'SQLMultipleJoin', 'RelatedJoin', 'SQLRelatedJoin', 'SingleJoin', 'ManyToMany', 'OneToMany'] creationOrder = count() NoDefault = sqlbuilder.NoDefault def getID(obj): try: return obj.id except AttributeError: return int(obj) class Join(object): def __init__(self, otherClass=None, **kw): kw['otherClass'] = otherClass self.kw = kw self._joinMethodName = self.kw.pop('joinMethodName', None) self.creationOrder = next(creationOrder) def _set_joinMethodName(self, value): assert self._joinMethodName == value or self._joinMethodName is None, \ "You have already given an explicit joinMethodName (%s), " \ "and you are now setting it to %s" % (self._joinMethodName, value) self._joinMethodName = value def _get_joinMethodName(self): return self._joinMethodName joinMethodName = property(_get_joinMethodName, _set_joinMethodName) name = joinMethodName def withClass(self, soClass): if 'joinMethodName' in self.kw: self._joinMethodName = self.kw['joinMethodName'] del self.kw['joinMethodName'] return self.baseClass(creationOrder=self.creationOrder, soClass=soClass, joinDef=self, joinMethodName=self._joinMethodName, **self.kw) # A join is separate from a foreign key, i.e., it is # many-to-many, or one-to-many where the *other* class # has the foreign key. class SOJoin(object): def __init__(self, creationOrder, soClass=None, otherClass=None, joinColumn=None, joinMethodName=None, orderBy=NoDefault, joinDef=None): self.creationOrder = creationOrder self.soClass = soClass self.joinDef = joinDef self.otherClassName = otherClass classregistry.registry(soClass.sqlmeta.registry).addClassCallback( otherClass, self._setOtherClass) self.joinColumn = joinColumn self.joinMethodName = joinMethodName self._orderBy = orderBy if not self.joinColumn: # Here we set up the basic join, which is # one-to-many, where the other class points to # us. 
self.joinColumn = styles.getStyle( self.soClass).tableReference(self.soClass.sqlmeta.table) def orderBy(self): if self._orderBy is NoDefault: self._orderBy = self.otherClass.sqlmeta.defaultOrder return self._orderBy orderBy = property(orderBy) def _setOtherClass(self, cls): self.otherClass = cls def hasIntermediateTable(self): return False def _applyOrderBy(self, results, defaultSortClass): if self.orderBy is not None: doSort(results, self.orderBy) return results class MinType(object): """Sort less than everything, for handling None's in the results""" # functools.total_ordering would simplify this def __lt__(self, other): if self is other: return False return True def __eq__(self, other): return self is other def __gt__(self, other): return False def __le__(self, other): return True def __ge__(self, other): if self is other: return True return False Min = MinType() def doSort(results, orderBy): if isinstance(orderBy, (tuple, list)): if len(orderBy) == 1: orderBy = orderBy[0] else: # Rely on stable sort results, since this is simpler # than trying to munge everything into a single sort key doSort(results, orderBy[0]) doSort(results, orderBy[1:]) return if isinstance(orderBy, sqlbuilder.DESC) \ and isinstance(orderBy.expr, sqlbuilder.SQLObjectField): orderBy = '-' + orderBy.expr.original elif isinstance(orderBy, sqlbuilder.SQLObjectField): orderBy = orderBy.original # @@: but we don't handle more complex expressions for orderings if orderBy.startswith('-'): orderBy = orderBy[1:] reverse = True else: reverse = False def sortkey(x, attr=orderBy): a = getattr(x, attr) if a is None: return Min return a results.sort(key=sortkey, reverse=reverse) # This is a one-to-many class SOMultipleJoin(SOJoin): def __init__(self, addRemoveName=None, **kw): # addRemovePrefix is something like @@ SOJoin.__init__(self, **kw) # Here we generate the method names if not self.joinMethodName: name = self.otherClassName[0].lower() + self.otherClassName[1:] if name.endswith('s'): name = name + "es" else: name = name + "s" self.joinMethodName = name if addRemoveName: self.addRemoveName = addRemoveName else: self.addRemoveName = capword(self.otherClassName) def performJoin(self, inst): ids = inst._connection._SO_selectJoin( self.otherClass, self.joinColumn, inst.id) if inst.sqlmeta._perConnection: conn = inst._connection else: conn = None return self._applyOrderBy( [self.otherClass.get(id, conn) for (id,) in ids if id is not None], self.otherClass) def _dbNameToPythonName(self): for column in self.otherClass.sqlmeta.columns.values(): if column.dbName == self.joinColumn: return column.name return self.soClass.sqlmeta.style.dbColumnToPythonAttr(self.joinColumn) class MultipleJoin(Join): baseClass = SOMultipleJoin class SOSQLMultipleJoin(SOMultipleJoin): def performJoin(self, inst): if inst.sqlmeta._perConnection: conn = inst._connection else: conn = None pythonColumn = self._dbNameToPythonName() results = self.otherClass.select( getattr(self.otherClass.q, pythonColumn) == inst.id, connection=conn) return results.orderBy(self.orderBy) class SQLMultipleJoin(Join): baseClass = SOSQLMultipleJoin # This is a many-to-many join, with an intermediary table class SORelatedJoin(SOMultipleJoin): def __init__(self, otherColumn=None, intermediateTable=None, createRelatedTable=True, **kw): self.intermediateTable = intermediateTable self.otherColumn = otherColumn self.createRelatedTable = createRelatedTable SOMultipleJoin.__init__(self, **kw) classregistry.registry( self.soClass.sqlmeta.registry).addClassCallback( self.otherClassName, 
self._setOtherRelatedClass) def _setOtherRelatedClass(self, otherClass): if not self.intermediateTable: names = [self.soClass.sqlmeta.table, otherClass.sqlmeta.table] names.sort() self.intermediateTable = '%s_%s' % (names[0], names[1]) if not self.otherColumn: self.otherColumn = self.soClass.sqlmeta.style.tableReference( otherClass.sqlmeta.table) def hasIntermediateTable(self): return True def performJoin(self, inst): ids = inst._connection._SO_intermediateJoin( self.intermediateTable, self.otherColumn, self.joinColumn, inst.id) if inst.sqlmeta._perConnection: conn = inst._connection else: conn = None return self._applyOrderBy( [self.otherClass.get(id, conn) for (id,) in ids if id is not None], self.otherClass) def remove(self, inst, other): inst._connection._SO_intermediateDelete( self.intermediateTable, self.joinColumn, getID(inst), self.otherColumn, getID(other)) def add(self, inst, other): inst._connection._SO_intermediateInsert( self.intermediateTable, self.joinColumn, getID(inst), self.otherColumn, getID(other)) class RelatedJoin(MultipleJoin): baseClass = SORelatedJoin # helper classes to SQLRelatedJoin class OtherTableToJoin(sqlbuilder.SQLExpression): def __init__(self, otherTable, otherIdName, interTable, joinColumn): self.otherTable = otherTable self.otherIdName = otherIdName self.interTable = interTable self.joinColumn = joinColumn def tablesUsedImmediate(self): return [self.otherTable, self.interTable] def __sqlrepr__(self, db): return '%s.%s = %s.%s' % (self.otherTable, self.otherIdName, self.interTable, self.joinColumn) class JoinToTable(sqlbuilder.SQLExpression): def __init__(self, table, idName, interTable, joinColumn): self.table = table self.idName = idName self.interTable = interTable self.joinColumn = joinColumn def tablesUsedImmediate(self): return [self.table, self.interTable] def __sqlrepr__(self, db): return '%s.%s = %s.%s' % (self.interTable, self.joinColumn, self.table, self.idName) class TableToId(sqlbuilder.SQLExpression): def __init__(self, table, idName, idValue): self.table = table self.idName = idName self.idValue = idValue def tablesUsedImmediate(self): return [self.table] def __sqlrepr__(self, db): return '%s.%s = %s' % (self.table, self.idName, self.idValue) class SOSQLRelatedJoin(SORelatedJoin): def performJoin(self, inst): if inst.sqlmeta._perConnection: conn = inst._connection else: conn = None results = self.otherClass.select(sqlbuilder.AND( OtherTableToJoin( self.otherClass.sqlmeta.table, self.otherClass.sqlmeta.idName, self.intermediateTable, self.otherColumn ), JoinToTable( self.soClass.sqlmeta.table, self.soClass.sqlmeta.idName, self.intermediateTable, self.joinColumn ), TableToId(self.soClass.sqlmeta.table, self.soClass.sqlmeta.idName, inst.id), ), clauseTables=(self.soClass.sqlmeta.table, self.otherClass.sqlmeta.table, self.intermediateTable), connection=conn) return results.orderBy(self.orderBy) class SQLRelatedJoin(RelatedJoin): baseClass = SOSQLRelatedJoin class SOSingleJoin(SOMultipleJoin): def __init__(self, **kw): self.makeDefault = kw.pop('makeDefault', False) SOMultipleJoin.__init__(self, **kw) def performJoin(self, inst): if inst.sqlmeta._perConnection: conn = inst._connection else: conn = None pythonColumn = self._dbNameToPythonName() results = self.otherClass.select( getattr(self.otherClass.q, pythonColumn) == inst.id, connection=conn ) if results.count() == 0: if not self.makeDefault: return None else: kw = {self.soClass.sqlmeta.style. 
instanceIDAttrToAttr(pythonColumn): inst} # instanciating the otherClass with all return self.otherClass(**kw) else: return results[0] class SingleJoin(Join): baseClass = SOSingleJoin class SOManyToMany(object): def __init__(self, soClass, name, join, intermediateTable, joinColumn, otherColumn, createJoinTable, **attrs): self.name = name self.intermediateTable = intermediateTable self.joinColumn = joinColumn self.otherColumn = otherColumn self.createJoinTable = createJoinTable self.soClass = self.otherClass = None for name, value in attrs.items(): setattr(self, name, value) classregistry.registry( soClass.sqlmeta.registry).addClassCallback( join, self._setOtherClass) classregistry.registry( soClass.sqlmeta.registry).addClassCallback( soClass.__name__, self._setThisClass) def _setThisClass(self, soClass): self.soClass = soClass if self.soClass and self.otherClass: self._finishSet() def _setOtherClass(self, otherClass): self.otherClass = otherClass if self.soClass and self.otherClass: self._finishSet() def _finishSet(self): if self.intermediateTable is None: names = [self.soClass.sqlmeta.table, self.otherClass.sqlmeta.table] names.sort() self.intermediateTable = '%s_%s' % (names[0], names[1]) if not self.otherColumn: self.otherColumn = self.soClass.sqlmeta.style.tableReference( self.otherClass.sqlmeta.table) if not self.joinColumn: self.joinColumn = styles.getStyle( self.soClass).tableReference(self.soClass.sqlmeta.table) events.listen(self.event_CreateTableSignal, self.soClass, events.CreateTableSignal) events.listen(self.event_CreateTableSignal, self.otherClass, events.CreateTableSignal) self.clause = ( (self.otherClass.q.id == sqlbuilder.Field(self.intermediateTable, self.otherColumn)) & (sqlbuilder.Field(self.intermediateTable, self.joinColumn) == self.soClass.q.id)) def __get__(self, obj, type): if obj is None: return self query = ( (self.otherClass.q.id == sqlbuilder.Field(self.intermediateTable, self.otherColumn)) & (sqlbuilder.Field(self.intermediateTable, self.joinColumn) == obj.id)) select = self.otherClass.select(query) return _ManyToManySelectWrapper(obj, self, select) def event_CreateTableSignal(self, soClass, connection, extra_sql, post_funcs): if self.createJoinTable: post_funcs.append(self.event_CreateTableSignalPost) def event_CreateTableSignalPost(self, soClass, connection): if connection.tableExists(self.intermediateTable): return connection._SO_createJoinTable(self) class ManyToMany(boundattributes.BoundFactory): factory_class = SOManyToMany __restrict_attributes__ = ( 'join', 'intermediateTable', 'joinColumn', 'otherColumn', 'createJoinTable') __unpackargs__ = ('join',) # Default values: intermediateTable = None joinColumn = None otherColumn = None createJoinTable = True class _ManyToManySelectWrapper(object): def __init__(self, forObject, join, select): self.forObject = forObject self.join = join self.select = select def __getattr__(self, attr): # @@: This passes through private variable access too... should it? 
# Also magic methods, like __str__ return getattr(self.select, attr) def __repr__(self): return '<%s for: %s>' % (self.__class__.__name__, repr(self.select)) def __str__(self): return str(self.select) def __iter__(self): return iter(self.select) def __getitem__(self, key): return self.select[key] def add(self, obj): obj._connection._SO_intermediateInsert( self.join.intermediateTable, self.join.joinColumn, getID(self.forObject), self.join.otherColumn, getID(obj)) def remove(self, obj): obj._connection._SO_intermediateDelete( self.join.intermediateTable, self.join.joinColumn, getID(self.forObject), self.join.otherColumn, getID(obj)) def create(self, **kw): obj = self.join.otherClass(**kw) self.add(obj) return obj class SOOneToMany(object): def __init__(self, soClass, name, join, joinColumn, **attrs): self.soClass = soClass self.name = name self.joinColumn = joinColumn for name, value in attrs.items(): setattr(self, name, value) classregistry.registry( soClass.sqlmeta.registry).addClassCallback( join, self._setOtherClass) def _setOtherClass(self, otherClass): self.otherClass = otherClass if not self.joinColumn: self.joinColumn = styles.getStyle( self.soClass).tableReference(self.soClass.sqlmeta.table) self.clause = ( sqlbuilder.Field(self.otherClass.sqlmeta.table, self.joinColumn) == self.soClass.q.id) def __get__(self, obj, type): if obj is None: return self query = ( sqlbuilder.Field(self.otherClass.sqlmeta.table, self.joinColumn) == obj.id) select = self.otherClass.select(query) return _OneToManySelectWrapper(obj, self, select) class OneToMany(boundattributes.BoundFactory): factory_class = SOOneToMany __restrict_attributes__ = ( 'join', 'joinColumn') __unpackargs__ = ('join',) # Default values: joinColumn = None class _OneToManySelectWrapper(object): def __init__(self, forObject, join, select): self.forObject = forObject self.join = join self.select = select def __getattr__(self, attr): # @@: This passes through private variable access too... should it? # Also magic methods, like __str__ return getattr(self.select, attr) def __repr__(self): return '<%s for: %s>' % (self.__class__.__name__, repr(self.select)) def __str__(self): return str(self.select) def __iter__(self): return iter(self.select) def __getitem__(self, key): return self.select[key] def create(self, **kw): kw[self.join.joinColumn] = self.forObject.id return self.join.otherClass(**kw) SQLObject-3.4.0/sqlobject/conftest.py0000644000175000017500000000377012775273363017067 0ustar phdphd00000000000000"""This module is used by pytest to configure testing""" try: import pkg_resources except ImportError: pass else: pkg_resources.require('SQLObject') # Override some options (doesn't override command line): verbose = 0 exitfirst = True connectionShortcuts = { 'mysql': 'mysql://test@localhost/test', 'dbm': 'dbm:///data', 'postgres': 'postgres:///test', 'postgresql': 'postgres:///test', 'rdbhost': 'rdhbost://role:authcode@www.rdbhost.com/', 'pygresql': 'pygresql://localhost/test', 'sqlite': 'sqlite:/:memory:', 'sybase': 'sybase://test:test123@sybase/test?autoCommit=0', 'firebird': 'firebird://sysdba:masterkey@localhost/var/lib/firebird/data/test.gdb', 'mssql': 'mssql://sa:@127.0.0.1/test' } def pytest_addoption(parser): """Add the SQLObject options""" parser.addoption( '-D', '--Database', action="store", dest="Database", default='sqlite', help="The database to run the tests under (default sqlite). 
" "Can also use an alias from: %s" % (', '.join(connectionShortcuts.keys()))) parser.addoption( '-S', '--SQL', action="store_true", dest="show_sql", default=False, help="Show SQL from statements (when capturing stdout the " "SQL is only displayed when a test fails)") parser.addoption( '-O', '--SQL-output', action="store_true", dest="show_sql_output", default=False, help="Show output from SQL statements (when capturing " "stdout the output is only displayed when a test fails)") parser.addoption( '-E', '--events', action="store_true", dest="debug_events", default=False, help="Debug events (print information about events as they are " "sent)") option = None def pytest_configure(config): """Make cmdline arguments available to dbtest""" global option option = config.option def setup_tests(): if option.debug_events: from sqlobject import events events.debug_events() SQLObject-3.4.0/sqlobject/firebird/0000755000175000017500000000000013141371614016431 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/firebird/__init__.py0000644000175000017500000000032512464733165020554 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . import firebirdconnection return firebirdconnection.FirebirdConnection registerConnection(['firebird', 'interbase'], builder) SQLObject-3.4.0/sqlobject/firebird/firebirdconnection.py0000644000175000017500000004456113036405001022652 0ustar phdphd00000000000000import os import re import warnings from sqlobject import col from sqlobject.dbconnection import DBAPI class FirebirdConnection(DBAPI): supportTransactions = False dbName = 'firebird' schemes = [dbName] limit_re = re.compile('^\s*(select )(.*)', re.IGNORECASE) def __init__(self, host, db, port='3050', user='sysdba', password='masterkey', autoCommit=1, dialect=None, role=None, charset=None, **kw): drivers = kw.pop('driver', None) or 'fdb,kinterbasdb' for driver in drivers.split(','): driver = driver.strip() if not driver: continue try: if driver == 'fdb': import fdb self.module = fdb elif driver == 'kinterbasdb': import kinterbasdb # See # http://kinterbasdb.sourceforge.net/dist_docs/usage.html # for an explanation; in short: use datetime, decimal and # unicode. kinterbasdb.init(type_conv=200) self.module = kinterbasdb elif driver in ('firebirdsql', 'pyfirebirdsql'): import firebirdsql self.module = firebirdsql else: raise ValueError( 'Unknown FireBird driver "%s", ' 'expected fdb, kinterbasdb or firebirdsql' % driver) except ImportError: pass else: break else: raise ImportError( 'Cannot find an FireBird driver, tried %s' % drivers) self.host = host self.port = port self.db = db self.user = user self.password = password if dialect: self.dialect = int(dialect) else: self.dialect = None self.role = role if charset: # Encoding defined by user in the connection string. self.dbEncoding = charset.replace('-', '') else: self.dbEncoding = charset # Encoding defined during database creation and stored in the database. 
self.defaultDbEncoding = '' DBAPI.__init__(self, **kw) @classmethod def _connectionFromParams(cls, auth, password, host, port, path, args): if not password: password = 'masterkey' if not auth: auth = 'sysdba' # check for alias using if (path[0] == '/') and path[-3:].lower() not in ('fdb', 'gdb'): path = path[1:] path = path.replace('/', os.sep) return cls(host, port=port, db=path, user=auth, password=password, **args) def _runWithConnection(self, meth, *args): if not self.autoCommit: return DBAPI._runWithConnection(self, meth, args) conn = self.getConnection() # @@: Horrible auto-commit implementation. Just horrible! try: conn.begin() except self.module.ProgrammingError: pass try: val = meth(conn, *args) try: conn.commit() except self.module.ProgrammingError: pass finally: self.releaseConnection(conn) return val def _setAutoCommit(self, conn, auto): # Only _runWithConnection does "autocommit", so we don't # need to worry about that. pass def makeConnection(self): extra = {} if self.dialect: extra['dialect'] = self.dialect return self.module.connect( host=self.host, port=self.port, database=self.db, user=self.user, password=self.password, role=self.role, charset=self.dbEncoding, **extra ) def _queryInsertID(self, conn, soInstance, id, names, values): """Firebird uses 'generators' to create new ids for a table. The users needs to create a generator named GEN_ for each table this method to work.""" table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName sequenceName = soInstance.sqlmeta.idSequence or 'GEN_%s' % table c = conn.cursor() if id is None: c.execute('SELECT gen_id(%s,1) FROM rdb$database' % sequenceName) id = c.fetchone()[0] names = [idName] + names values = [id] + values q = self._insertSQL(table, names, values) if self.debug: self.printDebug(conn, q, 'QueryIns') c.execute(q) if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id @classmethod def _queryAddLimitOffset(cls, query, start, end): """Firebird slaps the limit and offset (actually 'first' and 'skip', respectively) statement right after the select.""" if not start: limit_str = "SELECT FIRST %i" % end if not end: limit_str = "SELECT SKIP %i" % start else: limit_str = "SELECT FIRST %i SKIP %i" % (end - start, start) match = cls.limit_re.match(query) if match and len(match.groups()) == 2: return ' '.join([limit_str, match.group(2)]) else: return query def createTable(self, soClass): self.query('CREATE TABLE %s (\n%s\n)' % (soClass.sqlmeta.table, self.createColumns(soClass))) self.query("CREATE GENERATOR GEN_%s" % soClass.sqlmeta.table) return [] def createReferenceConstraint(self, soClass, col): return None def createColumn(self, soClass, col): return col.firebirdCreateSQL() def createIDColumn(self, soClass): key_type = {int: "INT", str: "VARCHAR(255)"}[soClass.sqlmeta.idType] return '%s %s NOT NULL PRIMARY KEY' % (soClass.sqlmeta.idName, key_type) def createIndexSQL(self, soClass, index): return index.firebirdCreateIndexSQL(soClass) def joinSQLType(self, join): return 'INT NOT NULL' def tableExists(self, tableName): # there's something in the database by this name...let's # assume it's a table. By default, fb 1.0 stores EVERYTHING # it cares about in uppercase. 
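        # RDB$RELATIONS is Firebird's system catalogue of tables and views.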
result = self.queryOne( "SELECT COUNT(rdb$relation_name) FROM rdb$relations " "WHERE rdb$relation_name = '%s'" % tableName.upper()) return result[0] def addColumn(self, tableName, column): self.query('ALTER TABLE %s ADD %s' % (tableName, column.firebirdCreateSQL())) def dropTable(self, tableName, cascade=False): self.query("DROP TABLE %s" % tableName) self.query("DROP GENERATOR GEN_%s" % tableName) def delColumn(self, sqlmeta, column): self.query('ALTER TABLE %s DROP %s' % (sqlmeta.table, column.dbName)) def readDefaultEncodingFromDB(self): # Get out if encoding is known allready (can by None as well). if self.defaultDbEncoding == "": self.defaultDbEncoding = str(self.queryOne( "SELECT rdb$character_set_name FROM rdb$database")[0]. strip().lower()) # encoding defined during db creation if self.defaultDbEncoding == "none": self.defaultDbEncoding = None if self.dbEncoding != self.defaultDbEncoding: warningText = """\n Database charset: %s is different """ \ """from connection charset: %s.\n""" % ( self.defaultDbEncoding, self.dbEncoding) warnings.warn(warningText) # TODO: ??? print out the uri string, # so user can see what is going on warningText = """\n Every CHAR or VARCHAR field can (or, better: must) """ \ """have a character set defined in Firebird. In the case, field charset is not defined, """ \ """SQLObject try to use a db default encoding instead. Firebird is unable to transliterate between character sets. So you must set the correct values on the server """ \ "and on the client if everything is to work smoothely.\n" warnings.warn(warningText) if not self.dbEncoding: # defined by user in the connection string self.dbEncoding = self.defaultDbEncoding warningText = """\n encoding: %s will be used as default """ \ """for this connection\n""" % self.dbEncoding warnings.warn(warningText) def columnsFromSchema(self, tableName, soClass): """ Look at the given table and create Col instances (or subclasses of Col) for the fields it finds in that table. 
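
        Illustrative sketch only (the URI and the ``Person`` class are
        made up)::

            from sqlobject import connectionForURI
            conn = connectionForURI('firebird://user:pass@localhost/tmp/test.fdb')
            columns = conn.columnsFromSchema('person', Person)
            # -> a list of Col instances (IntCol, UnicodeCol, DecimalCol, ...)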
""" self.readDefaultEncodingFromDB() fieldQuery = """\ SELECT r.RDB$FIELD_NAME AS field_name, CASE f.RDB$FIELD_TYPE when 7 then 'smallint' when 8 then 'integer' when 16 then 'int64' when 9 then 'quad' when 10 then 'float' when 11 then 'd_float' when 17 then 'boolean' when 27 then 'double' when 12 then 'date' when 13 then 'time' when 35 then 'timestamp' when 261 then 'blob' when 37 then 'varchar' when 14 then 'char' when 40 then 'cstring' when 45 then 'blob_id' ELSE 'UNKNOWN' END AS field_type, case f.rdb$field_type when 7 then case f.rdb$field_sub_type when 1 then 'numeric' when 2 then 'decimal' end when 8 then case f.rdb$field_sub_type when 1 then 'numeric' when 2 then 'decimal' end when 16 then case f.rdb$field_sub_type when 1 then 'numeric' when 2 then 'decimal' else 'bigint' end when 14 then case f.rdb$field_sub_type when 0 then 'unspecified' when 1 then 'binary' when 3 then 'acl' else case when f.rdb$field_sub_type is null then 'unspecified' end end when 37 then case f.rdb$field_sub_type when 0 then 'unspecified' when 1 then 'text' when 3 then 'acl' else case when f.rdb$field_sub_type is null then 'unspecified' end end when 261 then case f.rdb$field_sub_type when 0 then 'unspecified' when 1 then 'text' when 2 then 'blr' when 3 then 'acl' when 4 then 'reserved' when 5 then 'encoded-meta-data' when 6 then 'irregular-finished-multi-db-tx' when 7 then 'transactional_description' when 8 then 'external_file_description' end end as "ActualSubType", f.RDB$FIELD_LENGTH AS field_length, f.RDB$FIELD_PRECISION AS field_precision, f.RDB$FIELD_SCALE AS field_scale, cset.RDB$CHARACTER_SET_NAME AS field_charset, coll.RDB$COLLATION_NAME AS field_collation, r.rdb$default_source, r.RDB$NULL_FLAG AS field_not_null_constraint, r.RDB$DESCRIPTION AS field_description FROM RDB$RELATION_FIELDS r LEFT JOIN RDB$FIELDS f ON r.RDB$FIELD_SOURCE = f.RDB$FIELD_NAME LEFT JOIN RDB$COLLATIONS coll ON f.RDB$COLLATION_ID = coll.RDB$COLLATION_ID LEFT JOIN RDB$CHARACTER_SETS cset ON f.RDB$CHARACTER_SET_ID = cset.RDB$CHARACTER_SET_ID WHERE r.RDB$RELATION_NAME='%s' -- table name ORDER BY r.RDB$FIELD_POSITION """ colData = self.queryAll(fieldQuery % tableName.upper()) results = [] for (field, fieldType, fieldSubtype, fieldLength, fieldPrecision, fieldScale, fieldCharset, collationName, defaultSource, fieldNotNullConstraint, fieldDescription) in colData: field = field.strip().lower() fieldType = fieldType.strip() if fieldCharset: fieldCharset = str(fieldCharset.strip()) # 'UNICODE_FSS' is less strict # Firebird/Interbase UTF8 definition if fieldCharset.startswith('UNICODE_FSS'): fieldCharset = "UTF8" if fieldSubtype: fieldSubtype = fieldSubtype.strip() if fieldType == "int64": fieldType = fieldSubtype # can look like: "DEFAULT 0", "DEFAULT 'default text'", None if defaultSource: defaultSource = defaultSource.split(' ')[1] if defaultSource.startswith("'") and \ defaultSource.endswith("'"): defaultSource = str(defaultSource[1:-1]) elif fieldType in ("integer", "smallint", "bigint"): defaultSource = int(defaultSource) elif fieldType in ("float", "double"): defaultSource = float(defaultSource) # TODO: other types for defaultSource # elif fieldType == "datetime": idName = str(soClass.sqlmeta.idName or 'id').upper() if field.upper() == idName: continue if fieldScale: # PRECISION refers to the total number of digits, # and SCALE refers to the number of digits # to the right of the decimal point. # Both numbers can be from 1 to 18 (SQL dialect 1: 1-15), # but SCALE mustbe less than or equal to PRECISION. 
if fieldScale > fieldLength: fieldScale = fieldLength colClass, kw = self.guessClass(fieldType, fieldLength, fieldCharset, fieldScale) kw['name'] = str( soClass.sqlmeta.style.dbColumnToPythonAttr(field).strip()) kw['dbName'] = str(field) kw['notNone'] = not fieldNotNullConstraint kw['default'] = defaultSource results.append(colClass(**kw)) return results def guessClass(self, t, flength, fCharset, fscale=None): """ An internal method that tries to figure out what Col subclass is appropriate given whatever introspective information is available -- both very database-specific. """ # TODO: check if negative values are allowed for fscale if t == 'smallint': # -32,768 to +32,767, 16 bits return col.IntCol, {} elif t == 'integer': # -2,147,483,648 to +2,147,483,647, 32 bits return col.IntCol, {} elif t == 'bigint': # -2^63 to 2^63-1 or # -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807, 64 bits return col.IntCol, {} elif t == 'float': # 32 bits, 3.4x10^-38 to 3.4x10^38, 7 digit precision # (7 significant decimals) return col.FloatCol, {} elif t == 'double': # 64 bits, 1.7x10^-308 to 1.7x10^308, 15 digit precision # (15 significant decimals) return col.FloatCol, {} elif t == 'numeric': # Numeric and Decimal are internally stored as smallint, # integer or bigint depending on the size. # They can handle up to 18 digits. if (not flength or not fscale): # If neither PRECISION nor SCALE are specified, # Firebird/InterBase defines the column as INTEGER # instead of NUMERIC and stores only the integer portion # of the value return col.IntCol, {} # check if negative values are allowed for fscale return col.DecimalCol, {'size': flength, 'precision': fscale} elif t == 'decimal': # Check if negative values are allowed for fscale return col.DecimalCol, {'size': flength, 'precision': fscale} elif t == 'date': # 32 bits, 1 Jan 100. to 29 Feb 32768. return col.DateCol, {} elif t == 'time': # 32 bits, 00:00 to 23:59.9999 return col.TimeCol, {} elif t == 'timestamp': # 64 bits, 1 Jan 100 to 28 Feb 32768. return col.DateTimeCol, {} elif t == 'char': # 32767 bytes if fCharset and (fCharset != "NONE"): return col.UnicodeCol, {'length': flength, 'varchar': False, 'dbEncoding': fCharset} elif self.dbEncoding: return col.UnicodeCol, {'length': flength, 'varchar': False, 'dbEncoding': self.dbEncoding} else: return col.StringCol, {'length': flength, 'varchar': False} elif t == 'varchar': # 32767 bytes if fCharset and (fCharset != "NONE"): return col.UnicodeCol, {'length': flength, 'varchar': True, 'dbEncoding': fCharset} elif self.dbEncoding: return col.UnicodeCol, {'length': flength, 'varchar': True, 'dbEncoding': self.dbEncoding} else: return col.StringCol, {'length': flength, 'varchar': True} elif t == 'blob': # 32GB return col.BLOBCol, {} else: return col.Col, {} def createEmptyDatabase(self): self.module.create_database( "CREATE DATABASE '%s' user '%s' password '%s'" % (self.db, self.user, self.password)) def dropDatabase(self): self.module.drop_database() SQLObject-3.4.0/sqlobject/sybase/0000755000175000017500000000000013141371614016131 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/sybase/__init__.py0000644000175000017500000000030012464733165020245 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . 
import sybaseconnection return sybaseconnection.SybaseConnection registerConnection(['sybase'], builder) SQLObject-3.4.0/sqlobject/sybase/sybaseconnection.py0000644000175000017500000001361413037500317022054 0ustar phdphd00000000000000from sqlobject.dbconnection import DBAPI from sqlobject import col class SybaseConnection(DBAPI): supportTransactions = False dbName = 'sybase' schemes = [dbName] NumericType = None def __init__(self, db, user, password='', host='localhost', port=None, locking=1, **kw): db = db.strip('/') import Sybase Sybase._ctx.debug = 0 if SybaseConnection.NumericType is None: from Sybase import NumericType SybaseConnection.NumericType = NumericType from sqlobject.converters import registerConverter, IntConverter registerConverter(NumericType, IntConverter) self.module = Sybase self.locking = int(locking) self.host = host self.port = port self.db = db self.user = user self.password = password autoCommit = kw.get('autoCommit') if autoCommit: autoCommit = int(autoCommit) else: autoCommit = None kw['autoCommit'] = autoCommit DBAPI.__init__(self, **kw) @classmethod def _connectionFromParams(cls, user, password, host, port, path, args): return cls(user=user, password=password, host=host or 'localhost', port=port, db=path, **args) def insert_id(self, conn): """ Sybase adapter/cursor does not support the insert_id method. """ c = conn.cursor() c.execute('SELECT @@IDENTITY') return c.fetchone()[0] def makeConnection(self): return self.module.connect(self.host, self.user, self.password, database=self.db, auto_commit=self.autoCommit, locking=self.locking) HAS_IDENTITY = """ SELECT col.name, col.status, obj.name FROM syscolumns col JOIN sysobjects obj ON obj.id = col.id WHERE obj.name = '%s' AND (col.status & 0x80) = 0x80 """ def _hasIdentity(self, conn, table): query = self.HAS_IDENTITY % table c = conn.cursor() c.execute(query) r = c.fetchone() return r is not None def _queryInsertID(self, conn, soInstance, id, names, values): table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName c = conn.cursor() if id is not None: names = [idName] + names values = [id] + values has_identity = self._hasIdentity(conn, table) identity_insert_on = False if has_identity and (id is not None): identity_insert_on = True c.execute('SET IDENTITY_INSERT %s ON' % table) if names and values: q = self._insertSQL(table, names, values) else: q = "INSERT INTO %s DEFAULT VALUES" % table if self.debug: self.printDebug(conn, q, 'QueryIns') c.execute(q) if has_identity and identity_insert_on: c.execute('SET IDENTITY_INSERT %s OFF' % table) if id is None: id = self.insert_id(conn) if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id @classmethod def _queryAddLimitOffset(cls, query, start, end): # XXX Sybase doesn't support OFFSET if end: return "SET ROWCOUNT %i %s SET ROWCOUNT 0" % (end, query) return query def createReferenceConstraint(self, soClass, col): return None def createColumn(self, soClass, col): return col.sybaseCreateSQL() def createIDColumn(self, soClass): key_type = {int: "NUMERIC(18,0)", str: "TEXT"}[soClass.sqlmeta.idType] return '%s %s IDENTITY UNIQUE' % (soClass.sqlmeta.idName, key_type) def createIndexSQL(self, soClass, index): return index.sybaseCreateIndexSQL(soClass) def joinSQLType(self, join): return 'NUMERIC(18,0) NOT NULL' SHOW_TABLES = "SELECT name FROM sysobjects WHERE type='U'" def tableExists(self, tableName): for (table,) in self.queryAll(self.SHOW_TABLES): if table.lower() == tableName.lower(): return True return False def addColumn(self, 
tableName, column): self.query('ALTER TABLE %s ADD COLUMN %s' % (tableName, column.sybaseCreateSQL())) def delColumn(self, sqlmeta, column): self.query( 'ALTER TABLE %s DROP COLUMN %s' % (sqlmeta.table, column.dbName)) SHOW_COLUMNS = ('SELECT ' 'COLUMN_NAME, DATA_TYPE, IS_NULLABLE, COLUMN_DEFAULT ' 'FROM INFORMATION_SCHEMA.COLUMNS ' 'WHERE TABLE_NAME = \'%s\'') def columnsFromSchema(self, tableName, soClass): colData = self.queryAll(self.SHOW_COLUMNS % tableName) results = [] for field, t, nullAllowed, default in colData: if field == soClass.sqlmeta.idName: continue colClass, kw = self.guessClass(t) kw['name'] = soClass.sqlmeta.style.dbColumnToPythonAttr(field) kw['dbName'] = field kw['notNone'] = not nullAllowed kw['default'] = default # @@ skip key... # @@ skip extra... kw['forceDBName'] = True results.append(colClass(**kw)) return results def _setAutoCommit(self, conn, auto): conn.auto_commit = auto def guessClass(self, t): if t.startswith('int'): return col.IntCol, {} elif t.startswith('varchar'): return col.StringCol, {'length': int(t[8:-1])} elif t.startswith('char'): return col.StringCol, {'length': int(t[5:-1]), 'varchar': False} elif t.startswith('datetime'): return col.DateTimeCol, {} else: return col.Col, {} SQLObject-3.4.0/sqlobject/styles.py0000644000175000017500000001067012463427161016551 0ustar phdphd00000000000000import re __all__ = ["Style", "MixedCaseUnderscoreStyle", "DefaultStyle", "MixedCaseStyle"] class Style(object): """ The base Style class, and also the simplest implementation. No translation occurs -- column names and attribute names match, as do class names and table names (when using auto class or schema generation). """ def __init__(self, pythonAttrToDBColumn=None, dbColumnToPythonAttr=None, pythonClassToDBTable=None, dbTableToPythonClass=None, idForTable=None, longID=False): if pythonAttrToDBColumn: self.pythonAttrToDBColumn = \ lambda a, s=self: pythonAttrToDBColumn(s, a) if dbColumnToPythonAttr: self.dbColumnToPythonAttr = \ lambda a, s=self: dbColumnToPythonAttr(s, a) if pythonClassToDBTable: self.pythonClassToDBTable = \ lambda a, s=self: pythonClassToDBTable(s, a) if dbTableToPythonClass: self.dbTableToPythonClass = \ lambda a, s=self: dbTableToPythonClass(s, a) if idForTable: self.idForTable = lambda a, s=self: idForTable(s, a) self.longID = longID def pythonAttrToDBColumn(self, attr): return attr def dbColumnToPythonAttr(self, col): return col def pythonClassToDBTable(self, className): return className def dbTableToPythonClass(self, table): return table def idForTable(self, table): if self.longID: return self.tableReference(table) else: return 'id' def pythonClassToAttr(self, className): return lowerword(className) def instanceAttrToIDAttr(self, attr): return attr + "ID" def instanceIDAttrToAttr(self, attr): return attr[:-2] def tableReference(self, table): return table + "_id" class MixedCaseUnderscoreStyle(Style): """ This is the default style. Python attributes use mixedCase, while database columns use underscore_separated. 
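
    For example (illustrative values)::

        >>> style = MixedCaseUnderscoreStyle()
        >>> style.pythonAttrToDBColumn('personName')
        'person_name'
        >>> style.dbColumnToPythonAttr('person_name')
        'personName'
        >>> style.pythonAttrToDBColumn('addressID')
        'address_id'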
""" def pythonAttrToDBColumn(self, attr): return mixedToUnder(attr) def dbColumnToPythonAttr(self, col): return underToMixed(col) def pythonClassToDBTable(self, className): return className[0].lower() \ + mixedToUnder(className[1:]) def dbTableToPythonClass(self, table): return table[0].upper() \ + underToMixed(table[1:]) def pythonClassToDBTableReference(self, className): return self.tableReference(self.pythonClassToDBTable(className)) def tableReference(self, table): return table + "_id" DefaultStyle = MixedCaseUnderscoreStyle class MixedCaseStyle(Style): """ This style leaves columns as mixed-case, and uses long ID names (like ProductID instead of simply id). """ def pythonAttrToDBColumn(self, attr): return capword(attr) def dbColumnToPythonAttr(self, col): return lowerword(col) def dbTableToPythonClass(self, table): return capword(table) def tableReference(self, table): return table + "ID" defaultStyle = DefaultStyle() def getStyle(soClass, dbConnection=None): if dbConnection is None: if hasattr(soClass, '_connection'): dbConnection = soClass._connection if hasattr(soClass.sqlmeta, 'style') and soClass.sqlmeta.style: return soClass.sqlmeta.style elif dbConnection and dbConnection.style: return dbConnection.style else: return defaultStyle ############################################################ # Text utilities ############################################################ _mixedToUnderRE = re.compile(r'[A-Z]+') def mixedToUnder(s): if s.endswith('ID'): return mixedToUnder(s[:-2] + "_id") trans = _mixedToUnderRE.sub(mixedToUnderSub, s) if trans.startswith('_'): trans = trans[1:] return trans def mixedToUnderSub(match): m = match.group(0).lower() if len(m) > 1: return '_%s_%s' % (m[:-1], m[-1]) else: return '_%s' % m def capword(s): return s[0].upper() + s[1:] def lowerword(s): return s[0].lower() + s[1:] _underToMixedRE = re.compile('_.') def underToMixed(name): if name.endswith('_id'): return underToMixed(name[:-3] + "ID") return _underToMixedRE.sub(lambda m: m.group(0)[1].upper(), name) SQLObject-3.4.0/sqlobject/sqlite/0000755000175000017500000000000013141371614016144 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/sqlite/__init__.py0000644000175000017500000000030012464733165020260 0ustar phdphd00000000000000from sqlobject.dbconnection import registerConnection def builder(): from . 
import sqliteconnection return sqliteconnection.SQLiteConnection registerConnection(['sqlite'], builder) SQLObject-3.4.0/sqlobject/sqlite/sqliteconnection.py0000644000175000017500000003776513113104506022112 0ustar phdphd00000000000000import base64 import os try: from _thread import get_ident except ImportError: from thread import get_ident try: from urllib import quote except ImportError: from urllib.parse import quote from sqlobject import col from sqlobject import dberrors from sqlobject.dbconnection import DBAPI, Boolean sqlite2_Binary = None class ErrorMessage(str): def __new__(cls, e): obj = str.__new__(cls, e.args[0]) obj.code = None obj.module = e.__module__ obj.exception = e.__class__.__name__ return obj class SQLiteConnection(DBAPI): supportTransactions = True dbName = 'sqlite' schemes = [dbName] def __init__(self, filename, autoCommit=1, **kw): drivers = kw.pop('driver', None) or 'pysqlite2,sqlite3,sqlite' for driver in drivers.split(','): driver = driver.strip() if not driver: continue try: if driver in ('sqlite2', 'pysqlite2'): from pysqlite2 import dbapi2 as sqlite self.using_sqlite2 = True elif driver == 'sqlite3': import sqlite3 as sqlite self.using_sqlite2 = True elif driver in ('sqlite', 'sqlite1'): import sqlite self.using_sqlite2 = False else: raise ValueError( 'Unknown SQLite driver "%s", ' 'expected pysqlite2, sqlite3 or sqlite' % driver) except ImportError: pass else: break else: raise ImportError( 'Cannot find an SQLite driver, tried %s' % drivers) if self.using_sqlite2: sqlite.encode = base64.b64encode sqlite.decode = base64.b64decode self.module = sqlite self.filename = filename # full path to sqlite-db-file self._memory = filename == ':memory:' if self._memory and not self.using_sqlite2: raise ValueError("You must use sqlite2 to use in-memory databases") # connection options opts = {} if self.using_sqlite2: if autoCommit: opts["isolation_level"] = None global sqlite2_Binary if sqlite2_Binary is None: sqlite2_Binary = sqlite.Binary sqlite.Binary = lambda s: sqlite2_Binary(sqlite.encode(s)) if 'factory' in kw: factory = kw.pop('factory') if isinstance(factory, str): factory = globals()[factory] opts['factory'] = factory(sqlite) else: opts['autocommit'] = Boolean(autoCommit) if 'encoding' in kw: opts['encoding'] = kw.pop('encoding') if 'mode' in kw: opts['mode'] = int(kw.pop('mode'), 0) if 'timeout' in kw: if self.using_sqlite2: opts['timeout'] = float(kw.pop('timeout')) else: opts['timeout'] = int(float(kw.pop('timeout')) * 1000) if 'check_same_thread' in kw: opts["check_same_thread"] = Boolean(kw.pop('check_same_thread')) # use only one connection for sqlite - supports multiple) # cursors per connection self._connOptions = opts self.use_table_info = Boolean(kw.pop("use_table_info", True)) DBAPI.__init__(self, **kw) self._threadPool = {} self._threadOrigination = {} if self._memory: self.makeMemoryConnection() @classmethod def _connectionFromParams(cls, user, password, host, port, path, args): assert host is None and port is None, ( "SQLite can only be used locally (with a URI like " "sqlite:/file or sqlite:///file, not sqlite://%s%s)" % (host, port and ':%r' % port or '')) assert user is None and password is None, ( "You may not provide usernames or passwords for SQLite " "databases") if path == "/:memory:": path = ":memory:" return cls(filename=path, **args) def oldUri(self): path = self.filename if path == ":memory:": path = "/:memory:" else: path = "//" + path return 'sqlite:%s' % path def uri(self): path = self.filename if path == ":memory:": path = "/:memory:" 
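        # the in-memory filename gets a pseudo-path so the generated URI
        # ('sqlite:/:memory:') can be parsed back by _connectionFromParams()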
else: if path.startswith('/'): path = "//" + path else: path = "///" + path path = quote(path) return 'sqlite:%s' % path def getConnection(self): # SQLite can't share connections between threads, and so can't # pool connections. Since we are isolating threads here, we # don't have to worry about locking as much. if self._memory: conn = self.makeConnection() self._connectionNumbers[id(conn)] = self._connectionCount self._connectionCount += 1 return conn threadid = get_ident() if (self._pool is not None and threadid in self._threadPool): conn = self._threadPool[threadid] del self._threadPool[threadid] if conn in self._pool: self._pool.remove(conn) else: conn = self.makeConnection() if self._pool is not None: self._threadOrigination[id(conn)] = threadid self._connectionNumbers[id(conn)] = self._connectionCount self._connectionCount += 1 if self.debug: s = 'ACQUIRE' if self._pool is not None: s += ' pool=[%s]' % ', '.join( [str(self._connectionNumbers[id(v)]) for v in self._pool]) self.printDebug(conn, s, 'Pool') return conn def releaseConnection(self, conn, explicit=False): if self._memory: return threadid = self._threadOrigination.get(id(conn)) DBAPI.releaseConnection(self, conn, explicit=explicit) if (self._pool is not None and threadid and threadid not in self._threadPool): self._threadPool[threadid] = conn else: if self._pool and conn in self._pool: self._pool.remove(conn) conn.close() def _setAutoCommit(self, conn, auto): if self.using_sqlite2: if auto: conn.isolation_level = None else: conn.isolation_level = "" else: conn.autocommit = auto def _setIsolationLevel(self, conn, level): if not self.using_sqlite2: return conn.isolation_level = level def makeMemoryConnection(self): self._memoryConn = self.module.connect( self.filename, **self._connOptions) # Convert text data from SQLite to str, not unicode - # SQLObject converts it to unicode itself. 
self._memoryConn.text_factory = str def makeConnection(self): if self._memory: return self._memoryConn conn = self.module.connect(self.filename, **self._connOptions) conn.text_factory = str # Convert text data to str, not unicode return conn def close(self): DBAPI.close(self) self._threadPool = {} if self._memory: self._memoryConn.close() self.makeMemoryConnection() def _executeRetry(self, conn, cursor, query): if self.debug: self.printDebug(conn, query, 'QueryR') try: return cursor.execute(query) except self.module.OperationalError as e: raise dberrors.OperationalError(ErrorMessage(e)) except self.module.IntegrityError as e: msg = ErrorMessage(e) if msg.startswith('column') and msg.endswith('not unique') \ or msg.startswith('UNIQUE constraint failed:'): raise dberrors.DuplicateEntryError(msg) else: raise dberrors.IntegrityError(msg) except self.module.InternalError as e: raise dberrors.InternalError(ErrorMessage(e)) except self.module.ProgrammingError as e: raise dberrors.ProgrammingError(ErrorMessage(e)) except self.module.DataError as e: raise dberrors.DataError(ErrorMessage(e)) except self.module.NotSupportedError as e: raise dberrors.NotSupportedError(ErrorMessage(e)) except self.module.DatabaseError as e: raise dberrors.DatabaseError(ErrorMessage(e)) except self.module.InterfaceError as e: raise dberrors.InterfaceError(ErrorMessage(e)) except self.module.Warning as e: raise Warning(ErrorMessage(e)) except self.module.Error as e: raise dberrors.Error(ErrorMessage(e)) def _queryInsertID(self, conn, soInstance, id, names, values): table = soInstance.sqlmeta.table idName = soInstance.sqlmeta.idName c = conn.cursor() if id is not None: names = [idName] + names values = [id] + values q = self._insertSQL(table, names, values) if self.debug: self.printDebug(conn, q, 'QueryIns') self._executeRetry(conn, c, q) # lastrowid is a DB-API extension from "PEP 0249": if id is None: id = int(c.lastrowid) if self.debugOutput: self.printDebug(conn, id, 'QueryIns', 'result') return id def _insertSQL(self, table, names, values): if not names: assert not values # INSERT INTO table () VALUES () isn't allowed in # SQLite (though it is in other databases) return ("INSERT INTO %s VALUES (NULL)" % table) else: return DBAPI._insertSQL(self, table, names, values) @classmethod def _queryAddLimitOffset(cls, query, start, end): if not start: return "%s LIMIT %i" % (query, end) if not end: return "%s LIMIT 0 OFFSET %i" % (query, start) return "%s LIMIT %i OFFSET %i" % (query, end - start, start) def createColumn(self, soClass, col): return col.sqliteCreateSQL() def createReferenceConstraint(self, soClass, col): return None def createIDColumn(self, soClass): return self._createIDColumn(soClass.sqlmeta) def _createIDColumn(self, sqlmeta): if sqlmeta.idType == str: return '%s TEXT PRIMARY KEY' % sqlmeta.idName return '%s INTEGER PRIMARY KEY AUTOINCREMENT' % sqlmeta.idName def joinSQLType(self, join): return 'INT NOT NULL' def tableExists(self, tableName): result = self.queryOne( "SELECT tbl_name FROM sqlite_master " "WHERE type='table' AND tbl_name = '%s'" % tableName) # turn it into a boolean: return not not result def createIndexSQL(self, soClass, index): return index.sqliteCreateIndexSQL(soClass) def addColumn(self, tableName, column): self.query('ALTER TABLE %s ADD COLUMN %s' % (tableName, column.sqliteCreateSQL())) self.query('VACUUM') def delColumn(self, sqlmeta, column): self.recreateTableWithoutColumn(sqlmeta, column) def recreateTableWithoutColumn(self, sqlmeta, column): new_name = sqlmeta.table + '_ORIGINAL' 
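        # SQLite's ALTER TABLE cannot drop a column (DROP COLUMN only arrived
        # in SQLite 3.35), so the table is rebuilt instead: rename it, create
        # a new table without the column, copy the remaining data across and
        # drop the renamed original.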
self.query('ALTER TABLE %s RENAME TO %s' % (sqlmeta.table, new_name)) cols = [self._createIDColumn(sqlmeta)] + \ [self.createColumn(None, col) for col in sqlmeta.columnList if col.name != column.name] cols = ",\n".join([" %s" % c for c in cols]) self.query('CREATE TABLE %s (\n%s\n)' % (sqlmeta.table, cols)) all_columns = ', '.join( [sqlmeta.idName] + [col.dbName for col in sqlmeta.columnList]) self.query('INSERT INTO %s (%s) SELECT %s FROM %s' % ( sqlmeta.table, all_columns, all_columns, new_name)) self.query('DROP TABLE %s' % new_name) def columnsFromSchema(self, tableName, soClass): if self.use_table_info: return self._columnsFromSchemaTableInfo(tableName, soClass) else: return self._columnsFromSchemaParse(tableName, soClass) def _columnsFromSchemaTableInfo(self, tableName, soClass): colData = self.queryAll("PRAGMA table_info(%s)" % tableName) results = [] for index, field, t, nullAllowed, default, key in colData: if field == soClass.sqlmeta.idName: continue colClass, kw = self.guessClass(t) if default == 'NULL': nullAllowed = True default = None kw['name'] = soClass.sqlmeta.style.dbColumnToPythonAttr(field) kw['dbName'] = field kw['notNone'] = not nullAllowed kw['default'] = default # @@ skip key... # @@ skip extra... results.append(colClass(**kw)) return results def _columnsFromSchemaParse(self, tableName, soClass): colData = self.queryOne( "SELECT sql FROM sqlite_master WHERE type='table' AND name='%s'" % tableName) if not colData: raise ValueError( 'The table %s was not found in the database. Load failed.' % tableName) colData = colData[0].split('(', 1)[1].strip()[:-2] while True: start = colData.find('(') if start == -1: break end = colData.find(')', start) if end == -1: break colData = colData[:start] + colData[end + 1:] results = [] for colDesc in colData.split(','): parts = colDesc.strip().split(' ', 2) field = parts[0].strip() # skip comments if field.startswith('--'): continue # get rid of enclosing quotes if field[0] == field[-1] == '"': field = field[1:-1] if field == getattr(soClass.sqlmeta, 'idName', 'id'): continue colClass, kw = self.guessClass(parts[1].strip()) if len(parts) == 2: index_info = '' else: index_info = parts[2].strip().upper() kw['name'] = soClass.sqlmeta.style.dbColumnToPythonAttr(field) kw['dbName'] = field import re nullble = re.search(r'(\b\S*)\sNULL', index_info) default = re.search( r"DEFAULT\s((?:\d[\dA-FX.]*)|(?:'[^']*')|(?:#[^#]*#))", index_info) kw['notNone'] = nullble and nullble.group(1) == 'NOT' kw['default'] = default and default.group(1) # @@ skip key... # @@ skip extra... 
results.append(colClass(**kw)) return results def guessClass(self, t): t = t.upper() if t.find('INT') >= 0: return col.IntCol, {} elif t.find('TEXT') >= 0 or t.find('CHAR') >= 0 or t.find('CLOB') >= 0: return col.StringCol, {'length': 2 ** 32 - 1} elif t.find('BLOB') >= 0: return col.BLOBCol, {"length": 2 ** 32 - 1} elif t.find('REAL') >= 0 or t.find('FLOAT') >= 0: return col.FloatCol, {} elif t.find('DECIMAL') >= 0: return col.DecimalCol, {'size': None, 'precision': None} elif t.find('BOOL') >= 0: return col.BoolCol, {} else: return col.Col, {} def listTables(self): return [v[0] for v in self.queryAll( "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")] def listDatabases(self): # The pragma returns a list of (index, name, filename) return [v[1] for v in self.queryAll("PRAGMA database_list")] def createEmptyDatabase(self): if self._memory: return open(self.filename, 'w').close() def dropDatabase(self): if self._memory: return os.unlink(self.filename) SQLObject-3.4.0/sqlobject/constraints.py0000644000175000017500000000367112503207731017572 0ustar phdphd00000000000000""" Constraints """ from sqlobject.compat import PY2 if not PY2: # alias for python 3 compatability long = int class BadValue(ValueError): def __init__(self, desc, obj, col, value, *args): self.desc = desc self.col = col # I want these objects to be garbage-collectable, so # I just keep their repr: self.obj = repr(obj) self.value = repr(value) fullDesc = "%s.%s %s (you gave: %s)" \ % (obj, col.name, desc, value) ValueError.__init__(self, fullDesc, *args) def isString(obj, col, value): if not isinstance(value, str): raise BadValue("only allows strings", obj, col, value) def notNull(obj, col, value): if value is None: raise BadValue("is defined NOT NULL", obj, col, value) def isInt(obj, col, value): if not isinstance(value, (int, long)): raise BadValue("only allows integers", obj, col, value) def isFloat(obj, col, value): if not isinstance(value, (int, long, float)): raise BadValue("only allows floating point numbers", obj, col, value) def isBool(obj, col, value): if not isinstance(value, bool): raise BadValue("only allows booleans", obj, col, value) class InList: def __init__(self, l): self.list = l def __call__(self, obj, col, value): if value not in self.list: raise BadValue("accepts only values in %s" % repr(self.list), obj, col, value) class MaxLength: def __init__(self, length): self.length = length def __call__(self, obj, col, value): try: length = len(value) except TypeError: raise BadValue("object does not have a length", obj, col, value) if length > self.length: raise BadValue("must be shorter in length than %s" % self.length, obj, col, value) SQLObject-3.4.0/sqlobject/dberrors.py0000644000175000017500000000135212747416061017047 0ustar phdphd00000000000000"""dberrors: database exception classes for SQLObject. 
These classes are dictated by the DB API v2.0, see: https://wiki.python.org/moin/DatabaseProgramming """ from sqlobject.compat import PY2 if not PY2: StandardError = Exception class Error(StandardError): pass class Warning(StandardError): pass class InterfaceError(Error): pass class DatabaseError(Error): pass class InternalError(DatabaseError): pass class OperationalError(DatabaseError): pass class ProgrammingError(DatabaseError): pass class IntegrityError(DatabaseError): pass class DataError(DatabaseError): pass class NotSupportedError(DatabaseError): pass class DuplicateEntryError(IntegrityError): pass SQLObject-3.4.0/sqlobject/events.py0000644000175000017500000002772512471401471016536 0ustar phdphd00000000000000from __future__ import print_function import sys import types from pydispatch import dispatcher from weakref import ref from .compat import class_types subclassClones = {} def listen(receiver, soClass, signal, alsoSubclasses=True, weak=True): """ Listen for the given ``signal`` on the SQLObject subclass ``soClass``, calling ``receiver()`` when ``send(soClass, signal, ...)`` is called. If ``alsoSubclasses`` is true, receiver will also be called when an event is fired on any subclass. """ dispatcher.connect(receiver, signal=signal, sender=soClass, weak=weak) weakReceiver = ref(receiver) subclassClones.setdefault(soClass, []).append((weakReceiver, signal)) # We export this function: send = dispatcher.send class Signal(object): """ Base event for all SQLObject events. In general the sender for these methods is the class, not the instance. """ class ClassCreateSignal(Signal): """ Signal raised after class creation. The sender is the superclass (in case of multiple superclasses, the first superclass). The arguments are ``(new_class_name, bases, new_attrs, post_funcs, early_funcs)``. ``new_attrs`` is a dictionary and may be modified (but ``new_class_name`` and ``bases`` are immutable). ``post_funcs`` is an initially-empty list that can have callbacks appended to it. Note: at the time this event is called, the new class has not yet been created. The functions in ``post_funcs`` will be called after the class is created, with the single arguments of ``(new_class)``. Also, ``early_funcs`` will be called at the soonest possible time after class creation (``post_funcs`` is called after the class's ``__classinit__``). """ def _makeSubclassConnections(new_class_name, bases, new_attrs, post_funcs, early_funcs): early_funcs.insert(0, _makeSubclassConnectionsPost) def _makeSubclassConnectionsPost(new_class): for cls in new_class.__bases__: for weakReceiver, signal in subclassClones.get(cls, []): receiver = weakReceiver() if not receiver: continue listen(receiver, new_class, signal) dispatcher.connect(_makeSubclassConnections, signal=ClassCreateSignal) # @@: Should there be a class reload event? This would allow modules # to be reloaded, possibly. Or it could even be folded into # ClassCreateSignal, since anything that listens to that needs to pay # attention to reloads (or else it is probably buggy). class RowCreateSignal(Signal): """ Called before an instance is created, with the class as the sender. Called with the arguments ``(instance, kwargs, post_funcs)``. There may be a ``connection`` argument. ``kwargs``may be usefully modified. ``post_funcs`` is a list of callbacks, intended to have functions appended to it, and are called with the arguments ``(new_instance)``. Note: this is not called when an instance is created from an existing database row. 
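
    A minimal, illustrative listener (``Person`` is a made-up class; the
    receiver signature follows the arguments described above)::

        def before_create(instance, kwargs, post_funcs):
            kwargs.setdefault('status', 'new')

        events.listen(before_create, Person, events.RowCreateSignal)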
""" class RowCreatedSignal(Signal): """ Called after an instance is created, with the class as the sender. Called with the arguments ``(instance, kwargs, post_funcs)``. There may be a ``connection`` argument. ``kwargs``may be usefully modified. ``post_funcs`` is a list of callbacks, intended to have functions appended to it, and are called with the arguments ``(new_instance)``. Note: this is not called when an instance is created from an existing database row. """ # @@: An event for getting a row? But for each row, when doing a # select? For .sync, .syncUpdate, .expire? class RowDestroySignal(Signal): """ Called before an instance is deleted. Sender is the instance's class. Arguments are ``(instance, post_funcs)``. ``post_funcs`` is a list of callbacks, intended to have functions appended to it, and are called with arguments ``(instance)``. If any of the post_funcs raises an exception, the deletion is only affected if this will prevent a commit. You cannot cancel the delete, but you can raise an exception (which will probably cancel the delete, but also cause an uncaught exception if not expected). Note: this is not called when an instance is destroyed through garbage collection. @@: Should this allow ``instance`` to be a primary key, so that a row can be deleted without first fetching it? """ class RowDestroyedSignal(Signal): """ Called after an instance is deleted. Sender is the instance's class. Arguments are ``(instance)``. This is called before the post_funcs of RowDestroySignal Note: this is not called when an instance is destroyed through garbage collection. """ class RowUpdateSignal(Signal): """ Called when an instance is updated through a call to ``.set()`` (or a column attribute assignment). The arguments are ``(instance, kwargs)``. ``kwargs`` can be modified. This is run *before* the instance is updated; if you want to look at the current values, simply look at ``instance``. """ class RowUpdatedSignal(Signal): """ Called when an instance is updated through a call to ``.set()`` (or a column attribute assignment). The arguments are ``(instance, post_funcs)``. ``post_funcs`` is a list of callbacks, intended to have functions appended to it, and are called with the arguments ``(new_instance)``. This is run *after* the instance is updated; Works better with lazyUpdate = True. """ class AddColumnSignal(Signal): """ Called when a column is added to a class, with arguments ``(cls, connection, column_name, column_definition, changeSchema, post_funcs)``. This is called *after* the column has been added, and is called for each column after class creation. post_funcs are called with ``(cls, so_column_obj)`` """ class DeleteColumnSignal(Signal): """ Called when a column is removed from a class, with the arguments ``(cls, connection, column_name, so_column_obj, post_funcs)``. Like ``AddColumnSignal`` this is called after the action has been performed, and is called for subclassing (when a column is implicitly removed by setting it to ``None``). post_funcs are called with ``(cls, so_column_obj)`` """ # @@: Signals for indexes and joins? These are mostly event consumers, # though. class CreateTableSignal(Signal): """ Called when a table is created. If ``ifNotExists==True`` and the table exists, this event is not called. Called with ``(cls, connection, extra_sql, post_funcs)``. ``extra_sql`` is a list (which can be appended to) of extra SQL statements to be run after the table is created. ``post_funcs`` functions are called with ``(cls, connection)`` after the table has been created. 
Those functions are *not* called simply when constructing the SQL. """ class DropTableSignal(Signal): """ Called when a table is dropped. If ``ifExists==True`` and the table doesn't exist, this event is not called. Called with ``(cls, connection, extra_sql, post_funcs)``. ``post_funcs`` functions are called with ``(cls, connection)`` after the table has been dropped. """ ############################################################ # Event Debugging ############################################################ def summarize_events_by_sender(sender=None, output=None, indent=0): """ Prints out a summary of the senders and listeners in the system, for debugging purposes. """ if output is None: output = sys.stdout leader = ' ' * indent if sender is None: send_list = [ (deref(dispatcher.senders.get(sid)), listeners) for sid, listeners in dispatcher.connections.items() if deref(dispatcher.senders.get(sid))] for sender, listeners in sorted_items(send_list): real_sender = deref(sender) if not real_sender: continue header = 'Sender: %r' % real_sender print(leader + header, file=output) print(leader + ('=' * len(header)), file=output) summarize_events_by_sender(real_sender, output=output, indent=indent + 2) else: for signal, receivers in \ sorted_items(dispatcher.connections.get(id(sender), [])): receivers = [deref(r) for r in receivers if deref(r)] header = 'Signal: %s (%i receivers)' % (sort_name(signal), len(receivers)) print(leader + header, file=output) print(leader + ('-' * len(header)), file=output) for receiver in sorted(receivers, key=sort_name): print(leader + ' ' + nice_repr(receiver), file=output) def deref(value): if isinstance(value, dispatcher.WEAKREF_TYPES): return value() else: return value def sorted_items(a_dict): if isinstance(a_dict, dict): a_dict = a_dict.items() return sorted(a_dict, key=lambda t: sort_name(t[0])) def sort_name(value): if isinstance(value, type): return value.__name__ elif isinstance(value, types.FunctionType): return value.__name__ else: return str(value) _real_dispatcher_send = dispatcher.send _real_dispatcher_sendExact = dispatcher.sendExact _real_dispatcher_connect = dispatcher.connect _real_dispatcher_disconnect = dispatcher.disconnect _debug_enabled = False def debug_events(): global _debug_enabled, send if _debug_enabled: return _debug_enabled = True dispatcher.send = send = _debug_send dispatcher.sendExact = _debug_sendExact dispatcher.disconnect = _debug_disconnect dispatcher.connect = _debug_connect def _debug_send(signal=dispatcher.Any, sender=dispatcher.Anonymous, *arguments, **named): print("send %s from %s: %s" % ( nice_repr(signal), nice_repr(sender), fmt_args(*arguments, **named))) return _real_dispatcher_send(signal, sender, *arguments, **named) def _debug_sendExact(signal=dispatcher.Any, sender=dispatcher.Anonymous, *arguments, **named): print("sendExact %s from %s: %s" % ( nice_repr(signal), nice_repr(sender), fmt_args(*arguments, **name))) return _real_dispatcher_sendExact(signal, sender, *arguments, **named) def _debug_connect(receiver, signal=dispatcher.Any, sender=dispatcher.Any, weak=True): print("connect %s to %s signal %s" % ( nice_repr(receiver), nice_repr(signal), nice_repr(sender))) return _real_dispatcher_connect(receiver, signal, sender, weak) def _debug_disconnect(receiver, signal=dispatcher.Any, sender=dispatcher.Any, weak=True): print("disconnecting %s from %s signal %s" % ( nice_repr(receiver), nice_repr(signal), nice_repr(sender))) return _real_dispatcher_disconnect(receiver, signal, sender, weak) def fmt_args(*arguments, 
**name): args = [repr(a) for a in arguments] args.extend([ '%s=%r' % (n, v) for n, v in sorted(name.items())]) return ', '.join(args) def nice_repr(v): """ Like repr(), but nicer for debugging here. """ if isinstance(v, class_types): return v.__module__ + '.' + v.__name__ elif isinstance(v, types.FunctionType): if '__name__' in v.__globals__: if getattr(sys.modules[v.__globals__['__name__']], v.__name__, None) is v: return '%s.%s' % (v.__globals__['__name__'], v.__name__) return repr(v) elif isinstance(v, types.MethodType): return '%s.%s of %s' % ( nice_repr(v.__self__.__class__), v.__func__.__name__, nice_repr(v.__self__)) else: return repr(v) __all__ = ['listen', 'send'] # Use copy() to avoid 'dictionary changed' issues on python 3 for name, value in globals().copy().items(): if isinstance(value, type) and issubclass(value, Signal): __all__.append(name) SQLObject-3.4.0/sqlobject/views.py0000644000175000017500000001336112747416060016364 0ustar phdphd00000000000000from .main import SQLObject from .sqlbuilder import AND, Alias, ColumnAS, LEFTJOINOn, \ NoDefault, SQLCall, SQLConstant, SQLObjectField, SQLObjectTable, SQLOp, \ Select, sqlrepr class ViewSQLObjectField(SQLObjectField): def __init__(self, alias, *arg): SQLObjectField.__init__(self, *arg) self.alias = alias def __sqlrepr__(self, db): return self.alias + "." + self.fieldName def tablesUsedImmediate(self): return [self.tableName] class ViewSQLObjectTable(SQLObjectTable): FieldClass = ViewSQLObjectField def __getattr__(self, attr): if attr == 'sqlmeta': raise AttributeError return SQLObjectTable.__getattr__(self, attr) def _getattrFromID(self, attr): return self.FieldClass(self.soClass.sqlmeta.alias, self.tableName, 'id', attr, self.soClass, None) def _getattrFromColumn(self, column, attr): return self.FieldClass(self.soClass.sqlmeta.alias, self.tableName, column.name, attr, self.soClass, column) class ViewSQLObject(SQLObject): """ A SQLObject class that derives all it's values from other SQLObject classes. Columns on subclasses should use SQLBuilder constructs for dbName, and sqlmeta should specify: * idName as a SQLBuilder construction * clause as SQLBuilder clause for specifying join conditions or other restrictions * table as an optional alternate name for the class alias See test_views.py for simple examples. """ def __classinit__(cls, new_attrs): SQLObject.__classinit__(cls, new_attrs) # like is_base if cls.__name__ != 'ViewSQLObject': dbName = hasattr(cls, '_connection') and \ (cls._connection and cls._connection.dbName) or None if getattr(cls.sqlmeta, 'table', None): cls.sqlmeta.alias = cls.sqlmeta.table else: cls.sqlmeta.alias = \ cls.sqlmeta.style.pythonClassToDBTable(cls.__name__) alias = cls.sqlmeta.alias columns = [ColumnAS(cls.sqlmeta.idName, 'id')] # {sqlrepr-key: [restriction, *aggregate-column]} aggregates = {'': [None]} inverseColumns = dict( [(y, x) for x, y in cls.sqlmeta.columns.items()]) for col in cls.sqlmeta.columnList: n = inverseColumns[col] ascol = ColumnAS(col.dbName, n) if isAggregate(col.dbName): restriction = getattr(col, 'aggregateClause', None) if restriction: restrictkey = sqlrepr(restriction, dbName) aggregates[restrictkey] = \ aggregates.get(restrictkey, [restriction]) + \ [ascol] else: aggregates[''].append(ascol) else: columns.append(ascol) metajoin = getattr(cls.sqlmeta, 'join', NoDefault) clause = getattr(cls.sqlmeta, 'clause', NoDefault) select = Select(columns, distinct=True, # @@ LDO check if this really mattered # for performance # @@ Postgres (and MySQL?) extension! 
# distinctOn=cls.sqlmeta.idName, join=metajoin, clause=clause) aggregates = aggregates.values() if aggregates != [[None]]: join = [] last_alias = "%s_base" % alias last_id = "id" last = Alias(select, last_alias) columns = [ ColumnAS(SQLConstant("%s.%s" % (last_alias, x.expr2)), x.expr2) for x in columns] for i, agg in enumerate(aggregates): restriction = agg[0] if restriction is None: restriction = clause else: restriction = AND(clause, restriction) agg = agg[1:] agg_alias = "%s_%s" % (alias, i) agg_id = '%s_id' % agg_alias if not last.q.alias.endswith('base'): last = None new_alias = Alias(Select( [ColumnAS(cls.sqlmeta.idName, agg_id)] + agg, groupBy=cls.sqlmeta.idName, join=metajoin, clause=restriction), agg_alias) agg_join = LEFTJOINOn(last, new_alias, "%s.%s = %s.%s" % ( last_alias, last_id, agg_alias, agg_id)) join.append(agg_join) for col in agg: columns.append( ColumnAS(SQLConstant( "%s.%s" % (agg_alias, col.expr2)), col.expr2)) last = new_alias last_alias = agg_alias last_id = agg_id select = Select(columns, join=join) cls.sqlmeta.table = Alias(select, alias) cls.q = ViewSQLObjectTable(cls) for n, col in cls.sqlmeta.columns.items(): col.dbName = n def isAggregate(expr): if isinstance(expr, SQLCall): return True if isinstance(expr, SQLOp): return isAggregate(expr.expr1) or isAggregate(expr.expr2) return False SQLObject-3.4.0/sqlobject/include/0000755000175000017500000000000013141371614016266 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/include/__init__.py0000644000175000017500000000000210332042146020361 0ustar phdphd00000000000000# SQLObject-3.4.0/sqlobject/include/hashcol.py0000644000175000017500000000615312640254342020267 0ustar phdphd00000000000000import sqlobject.col from sqlobject.compat import PY2, string_type __all__ = ['HashCol'] class DbHash: """ Presents a comparison object for hashes, allowing plain text to be automagically compared with the base content. """ def __init__(self, hash, hashMethod): self.hash = hash self.hashMethod = hashMethod def _get_key(self, other): """Create the hash of the other class""" if not isinstance(other, string_type): raise TypeError( "A hash may only be compared with a string, or None.") return self.hashMethod(other) def __eq__(self, other): if other is None: if self.hash is None: return True return False other_key = self._get_key(other) return other_key == self.hash def __lt__(self, other): if other is None: return False other_key = self._get_key(other) return other_key < self.hash def __gt__(self, other): if other is None: if self.hash is None: return False return True other_key = self._get_key(other) return other_key > self.hash def __le__(self, other): if other is None: if self.hash is None: return True return False other_key = self._get_key(other) return other_key <= self.hash def __ge__(self, other): if other is None: return True other_key = self._get_key(other) return other_key >= self.hash def __repr__(self): return "" class HashValidator(sqlobject.col.StringValidator): """ Provides formal SQLObject validation services for the HashCol. """ def to_python(self, value, state): """ Passes out a hash object. """ if value is None: return None return DbHash(hash=value, hashMethod=self.hashMethod) def from_python(self, value, state): """ Store the given value as a MD5 hash, or None if specified. """ if value is None: return None return self.hashMethod(value) class SOHashCol(sqlobject.col.SOStringCol): """ The internal HashCol definition. By default, enforces a md5 digest. 
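    For illustration, typical end-user usage goes through the ``HashCol``
    wrapper below; ``Account`` here is a hypothetical class, and the tested
    examples are in include/tests/test_hashcol.py::

        from hashlib import sha256
        from sqlobject import SQLObject
        from sqlobject.include import hashcol

        class Account(SQLObject):
            # default: md5 hex digest, stored in a 32-character column
            token = hashcol.HashCol()
            # custom hash function with an explicit column length
            secret = hashcol.HashCol(
                hashMethod=lambda s: sha256(s.encode('utf8')).hexdigest(),
                length=64)

        # Comparisons hash the plain text first, so selects take the
        # original value, never the digest, e.g.:
        #     Account.selectBy(secret='pa55word')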
""" def __init__(self, **kw): if 'hashMethod' not in kw: from hashlib import md5 if PY2: self.hashMethod = lambda v: md5(v).hexdigest() else: self.hashMethod = lambda v: md5(v.encode('utf8')).hexdigest() if 'length' not in kw: kw['length'] = 32 else: self.hashMethod = kw['hashMethod'] del kw['hashMethod'] super(SOHashCol, self).__init__(**kw) def createValidators(self): return [HashValidator(name=self.name, hashMethod=self.hashMethod)] + \ super(SOHashCol, self).createValidators() class HashCol(sqlobject.col.StringCol): """ End-user HashCol class. May be instantiated with 'hashMethod', a function which returns the string hash of any other string (i.e. basestring). """ baseClass = SOHashCol SQLObject-3.4.0/sqlobject/include/tests/0000755000175000017500000000000013141371614017430 5ustar phdphd00000000000000SQLObject-3.4.0/sqlobject/include/tests/__init__.py0000644000175000017500000000000212473702601021532 0ustar phdphd00000000000000# SQLObject-3.4.0/sqlobject/include/tests/test_hashcol.py0000644000175000017500000000440213036106435022462 0ustar phdphd00000000000000from hashlib import sha256, md5 from sqlobject import AND, IntCol, OR, SQLObject from sqlobject.compat import PY2 from sqlobject.include import hashcol from sqlobject.tests.dbtest import setupClass ######################################## # HashCol test ######################################## if PY2: def sha256_str(x): return sha256(x).hexdigest() def md5_str(x): return md5(x).hexdigest() else: def sha256_str(x): return sha256(x.encode('utf8')).hexdigest() def md5_str(x): return md5(x.encode('utf8')).hexdigest() class HashTest(SQLObject): so_count = IntCol(alternateID=True) col1 = hashcol.HashCol() col2 = hashcol.HashCol(hashMethod=sha256_str) data = ['test', 'This is more text', 'test 2'] items = [] def setup(): global items items = [] setupClass(HashTest) for i, s in enumerate(data): items.append(HashTest(so_count=i, col1=s, col2=s)) def test_create(): setup() for s, item in zip(data, items): assert item.col1 == s assert item.col2 == s conn = HashTest._connection rows = conn.queryAll(""" SELECT so_count, col1, col2 FROM hash_test ORDER BY so_count """) for so_count, col1, col2 in rows: assert md5_str(data[so_count]) == col1 assert sha256_str(data[so_count]) == col2 def test_select(): for i, value in enumerate(data): rows = list(HashTest.select(HashTest.q.col1 == value)) assert len(rows) == 1 rows = list(HashTest.select(HashTest.q.col2 == value)) assert len(rows) == 1 # Passing the hash in directly should fail rows = list(HashTest.select(HashTest.q.col2 == sha256_str(value))) assert len(rows) == 0 rows = list(HashTest.select(AND( HashTest.q.col1 == value, HashTest.q.col2 == value ))) assert len(rows) == 1 rows = list(HashTest.selectBy(col1=value)) assert len(rows) == 1 rows = list(HashTest.selectBy(col2=value)) assert len(rows) == 1 rows = list(HashTest.selectBy(col1=value, col2=value)) assert len(rows) == 1 rows = list(HashTest.select(OR( HashTest.q.col1 == 'test 2', HashTest.q.col2 == 'test' ))) assert len(rows) == 2 SQLObject-3.4.0/sqlobject/wsgi_middleware.py0000644000175000017500000000674012760105144020371 0ustar phdphd00000000000000import sys import sqlobject from sqlobject.compat import string_type # The module was imported during documentation building if 'sphinx' not in sys.modules: from paste.deploy.converters import asbool from paste.wsgilib import catch_errors from paste.util import import_string def make_middleware(app, global_conf, database=None, use_transaction=False, hub=None): """ WSGI middleware that sets the 
connection for the request (using the database URI or connection object) and the given hub (or ``sqlobject.sqlhub`` if not given). If ``use_transaction`` is true, then the request will be run in a transaction. Applications can use the keys (which are all no-argument functions): ``sqlobject.get_connection()``: Returns the connection object ``sqlobject.abort()``: Aborts the transaction. Does not raise an error, but at the *end* of the request there will be a rollback. ``sqlobject.begin()``: Starts a transaction. First commits (or rolls back if aborted) if this is run in a transaction. ``sqlobject.in_transaction()``: Returns true or false, depending if we are currently in a transaction. """ use_transaction = asbool(use_transaction) if database is None: database = global_conf.get('database') if not database: raise ValueError( "You must provide a 'database' configuration value") if isinstance(hub, string_type): hub = import_string.eval_import(hub) if not hub: hub = sqlobject.sqlhub if isinstance(database, string_type): database = sqlobject.connectionForURI(database) return SQLObjectMiddleware(app, database, use_transaction, hub) class SQLObjectMiddleware(object): def __init__(self, app, conn, use_transaction, hub): self.app = app self.conn = conn self.use_transaction = use_transaction self.hub = hub def __call__(self, environ, start_response): conn = [self.conn] if self.use_transaction: conn[0] = conn[0].transaction() any_errors = [] use_transaction = [self.use_transaction] self.hub.threadConnection = conn[0] def abort(): assert use_transaction[0], ( "You cannot abort, because a transaction is not being used") any_errors.append(None) def begin(): if use_transaction[0]: if any_errors: conn[0].rollback() else: conn[0].commit() any_errors[:] = [] use_transaction[0] = True conn[0] = self.conn.transaction() self.hub.threadConnection = conn[0] def error(exc_info=None): any_errors.append(None) ok() def ok(): if use_transaction[0]: if any_errors: conn[0].rollback() else: conn[0].commit(close=True) self.hub.threadConnection = None def in_transaction(): return use_transaction[0] def get_connection(): return conn[0] environ['sqlobject.get_connection'] = get_connection environ['sqlobject.abort'] = abort environ['sqlobject.begin'] = begin environ['sqlobject.in_transaction'] = in_transaction return catch_errors(self.app, environ, start_response, error_callback=error, ok_callback=ok) SQLObject-3.4.0/sqlobject/sqlbuilder.py0000644000175000017500000013463613102700617017374 0ustar phdphd00000000000000""" sqlobject.sqlbuilder -------------------- :author: Ian Bicking Builds SQL expressions from normal Python expressions. Disclaimer ---------- This program is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. Instructions ------------ To begin a SQL expression, you must use some sort of SQL object -- a field, table, or SQL statement (``SELECT``, ``INSERT``, etc.) 
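For example, starting from a field object (the helper namespaces and ``sqlrepr`` used here are described just below; the exact SQL text varies a little between backends)::

    >>> from sqlobject.sqlbuilder import table, sqlrepr, AND
    >>> expr = AND(table.address.zip > 30000,
    ...            table.address.name.startswith('J'))
    >>> print(sqlrepr(expr, 'sqlite'))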
You can then use normal operators, with the exception of: `and`, `or`, `not`, and `in`. You can use the `AND`, `OR`, `NOT`, and `IN` functions instead, or you can also use `&`, `|`, and `~` for `and`, `or`, and `not` respectively (however -- the precidence for these operators doesn't work as you would want, so you must use many parenthesis). To create a sql field, table, or constant/function, use the namespaces `table`, `const`, and `func`. For instance, ``table.address`` refers to the ``address`` table, and ``table.address.state`` refers to the ``state`` field in the address table. ``const.NULL`` is the ``NULL`` SQL constant, and ``func.NOW()`` is the ``NOW()`` function call (`const` and `func` are actually identicle, but the two names are provided for clarity). Once you create this object, expressions formed with it will produce SQL statements. The ``sqlrepr(obj)`` function gets the SQL representation of these objects, as well as the proper SQL representation of basic Python types (None==NULL). There are a number of DB-specific SQL features that this does not implement. There are a bunch of normal ANSI features also not present. See the bottom of this module for some examples, and run it (i.e. ``python sql.py``) to see the results of those examples. """ ######################################## # Constants ######################################## import fnmatch import operator import re import threading import types import weakref from . import classregistry from .converters import registerConverter, sqlrepr, quote_str, unquote_str from .compat import PY2, string_type class VersionError(Exception): pass class NoDefault: pass class SQLObjectState(object): def __init__(self, soObject, connection=None): self.soObject = weakref.proxy(soObject) self.connection = connection safeSQLRE = re.compile(r'^[a-zA-Z_][a-zA-Z0-9_\.]*$') def sqlIdentifier(obj): # some db drivers return unicode column names return isinstance(obj, string_type) and bool(safeSQLRE.search(obj.strip())) def execute(expr, executor): if hasattr(expr, 'execute'): return expr.execute(executor) else: return expr def _str_or_sqlrepr(expr, db): if isinstance(expr, string_type): return expr return sqlrepr(expr, db) ######################################## # Expression generation ######################################## class SQLExpression: def __add__(self, other): return SQLOp("+", self, other) def __radd__(self, other): return SQLOp("+", other, self) def __sub__(self, other): return SQLOp("-", self, other) def __rsub__(self, other): return SQLOp("-", other, self) def __mul__(self, other): return SQLOp("*", self, other) def __rmul__(self, other): return SQLOp("*", other, self) def __div__(self, other): return SQLOp("/", self, other) def __rdiv__(self, other): return SQLOp("/", other, self) def __pos__(self): return SQLPrefix("+", self) def __neg__(self): return SQLPrefix("-", self) def __pow__(self, other): return SQLConstant("POW")(self, other) def __rpow__(self, other): return SQLConstant("POW")(other, self) def __abs__(self): return SQLConstant("ABS")(self) def __mod__(self, other): return SQLModulo(self, other) def __rmod__(self, other): return SQLConstant("MOD")(other, self) def __lt__(self, other): return SQLOp("<", self, other) def __le__(self, other): return SQLOp("<=", self, other) def __gt__(self, other): return SQLOp(">", self, other) def __ge__(self, other): return SQLOp(">=", self, other) def __eq__(self, other): if other is None: return ISNULL(self) else: return SQLOp("=", self, other) def __ne__(self, other): if other 
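A small doctest-style illustration of the comparison behaviour defined here: comparing a column against ``None`` renders as ``IS NULL`` / ``IS NOT NULL`` rather than ``= NULL``::

    >>> from sqlobject.sqlbuilder import table, sqlrepr
    >>> print(sqlrepr(table.address.state == None, 'postgres'))
    >>> print(sqlrepr(table.address.state != None, 'postgres'))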
is None: return ISNOTNULL(self) else: return SQLOp("<>", self, other) def __and__(self, other): return SQLOp("AND", self, other) def __rand__(self, other): return SQLOp("AND", other, self) def __or__(self, other): return SQLOp("OR", self, other) def __ror__(self, other): return SQLOp("OR", other, self) def __invert__(self): return SQLPrefix("NOT", self) def __call__(self, *args): return SQLCall(self, args) def __repr__(self): try: return self.__sqlrepr__(None) except AssertionError: return '<%s %s>' % ( self.__class__.__name__, hex(id(self))[2:]) def __str__(self): return repr(self) def __cmp__(self, other): raise VersionError("Python 2.1+ required") def __rcmp__(self, other): raise VersionError("Python 2.1+ required") def startswith(self, s): return STARTSWITH(self, s) def endswith(self, s): return ENDSWITH(self, s) def contains(self, s): return CONTAINSSTRING(self, s) def components(self): return [] def tablesUsed(self, db): return self.tablesUsedSet(db) def tablesUsedSet(self, db): tables = set() for table in self.tablesUsedImmediate(): if hasattr(table, '__sqlrepr__'): table = sqlrepr(table, db) tables.add(table) for component in self.components(): tables.update(tablesUsedSet(component, db)) return tables def tablesUsedImmediate(self): return [] ####################################### # Converter for SQLExpression instances ####################################### def SQLExprConverter(value, db): return value.__sqlrepr__() registerConverter(SQLExpression, SQLExprConverter) def tablesUsedSet(obj, db): if hasattr(obj, "tablesUsedSet"): return obj.tablesUsedSet(db) else: return {} if PY2: div = operator.div else: div = operator.truediv operatorMap = { "+": operator.add, "/": div, "-": operator.sub, "*": operator.mul, "<": operator.lt, "<=": operator.le, "=": operator.eq, "!=": operator.ne, ">=": operator.ge, ">": operator.gt, "IN": operator.contains, "IS": operator.eq, } class SQLOp(SQLExpression): def __init__(self, op, expr1, expr2): self.op = op.upper() self.expr1 = expr1 self.expr2 = expr2 def __sqlrepr__(self, db): s1 = sqlrepr(self.expr1, db) s2 = sqlrepr(self.expr2, db) if s1[0] != '(' and s1 != 'NULL': s1 = '(' + s1 + ')' if s2[0] != '(' and s2 != 'NULL': s2 = '(' + s2 + ')' return "(%s %s %s)" % (s1, self.op, s2) def components(self): return [self.expr1, self.expr2] def execute(self, executor): if self.op == "AND": return execute(self.expr1, executor) \ and execute(self.expr2, executor) elif self.op == "OR": return execute(self.expr1, executor) \ or execute(self.expr2, executor) else: return operatorMap[self.op.upper()](execute(self.expr1, executor), execute(self.expr2, executor)) registerConverter(SQLOp, SQLExprConverter) class SQLModulo(SQLOp): def __init__(self, expr1, expr2): SQLOp.__init__(self, '%', expr1, expr2) def __sqlrepr__(self, db): if db == 'sqlite': return SQLOp.__sqlrepr__(self, db) s1 = sqlrepr(self.expr1, db) s2 = sqlrepr(self.expr2, db) return "MOD(%s, %s)" % (s1, s2) registerConverter(SQLModulo, SQLExprConverter) class SQLCall(SQLExpression): def __init__(self, expr, args): self.expr = expr self.args = args def __sqlrepr__(self, db): return "%s%s" % (sqlrepr(self.expr, db), sqlrepr(self.args, db)) def components(self): return [self.expr] + list(self.args) def execute(self, executor): raise ValueError("I don't yet know how to locally execute functions") registerConverter(SQLCall, SQLExprConverter) class SQLPrefix(SQLExpression): def __init__(self, prefix, expr): self.prefix = prefix self.expr = expr def __sqlrepr__(self, db): return "%s %s" % (self.prefix, 
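The modulo operator handled by ``SQLModulo`` above is backend-aware; a quick sketch of the difference::

    >>> from sqlobject.sqlbuilder import table, sqlrepr
    >>> expr = table.item.quantity % 2 == 0
    >>> print(sqlrepr(expr, 'mysql'))    # rendered with MOD(...)
    >>> print(sqlrepr(expr, 'sqlite'))   # keeps the % operator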
sqlrepr(self.expr, db)) def components(self): return [self.expr] def execute(self, executor): prefix = self.prefix expr = execute(self.expr, executor) if prefix == "+": return expr elif prefix == "-": return -expr elif prefix.upper() == "NOT": return not expr registerConverter(SQLPrefix, SQLExprConverter) class SQLConstant(SQLExpression): def __init__(self, const): self.const = const def __sqlrepr__(self, db): return self.const def execute(self, executor): raise ValueError("I don't yet know how to execute SQL constants") registerConverter(SQLConstant, SQLExprConverter) class SQLTrueClauseClass(SQLExpression): def __sqlrepr__(self, db): return "1 = 1" def execute(self, executor): return 1 SQLTrueClause = SQLTrueClauseClass() registerConverter(SQLTrueClauseClass, SQLExprConverter) ######################################## # Namespaces ######################################## class Field(SQLExpression): def __init__(self, tableName, fieldName): self.tableName = tableName self.fieldName = fieldName def __sqlrepr__(self, db): return self.tableName + "." + self.fieldName def tablesUsedImmediate(self): return [self.tableName] def execute(self, executor): return executor.field(self.tableName, self.fieldName) class SQLObjectField(Field): def __init__(self, tableName, fieldName, original, soClass, column): Field.__init__(self, tableName, fieldName) self.original = original self.soClass = soClass self.column = column def _from_python(self, value): column = self.column if not isinstance(value, SQLExpression) and \ column and column.from_python: value = column.from_python(value, SQLObjectState(self.soClass)) return value def __eq__(self, other): if other is None: return ISNULL(self) other = self._from_python(other) return SQLOp('=', self, other) def __ne__(self, other): if other is None: return ISNOTNULL(self) other = self._from_python(other) return SQLOp('<>', self, other) def startswith(self, s): s = self._from_python(s) return STARTSWITH(self, s) def endswith(self, s): s = self._from_python(s) return ENDSWITH(self, s) def contains(self, s): s = self._from_python(s) return CONTAINSSTRING(self, s) registerConverter(SQLObjectField, SQLExprConverter) class Table(SQLExpression): FieldClass = Field def __init__(self, tableName): self.tableName = tableName def __getattr__(self, attr): if attr.startswith('__'): raise AttributeError return self.FieldClass(self.tableName, attr) def __sqlrepr__(self, db): return _str_or_sqlrepr(self.tableName, db) def execute(self, executor): raise ValueError("Tables don't have values") class SQLObjectTable(Table): FieldClass = SQLObjectField def __init__(self, soClass): self.soClass = soClass assert soClass.sqlmeta.table, ( "Bad table name in class %r: %r" % (soClass, soClass.sqlmeta.table)) Table.__init__(self, soClass.sqlmeta.table) def __getattr__(self, attr): if attr.startswith('__'): raise AttributeError if attr == 'id': return self._getattrFromID(attr) elif attr in self.soClass.sqlmeta.columns: column = self.soClass.sqlmeta.columns[attr] return self._getattrFromColumn(column, attr) elif attr + 'ID' in \ [k for (k, v) in self.soClass.sqlmeta.columns.items() if v.foreignKey]: attr += 'ID' column = self.soClass.sqlmeta.columns[attr] return self._getattrFromColumn(column, attr) else: raise AttributeError( "%s instance has no attribute '%s'" % (self.soClass.__name__, attr)) def _getattrFromID(self, attr): return self.FieldClass(self.tableName, self.soClass.sqlmeta.idName, attr, self.soClass, None) def _getattrFromColumn(self, column, attr): return 
self.FieldClass(self.tableName, column.dbName, attr, self.soClass, column) class SQLObjectTableWithJoins(SQLObjectTable): def __getattr__(self, attr): if attr + 'ID' in \ [k for (k, v) in self.soClass.sqlmeta.columns.items() if v.foreignKey]: column = self.soClass.sqlmeta.columns[attr + 'ID'] return self._getattrFromForeignKey(column, attr) elif attr in [x.joinMethodName for x in self.soClass.sqlmeta.joins]: join = [x for x in self.soClass.sqlmeta.joins if x.joinMethodName == attr][0] return self._getattrFromJoin(join, attr) else: return SQLObjectTable.__getattr__(self, attr) def _getattrFromForeignKey(self, column, attr): ret = getattr(self, column.name) == \ getattr(self.soClass, '_SO_class_' + column.foreignKey).q.id return ret def _getattrFromJoin(self, join, attr): if hasattr(join, 'otherColumn'): return AND( join.otherClass.q.id == Field(join.intermediateTable, join.otherColumn), Field(join.intermediateTable, join.joinColumn) == self.soClass.q.id) else: return getattr(join.otherClass.q, join.joinColumn) == \ self.soClass.q.id class TableSpace: TableClass = Table def __getattr__(self, attr): if attr.startswith('__'): raise AttributeError return self.TableClass(attr) class ConstantSpace: def __getattr__(self, attr): if attr.startswith('__'): raise AttributeError return SQLConstant(attr) ######################################## # Table aliases ######################################## class AliasField(Field): def __init__(self, tableName, fieldName, alias, aliasTable): Field.__init__(self, tableName, fieldName) self.alias = alias self.aliasTable = aliasTable def __sqlrepr__(self, db): fieldName = self.fieldName if isinstance(fieldName, SQLExpression): fieldName = sqlrepr(fieldName, db) return self.alias + "." + fieldName def tablesUsedImmediate(self): return [self.aliasTable] class AliasTable(Table): as_string = '' # set it to "AS" if your database requires it FieldClass = AliasField _alias_lock = threading.Lock() _alias_counter = 0 def __init__(self, table, alias=None): if hasattr(table, "sqlmeta"): tableName = SQLConstant(table.sqlmeta.table) elif isinstance(table, (Select, Union)): assert alias is not None, \ "Alias name cannot be constructed from Select instances, " \ "please provide an 'alias' keyword." 
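In practice these aliases are mostly used for self-joins; a brief sketch, where ``Person`` is a hypothetical SQLObject class with a ``boss`` foreign key. Note that, as the assertion above enforces, aliasing a ``Select`` requires an explicit ``alias`` name::

    from sqlobject.sqlbuilder import Alias, AND

    boss = Alias(Person, 'boss')        # Person: hypothetical SQLObject class
    underlings = Person.select(
        AND(Person.q.bossID == boss.q.id,
            boss.q.name == 'Alice'))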
tableName = Subquery('', table) table = None else: tableName = SQLConstant(table) table = None Table.__init__(self, tableName) self.table = table if alias is None: self._alias_lock.acquire() try: AliasTable._alias_counter += 1 alias = "%s_alias%d" % (tableName, AliasTable._alias_counter) finally: self._alias_lock.release() self.alias = alias def __getattr__(self, attr): if attr.startswith('__'): raise AttributeError if self.table: attr = getattr(self.table.q, attr).fieldName return self.FieldClass(self.tableName, attr, self.alias, self) def __sqlrepr__(self, db): return "%s %s %s" % (sqlrepr(self.tableName, db), self.as_string, self.alias) class Alias(SQLExpression): def __init__(self, table, alias=None): self.q = AliasTable(table, alias) def __sqlrepr__(self, db): return sqlrepr(self.q, db) def components(self): return [self.q] class Union(SQLExpression): def __init__(self, *tables): tabs = [] for t in tables: if not isinstance(t, SQLExpression) and hasattr(t, 'sqlmeta'): t = t.sqlmeta.table if isinstance(t, Alias): t = t.q if isinstance(t, Table): t = t.tableName if not isinstance(t, SQLExpression): t = SQLConstant(t) tabs.append(t) self.tables = tabs def __sqlrepr__(self, db): return " UNION ".join([str(sqlrepr(t, db)) for t in self.tables]) ######################################## # SQL Statements ######################################## class Select(SQLExpression): def __init__(self, items=NoDefault, where=NoDefault, groupBy=NoDefault, having=NoDefault, orderBy=NoDefault, limit=NoDefault, join=NoDefault, lazyColumns=False, distinct=False, start=0, end=None, reversed=False, forUpdate=False, clause=NoDefault, staticTables=NoDefault, distinctOn=NoDefault): self.ops = {} if not isinstance(items, (list, tuple, types.GeneratorType)): items = [items] if clause is NoDefault and where is not NoDefault: clause = where if staticTables is NoDefault: staticTables = [] self.ops['items'] = items self.ops['clause'] = clause self.ops['groupBy'] = groupBy self.ops['having'] = having self.ops['orderBy'] = orderBy self.ops['limit'] = limit self.ops['join'] = join self.ops['lazyColumns'] = lazyColumns self.ops['distinct'] = distinct self.ops['distinctOn'] = distinctOn self.ops['start'] = start self.ops['end'] = end self.ops['reversed'] = reversed self.ops['forUpdate'] = forUpdate self.ops['staticTables'] = staticTables def clone(self, **newOps): ops = self.ops.copy() ops.update(newOps) return self.__class__(**ops) def newItems(self, items): return self.clone(items=items) def newClause(self, new_clause): return self.clone(clause=new_clause) def orderBy(self, orderBy): return self.clone(orderBy=orderBy) def unlimited(self): return self.clone(limit=NoDefault, start=0, end=None) def limit(self, limit): self.clone(limit=limit) def lazyColumns(self, value): return self.clone(lazyColumns=value) def reversed(self): return self.clone(reversed=not self.ops.get('reversed', False)) def distinct(self): return self.clone(distinct=True) def filter(self, filter_clause): if filter_clause is None: # None doesn't filter anything, it's just a no-op: return self clause = self.ops['clause'] if isinstance(clause, string_type): clause = SQLConstant('(%s)' % clause) return self.newClause(AND(clause, filter_clause)) def __sqlrepr__(self, db): select = "SELECT" if self.ops['distinct']: select += " DISTINCT" if self.ops['distinctOn'] is not NoDefault: select += " ON(%s)" % _str_or_sqlrepr( self.ops['distinctOn'], db) if not self.ops['lazyColumns']: select += " %s" % ", ".join( [str(_str_or_sqlrepr(v, db)) for v in 
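A doctest-style sketch of building and then refining a ``Select`` by hand, using the ``filter`` method defined above::

    >>> from sqlobject.sqlbuilder import Select, table, sqlrepr
    >>> sel = Select([table.address.name, table.address.state],
    ...              where=table.address.zip > 30000,
    ...              orderBy=table.address.name)
    >>> print(sqlrepr(sel, 'postgres'))
    >>> sel = sel.filter(table.address.state == 'MN')   # ANDs in another clause
    >>> print(sqlrepr(sel, 'postgres'))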
self.ops['items']]) else: select += " %s" % _str_or_sqlrepr(self.ops['items'][0], db) join = [] join_str = '' if self.ops['join'] is not NoDefault and self.ops['join'] is not None: _join = self.ops['join'] if isinstance(_join, str): join_str = " " + _join elif isinstance(_join, SQLJoin): join.append(_join) else: join.extend(_join) tables = set() for x in self.ops['staticTables']: if isinstance(x, SQLExpression): x = sqlrepr(x, db) tables.add(x) things = list(self.ops['items']) + join if self.ops['clause'] is not NoDefault: things.append(self.ops['clause']) for thing in things: if isinstance(thing, SQLExpression): tables.update(tablesUsedSet(thing, db)) for j in join: t1 = _str_or_sqlrepr(j.table1, db) if t1 in tables: tables.remove(t1) t2 = _str_or_sqlrepr(j.table2, db) if t2 in tables: tables.remove(t2) if tables: select += " FROM %s" % ", ".join(sorted(tables)) elif join: select += " FROM" tablesYet = tables for j in join: if tablesYet and j.table1: sep = ", " else: sep = " " select += sep + sqlrepr(j, db) tablesYet = True if join_str: select += join_str if self.ops['clause'] is not NoDefault: select += " WHERE %s" % _str_or_sqlrepr(self.ops['clause'], db) if self.ops['groupBy'] is not NoDefault: groupBy = _str_or_sqlrepr(self.ops['groupBy'], db) if isinstance(self.ops['groupBy'], (list, tuple)): groupBy = groupBy[1:-1] # Remove parens select += " GROUP BY %s" % groupBy if self.ops['having'] is not NoDefault: select += " HAVING %s" % _str_or_sqlrepr(self.ops['having'], db) if self.ops['orderBy'] is not NoDefault and \ self.ops['orderBy'] is not None: orderBy = self.ops['orderBy'] if self.ops['reversed']: reverser = DESC else: def reverser(x): return x if isinstance(orderBy, (list, tuple)): select += " ORDER BY %s" % ", ".join( [_str_or_sqlrepr(reverser(_x), db) for _x in orderBy]) else: select += " ORDER BY %s" % _str_or_sqlrepr( reverser(orderBy), db) start, end = self.ops['start'], self.ops['end'] if self.ops['limit'] is not NoDefault: end = start + self.ops['limit'] if start or end: from .dbconnection import dbConnectionForScheme select = dbConnectionForScheme(db)._queryAddLimitOffset(select, start, end) if self.ops['forUpdate']: select += " FOR UPDATE" return select registerConverter(Select, SQLExprConverter) class Insert(SQLExpression): def __init__(self, table, valueList=None, values=None, template=NoDefault): self.template = template self.table = table if valueList: if values: raise TypeError("You may only give valueList *or* values") self.valueList = valueList else: self.valueList = [values] def __sqlrepr__(self, db): if not self.valueList: return '' insert = "INSERT INTO %s" % self.table allowNonDict = True template = self.template if (template is NoDefault) and isinstance(self.valueList[0], dict): template = list(sorted(self.valueList[0].keys())) allowNonDict = False if template is not NoDefault: insert += " (%s)" % ", ".join(template) insert += " VALUES " listToJoin = [] listToJoin_app = listToJoin.append for value in self.valueList: if isinstance(value, dict): if template is NoDefault: raise TypeError( "You can't mix non-dictionaries with dictionaries " "in an INSERT if you don't provide a template (%s)" % repr(value)) value = dictToList(template, value) elif not allowNonDict: raise TypeError( "You can't mix non-dictionaries with dictionaries " "in an INSERT if you don't provide a template (%s)" % repr(value)) listToJoin_app("(%s)" % ", ".join([sqlrepr(v, db) for v in value])) insert = "%s%s" % (insert, ", ".join(listToJoin)) return insert registerConverter(Insert, 
SQLExprConverter) def dictToList(template, dict): list = [] for key in template: list.append(dict[key]) if len(dict.keys()) > len(template): raise TypeError( "Extra entries in dictionary that aren't asked for in template " "(template=%s, dict=%s)" % (repr(template), repr(dict))) return list class Update(SQLExpression): def __init__(self, table, values, template=NoDefault, where=NoDefault): self.table = table self.values = values self.template = template self.whereClause = where def __sqlrepr__(self, db): update = "%s %s" % (self.sqlName(), self.table) update += " SET" first = True if self.template is not NoDefault: for i in range(len(self.template)): if first: first = False else: update += "," update += " %s=%s" % (self.template[i], sqlrepr(self.values[i], db)) else: for key, value in sorted(self.values.items()): if first: first = False else: update += "," update += " %s=%s" % (key, sqlrepr(value, db)) if self.whereClause is not NoDefault: update += " WHERE %s" % _str_or_sqlrepr(self.whereClause, db) return update def sqlName(self): return "UPDATE" registerConverter(Update, SQLExprConverter) class Delete(SQLExpression): """To be safe, this will signal an error if there is no where clause, unless you pass in where=None to the constructor.""" def __init__(self, table, where=NoDefault): self.table = table if where is NoDefault: raise TypeError( "You must give a where clause or pass in None " "to indicate no where clause") self.whereClause = where def __sqlrepr__(self, db): whereClause = self.whereClause if whereClause is None: return "DELETE FROM %s" % self.table whereClause = _str_or_sqlrepr(whereClause, db) return "DELETE FROM %s WHERE %s" % (self.table, whereClause) registerConverter(Delete, SQLExprConverter) class Replace(Update): def sqlName(self): return "REPLACE" registerConverter(Replace, SQLExprConverter) ######################################## # SQL Builtins ######################################## class DESC(SQLExpression): def __init__(self, expr): self.expr = expr def __sqlrepr__(self, db): if isinstance(self.expr, DESC): return sqlrepr(self.expr.expr, db) return '%s DESC' % sqlrepr(self.expr, db) def AND(*ops): if not ops: return None op1 = ops[0] ops = ops[1:] if ops: return SQLOp("AND", op1, AND(*ops)) else: return op1 def OR(*ops): if not ops: return None op1 = ops[0] ops = ops[1:] if ops: return SQLOp("OR", op1, OR(*ops)) else: return op1 def NOT(op): return SQLPrefix("NOT", op) def _IN(item, list): return SQLOp("IN", item, list) def IN(item, list): from .sresults import SelectResults # Import here to avoid circular import if isinstance(list, SelectResults): query = list.queryForSelect() query.ops['items'] = [list.sourceClass.q.id] list = query if isinstance(list, Select): return INSubquery(item, list) else: return _IN(item, list) def NOTIN(item, list): if isinstance(list, Select): return NOTINSubquery(item, list) else: return NOT(_IN(item, list)) def STARTSWITH(expr, pattern): return LIKE(expr, _LikeQuoted(pattern) + '%', escape='\\') def ENDSWITH(expr, pattern): return LIKE(expr, '%' + _LikeQuoted(pattern), escape='\\') def CONTAINSSTRING(expr, pattern): return LIKE(expr, '%' + _LikeQuoted(pattern) + '%', escape='\\') def ISNULL(expr): return SQLOp("IS", expr, None) def ISNOTNULL(expr): return SQLOp("IS NOT", expr, None) class ColumnAS(SQLOp): ''' Just like SQLOp('AS', expr, name) except without the parentheses ''' def __init__(self, expr, name): if isinstance(name, string_type): name = SQLConstant(name) SQLOp.__init__(self, 'AS', expr, name) def __sqlrepr__(self, db): 
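Doctest-style sketches of the three statement classes above, mirroring the examples at the bottom of this module::

    >>> from sqlobject.sqlbuilder import Insert, Update, Delete, \
    ...     table, sqlrepr, const
    >>> ins = Insert(table.address, valueList=[{'name': 'BOB'}, {'name': 'TIM'}])
    >>> print(sqlrepr(ins, 'mysql'))
    >>> upd = Update(table.address, {'lastModified': const.NOW()},
    ...              where=table.address.name == 'BOB')
    >>> print(sqlrepr(upd, 'mysql'))
    >>> dlt = Delete(table.address, where=table.address.name == 'TIM')
    >>> print(sqlrepr(dlt, 'mysql'))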
return "%s %s %s" % (sqlrepr(self.expr1, db), self.op, sqlrepr(self.expr2, db)) class _LikeQuoted: # It assumes prefix and postfix are strings; usually just a percent sign. # @@: I'm not sure what the quoting rules really are for all the # databases def __init__(self, expr): self.expr = expr self.prefix = '' self.postfix = '' def __radd__(self, s): self.prefix = s + self.prefix return self def __add__(self, s): self.postfix += s return self def __sqlrepr__(self, db): s = self.expr if isinstance(s, SQLExpression): values = [] if self.prefix: values.append(quote_str(self.prefix, db)) s = _quote_like_special(sqlrepr(s, db), db) values.append(s) if self.postfix: values.append(quote_str(self.postfix, db)) if db == "mysql": return "CONCAT(%s)" % ", ".join(values) elif db in ("mssql", "sybase"): return " + ".join(values) else: return " || ".join(values) elif isinstance(s, string_type): s = _quote_like_special(unquote_str(sqlrepr(s, db)), db) return quote_str("%s%s%s" % (self.prefix, s, self.postfix), db) else: raise TypeError( "expected str, unicode or SQLExpression, got %s" % type(s)) def _quote_like_special(s, db): if db in ('postgres', 'rdbhost'): escape = r'\\' else: escape = '\\' s = s.replace('\\', r'\\').\ replace('%', escape + '%').\ replace('_', escape + '_') return s class CONCAT(SQLExpression): def __init__(self, *expressions): self.expressions = expressions def __sqlrepr__(self, db): values = [sqlrepr(expr, db) for expr in self.expressions] if db == "mysql": return "CONCAT(%s)" % ", ".join(values) elif db in ("mssql", "sybase"): return " + ".join(values) else: return " || ".join(values) ######################################## # SQL JOINs ######################################## class SQLJoin(SQLExpression): def __init__(self, table1, table2, op=','): if hasattr(table1, 'sqlmeta'): table1 = table1.sqlmeta.table if hasattr(table2, 'sqlmeta'): table2 = table2.sqlmeta.table if isinstance(table1, str): table1 = SQLConstant(table1) if isinstance(table2, str): table2 = SQLConstant(table2) self.table1 = table1 self.table2 = table2 self.op = op def __sqlrepr__(self, db): if self.table1: return "%s%s %s" % (sqlrepr(self.table1, db), self.op, sqlrepr(self.table2, db)) else: return "%s %s" % (self.op, sqlrepr(self.table2, db)) registerConverter(SQLJoin, SQLExprConverter) def JOIN(table1, table2): return SQLJoin(table1, table2, " JOIN") def INNERJOIN(table1, table2): return SQLJoin(table1, table2, " INNER JOIN") def CROSSJOIN(table1, table2): return SQLJoin(table1, table2, " CROSS JOIN") def STRAIGHTJOIN(table1, table2): return SQLJoin(table1, table2, " STRAIGHT JOIN") def LEFTJOIN(table1, table2): return SQLJoin(table1, table2, " LEFT JOIN") def LEFTOUTERJOIN(table1, table2): return SQLJoin(table1, table2, " LEFT OUTER JOIN") def NATURALJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL JOIN") def NATURALLEFTJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL LEFT JOIN") def NATURALLEFTOUTERJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL LEFT OUTER JOIN") def RIGHTJOIN(table1, table2): return SQLJoin(table1, table2, " RIGHT JOIN") def RIGHTOUTERJOIN(table1, table2): return SQLJoin(table1, table2, " RIGHT OUTER JOIN") def NATURALRIGHTJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL RIGHT JOIN") def NATURALRIGHTOUTERJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL RIGHT OUTER JOIN") def FULLJOIN(table1, table2): return SQLJoin(table1, table2, " FULL JOIN") def FULLOUTERJOIN(table1, table2): return SQLJoin(table1, table2, " FULL 
OUTER JOIN") def NATURALFULLJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL FULL JOIN") def NATURALFULLOUTERJOIN(table1, table2): return SQLJoin(table1, table2, " NATURAL FULL OUTER JOIN") class SQLJoinConditional(SQLJoin): """Conditional JOIN""" def __init__(self, table1, table2, op, on_condition=None, using_columns=None): """For condition you must give on_condition or using_columns but not both on_condition can be a string or SQLExpression, for example Table1.q.col1 == Table2.q.col2 using_columns can be a string or a list of columns, e.g. (Table1.q.col1, Table2.q.col2) """ if not on_condition and not using_columns: raise TypeError("You must give ON condition or USING columns") if on_condition and using_columns: raise TypeError( "You must give ON condition or USING columns but not both") SQLJoin.__init__(self, table1, table2, op) self.on_condition = on_condition self.using_columns = using_columns def __sqlrepr__(self, db): if self.on_condition: on_condition = self.on_condition if hasattr(on_condition, "__sqlrepr__"): on_condition = sqlrepr(on_condition, db) join = "%s %s ON %s" % (self.op, sqlrepr(self.table2, db), on_condition) if self.table1: join = "%s %s" % (sqlrepr(self.table1, db), join) return join elif self.using_columns: using_columns = [] for col in self.using_columns: if hasattr(col, "__sqlrepr__"): col = sqlrepr(col, db) using_columns.append(col) using_columns = ", ".join(using_columns) join = "%s %s USING (%s)" % (self.op, sqlrepr(self.table2, db), using_columns) if self.table1: join = "%s %s" % (sqlrepr(self.table1, db), join) return join else: RuntimeError, "Impossible error" registerConverter(SQLJoinConditional, SQLExprConverter) def INNERJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "INNER JOIN", on_condition, using_columns) def LEFTJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "LEFT JOIN", on_condition, using_columns) def LEFTOUTERJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "LEFT OUTER JOIN", on_condition, using_columns) def RIGHTJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "RIGHT JOIN", on_condition, using_columns) def RIGHTOUTERJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "RIGHT OUTER JOIN", on_condition, using_columns) def FULLJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "FULL JOIN", on_condition, using_columns) def FULLOUTERJOINConditional(table1, table2, on_condition=None, using_columns=None): return SQLJoinConditional(table1, table2, "FULL OUTER JOIN", on_condition, using_columns) class SQLJoinOn(SQLJoinConditional): """Conditional JOIN ON""" def __init__(self, table1, table2, op, on_condition): SQLJoinConditional.__init__(self, table1, table2, op, on_condition) registerConverter(SQLJoinOn, SQLExprConverter) class SQLJoinUsing(SQLJoinConditional): """Conditional JOIN USING""" def __init__(self, table1, table2, op, using_columns): SQLJoinConditional.__init__(self, table1, table2, op, None, using_columns) registerConverter(SQLJoinUsing, SQLExprConverter) def INNERJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "INNER JOIN", on_condition) def LEFTJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "LEFT 
JOIN", on_condition) def LEFTOUTERJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "LEFT OUTER JOIN", on_condition) def RIGHTJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "RIGHT JOIN", on_condition) def RIGHTOUTERJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "RIGHT OUTER JOIN", on_condition) def FULLJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "FULL JOIN", on_condition) def FULLOUTERJOINOn(table1, table2, on_condition): return SQLJoinOn(table1, table2, "FULL OUTER JOIN", on_condition) def INNERJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "INNER JOIN", using_columns) def LEFTJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "LEFT JOIN", using_columns) def LEFTOUTERJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "LEFT OUTER JOIN", using_columns) def RIGHTJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "RIGHT JOIN", using_columns) def RIGHTOUTERJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "RIGHT OUTER JOIN", using_columns) def FULLJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "FULL JOIN", using_columns) def FULLOUTERJOINUsing(table1, table2, using_columns): return SQLJoinUsing(table1, table2, "FULL OUTER JOIN", using_columns) ######################################## # Subqueries (subselects) ######################################## class OuterField(SQLObjectField): def tablesUsedImmediate(self): return [] class OuterTable(SQLObjectTable): FieldClass = OuterField class Outer: def __init__(self, table): self.q = OuterTable(table) class LIKE(SQLExpression): op = "LIKE" def __init__(self, expr, string, escape=None): self.expr = expr self.string = string self.escape = escape def __sqlrepr__(self, db): escape = self.escape like = "%s %s (%s)" % (sqlrepr(self.expr, db), self.op, sqlrepr(self.string, db)) if escape is None: return "(%s)" % like else: return "(%s ESCAPE %s)" % (like, sqlrepr(escape, db)) def components(self): return [self.expr, self.string] def execute(self, executor): if not hasattr(self, '_regex'): # @@: Crude, not entirely accurate dest = self.string dest = dest.replace("%%", "\001") dest = dest.replace("*", "\002") dest = dest.replace("%", "*") dest = dest.replace("\001", "%") dest = dest.replace("\002", "[*]") self._regex = re.compile(fnmatch.translate(dest), re.I) return self._regex.search(execute(self.expr, executor)) class RLIKE(LIKE): op = "RLIKE" op_db = { 'firebird': 'RLIKE', 'maxdb': 'RLIKE', 'mysql': 'RLIKE', 'postgres': '~', 'rdbhost': '~', 'sqlite': 'REGEXP' } def _get_op(self, db): return self.op_db.get(db, 'LIKE') def __sqlrepr__(self, db): return "(%s %s (%s))" % ( sqlrepr(self.expr, db), self._get_op(db), sqlrepr(self.string, db) ) def execute(self, executor): self.op = self._get_op(self.db) return LIKE.execute(self, executor) class INSubquery(SQLExpression): op = "IN" def __init__(self, item, subquery): self.item = item self.subquery = subquery def components(self): return [self.item] def __sqlrepr__(self, db): return "%s %s (%s)" % (sqlrepr(self.item, db), self.op, sqlrepr(self.subquery, db)) class NOTINSubquery(INSubquery): op = "NOT IN" class Subquery(SQLExpression): def __init__(self, op, subquery): self.op = op self.subquery = subquery def __sqlrepr__(self, db): return "%s (%s)" % (self.op, sqlrepr(self.subquery, db)) def EXISTS(subquery): return Subquery("EXISTS", subquery) def 
NOTEXISTS(subquery): return Subquery("NOT EXISTS", subquery) def SOME(subquery): return Subquery("SOME", subquery) def ANY(subquery): return Subquery("ANY", subquery) def ALL(subquery): return Subquery("ALL", subquery) #### class ImportProxyField(SQLObjectField): def tablesUsedImmediate(self): return [str(self.tableName)] class ImportProxy(SQLExpression): '''Class to be used in column definitions that rely on other tables that might not yet be in a classregistry. ''' FieldClass = ImportProxyField def __init__(self, clsName, registry=None): self.tableName = _DelayClass(self, clsName) self.sqlmeta = _Delay_proxy(table=_DelayClass(self, clsName)) self.q = self self.soClass = None classregistry.registry(registry).addClassCallback( clsName, lambda foreign, me: setattr(me, 'soClass', foreign), self) def __nonzero__(self): return True __bool__ = __nonzero__ def __getattr__(self, attr): if self.soClass is None: return _Delay(self, attr) return getattr(self.soClass.q, attr) class _Delay(SQLExpression): def __init__(self, proxy, attr): self.attr = attr self.proxy = proxy def __sqlrepr__(self, db): if self.proxy.soClass is None: return '_DELAYED_' + self.attr val = self._resolve() if isinstance(val, SQLExpression): val = sqlrepr(val, db) return val def tablesUsedImmediate(self): return getattr(self._resolve(), 'tablesUsedImmediate', lambda: [])() def components(self): return getattr(self._resolve(), 'components', lambda: [])() def _resolve(self): return getattr(self.proxy, self.attr) # For AliasTable etc def fieldName(self): class _aliasFieldName(SQLExpression): def __init__(self, proxy): self.proxy = proxy def __sqlrepr__(self, db): return self.proxy._resolve().fieldName return _aliasFieldName(self) fieldName = property(fieldName) class _DelayClass(_Delay): def _resolve(self): return self.proxy.soClass.sqlmeta.table class _Delay_proxy(object): def __init__(self, **kw): self.__dict__.update(kw) ###### ######################################## # Global initializations ######################################## table = TableSpace() const = ConstantSpace() func = const ######################################## # Testing ######################################## if __name__ == "__main__": tests = """ >>> AND(table.address.name == "Ian Bicking", table.address.zip > 30000) >>> table.address.name >>> AND(LIKE(table.address.name, "this"), IN(table.address.zip, [100, 200, 300])) >>> Select([table.address.name, table.address.state], where=LIKE(table.address.name, "%ian%")) >>> Select([table.user.name], where=AND(table.user.state == table.states.abbrev)) >>> Insert(table.address, [{"name": "BOB", "address": "3049 N. 18th St."}, {"name": "TIM", "address": "409 S. 10th St."}]) >>> Insert(table.address, [("BOB", "3049 N. 18th St."), ("TIM", "409 S. 10th St.")], template=('name', 'address')) >>> Delete(table.address, where="BOB"==table.address.name) >>> Update(table.address, {"lastModified": const.NOW()}) >>> Replace(table.address, [("BOB", "3049 N. 18th St."), ("TIM", "409 S. 
10th St.")], template=('name', 'address')) """ # noqa: allow long (> 79) lines for expr in tests.split('\n'): if not expr.strip(): continue if expr.startswith('>>> '): expr = expr[4:] SQLObject-3.4.0/SQLObject.egg-info/0000755000175000017500000000000013141371614016135 5ustar phdphd00000000000000SQLObject-3.4.0/SQLObject.egg-info/entry_points.txt0000644000175000017500000000013113141371613021425 0ustar phdphd00000000000000 [paste.filter_app_factory] main = sqlobject.wsgi_middleware:make_middleware SQLObject-3.4.0/SQLObject.egg-info/dependency_links.txt0000644000175000017500000000000113141371613022202 0ustar phdphd00000000000000 SQLObject-3.4.0/SQLObject.egg-info/SOURCES.txt0000644000175000017500000020420513141371613020023 0ustar phdphd00000000000000.travis.yml ANNOUNCE.rst LICENSE MANIFEST.in README.rst setup.cfg setup.py tox.ini SQLObject.egg-info/PKG-INFO SQLObject.egg-info/SOURCES.txt SQLObject.egg-info/dependency_links.txt SQLObject.egg-info/entry_points.txt SQLObject.egg-info/requires.txt SQLObject.egg-info/top_level.txt debian/changelog debian/control debian/copyright debian/docs debian/examples debian/rules docs/Authors.rst docs/DeveloperGuide.rst docs/FAQ.rst docs/Inheritance.rst docs/Makefile docs/News.rst docs/News1.rst docs/News2.rst docs/News3.rst docs/News4.rst docs/News5.rst docs/Python3.rst docs/SQLBuilder.rst docs/SQLObject.rst docs/SelectResults.rst docs/TODO.rst docs/Versioning.rst docs/Views.rst docs/community.rst docs/conf.py docs/download.rst docs/genapidocs docs/index.rst docs/interface.py docs/links.rst docs/rebuild docs/sqlobject-admin.rst docs/test.py docs/api/modules.rst docs/api/sqlobject.boundattributes.rst docs/api/sqlobject.cache.rst docs/api/sqlobject.classregistry.rst docs/api/sqlobject.col.rst docs/api/sqlobject.compat.rst docs/api/sqlobject.conftest.rst docs/api/sqlobject.constraints.rst docs/api/sqlobject.converters.rst docs/api/sqlobject.dbconnection.rst docs/api/sqlobject.dberrors.rst docs/api/sqlobject.declarative.rst docs/api/sqlobject.events.rst docs/api/sqlobject.firebird.firebirdconnection.rst docs/api/sqlobject.firebird.rst docs/api/sqlobject.include.hashcol.rst docs/api/sqlobject.include.rst docs/api/sqlobject.include.tests.rst docs/api/sqlobject.include.tests.test_hashcol.rst docs/api/sqlobject.index.rst docs/api/sqlobject.inheritance.iteration.rst docs/api/sqlobject.inheritance.rst docs/api/sqlobject.inheritance.tests.rst docs/api/sqlobject.inheritance.tests.test_aggregates.rst docs/api/sqlobject.inheritance.tests.test_asdict.rst docs/api/sqlobject.inheritance.tests.test_deep_inheritance.rst docs/api/sqlobject.inheritance.tests.test_destroy_cascade.rst docs/api/sqlobject.inheritance.tests.test_foreignKey.rst docs/api/sqlobject.inheritance.tests.test_indexes.rst docs/api/sqlobject.inheritance.tests.test_inheritance.rst docs/api/sqlobject.inheritance.tests.test_inheritance_tree.rst docs/api/sqlobject.joins.rst docs/api/sqlobject.main.rst docs/api/sqlobject.manager.command.rst docs/api/sqlobject.manager.rst docs/api/sqlobject.maxdb.maxdbconnection.rst docs/api/sqlobject.maxdb.rst docs/api/sqlobject.mssql.mssqlconnection.rst docs/api/sqlobject.mssql.rst docs/api/sqlobject.mysql.mysqlconnection.rst docs/api/sqlobject.mysql.rst docs/api/sqlobject.postgres.pgconnection.rst docs/api/sqlobject.postgres.rst docs/api/sqlobject.rdbhost.rdbhostconnection.rst docs/api/sqlobject.rdbhost.rst docs/api/sqlobject.rst docs/api/sqlobject.sqlbuilder.rst docs/api/sqlobject.sqlite.rst docs/api/sqlobject.sqlite.sqliteconnection.rst docs/api/sqlobject.sresults.rst 
docs/api/sqlobject.styles.rst docs/api/sqlobject.sybase.rst docs/api/sqlobject.sybase.sybaseconnection.rst docs/api/sqlobject.tests.dbtest.rst docs/api/sqlobject.tests.rst docs/api/sqlobject.tests.test_ForeignKey.rst docs/api/sqlobject.tests.test_NoneValuedResultItem.rst docs/api/sqlobject.tests.test_SQLMultipleJoin.rst docs/api/sqlobject.tests.test_SQLRelatedJoin.rst docs/api/sqlobject.tests.test_SingleJoin.rst docs/api/sqlobject.tests.test_aggregates.rst docs/api/sqlobject.tests.test_aliases.rst docs/api/sqlobject.tests.test_asdict.rst docs/api/sqlobject.tests.test_auto.rst docs/api/sqlobject.tests.test_basic.rst docs/api/sqlobject.tests.test_blob.rst docs/api/sqlobject.tests.test_boundattributes.rst docs/api/sqlobject.tests.test_cache.rst docs/api/sqlobject.tests.test_class_hash.rst docs/api/sqlobject.tests.test_columns_order.rst docs/api/sqlobject.tests.test_combining_joins.rst docs/api/sqlobject.tests.test_comparison.rst docs/api/sqlobject.tests.test_complex_sorting.rst docs/api/sqlobject.tests.test_constraints.rst docs/api/sqlobject.tests.test_converters.rst docs/api/sqlobject.tests.test_create_drop.rst docs/api/sqlobject.tests.test_csvexport.rst docs/api/sqlobject.tests.test_cyclic_reference.rst docs/api/sqlobject.tests.test_datetime.rst docs/api/sqlobject.tests.test_decimal.rst docs/api/sqlobject.tests.test_declarative.rst docs/api/sqlobject.tests.test_default_style.rst docs/api/sqlobject.tests.test_delete.rst docs/api/sqlobject.tests.test_distinct.rst docs/api/sqlobject.tests.test_empty.rst docs/api/sqlobject.tests.test_enum.rst docs/api/sqlobject.tests.test_events.rst docs/api/sqlobject.tests.test_exceptions.rst docs/api/sqlobject.tests.test_expire.rst docs/api/sqlobject.tests.test_groupBy.rst docs/api/sqlobject.tests.test_identity.rst docs/api/sqlobject.tests.test_indexes.rst docs/api/sqlobject.tests.test_inheritance.rst docs/api/sqlobject.tests.test_joins.rst docs/api/sqlobject.tests.test_joins_conditional.rst docs/api/sqlobject.tests.test_jsonbcol.rst docs/api/sqlobject.tests.test_jsoncol.rst docs/api/sqlobject.tests.test_lazy.rst docs/api/sqlobject.tests.test_md5.rst docs/api/sqlobject.tests.test_mysql.rst docs/api/sqlobject.tests.test_new_joins.rst docs/api/sqlobject.tests.test_parse_uri.rst docs/api/sqlobject.tests.test_paste.rst docs/api/sqlobject.tests.test_perConnection.rst docs/api/sqlobject.tests.test_pickle.rst docs/api/sqlobject.tests.test_picklecol.rst docs/api/sqlobject.tests.test_postgres.rst docs/api/sqlobject.tests.test_reparent_sqlmeta.rst docs/api/sqlobject.tests.test_schema.rst docs/api/sqlobject.tests.test_select.rst docs/api/sqlobject.tests.test_select_through.rst docs/api/sqlobject.tests.test_setters.rst docs/api/sqlobject.tests.test_slice.rst docs/api/sqlobject.tests.test_sorting.rst docs/api/sqlobject.tests.test_sqlbuilder.rst docs/api/sqlobject.tests.test_sqlbuilder_dbspecific.rst docs/api/sqlobject.tests.test_sqlbuilder_importproxy.rst docs/api/sqlobject.tests.test_sqlbuilder_joins_instances.rst docs/api/sqlobject.tests.test_sqlite.rst docs/api/sqlobject.tests.test_sqlmeta_idName.rst docs/api/sqlobject.tests.test_sqlobject_admin.rst docs/api/sqlobject.tests.test_string_id.rst docs/api/sqlobject.tests.test_style.rst docs/api/sqlobject.tests.test_subqueries.rst docs/api/sqlobject.tests.test_transactions.rst docs/api/sqlobject.tests.test_unicode.rst docs/api/sqlobject.tests.test_uuidcol.rst docs/api/sqlobject.tests.test_validation.rst docs/api/sqlobject.tests.test_views.rst docs/api/sqlobject.util.csvexport.rst docs/api/sqlobject.util.csvimport.rst 
docs/api/sqlobject.util.moduleloader.rst docs/api/sqlobject.util.rst docs/api/sqlobject.util.threadinglocal.rst docs/api/sqlobject.versioning.rst docs/api/sqlobject.versioning.test.rst docs/api/sqlobject.versioning.test.test_version.rst docs/api/sqlobject.views.rst docs/api/sqlobject.wsgi_middleware.rst docs/europython/europython_sqlobj.py docs/europython/main.css docs/europython/person.py docs/html/Authors.html docs/html/DeveloperGuide.html docs/html/FAQ.html docs/html/Inheritance.html docs/html/News.html docs/html/News1.html docs/html/News2.html docs/html/News3.html docs/html/News4.html docs/html/News5.html docs/html/Python3.html docs/html/SQLBuilder.html docs/html/SQLObject.html docs/html/SelectResults.html docs/html/TODO.html docs/html/Versioning.html docs/html/Views.html docs/html/community.html docs/html/download.html docs/html/genindex.html docs/html/index.html docs/html/links.html docs/html/py-modindex.html docs/html/search.html docs/html/searchindex.js docs/html/sqlobject-admin.html docs/html/_modules/builtins.html docs/html/_modules/index.html docs/html/_modules/optparse.html docs/html/_modules/_pytest/python.html docs/html/_modules/_pytest/_code/code.html docs/html/_modules/pydispatch/dispatcher.html docs/html/_modules/sqlobject/boundattributes.html docs/html/_modules/sqlobject/cache.html docs/html/_modules/sqlobject/classregistry.html docs/html/_modules/sqlobject/col.html docs/html/_modules/sqlobject/compat.html docs/html/_modules/sqlobject/conftest.html docs/html/_modules/sqlobject/constraints.html docs/html/_modules/sqlobject/converters.html docs/html/_modules/sqlobject/dbconnection.html docs/html/_modules/sqlobject/dberrors.html docs/html/_modules/sqlobject/declarative.html docs/html/_modules/sqlobject/events.html docs/html/_modules/sqlobject/firebird.html docs/html/_modules/sqlobject/index.html docs/html/_modules/sqlobject/inheritance.html docs/html/_modules/sqlobject/joins.html docs/html/_modules/sqlobject/main.html docs/html/_modules/sqlobject/maxdb.html docs/html/_modules/sqlobject/mssql.html docs/html/_modules/sqlobject/mysql.html docs/html/_modules/sqlobject/postgres.html docs/html/_modules/sqlobject/rdbhost.html docs/html/_modules/sqlobject/sqlbuilder.html docs/html/_modules/sqlobject/sqlite.html docs/html/_modules/sqlobject/sresults.html docs/html/_modules/sqlobject/styles.html docs/html/_modules/sqlobject/sybase.html docs/html/_modules/sqlobject/versioning.html docs/html/_modules/sqlobject/views.html docs/html/_modules/sqlobject/wsgi_middleware.html docs/html/_modules/sqlobject/firebird/firebirdconnection.html docs/html/_modules/sqlobject/include/hashcol.html docs/html/_modules/sqlobject/include/tests/test_hashcol.html docs/html/_modules/sqlobject/inheritance/iteration.html docs/html/_modules/sqlobject/inheritance/tests/test_aggregates.html docs/html/_modules/sqlobject/inheritance/tests/test_asdict.html docs/html/_modules/sqlobject/inheritance/tests/test_deep_inheritance.html docs/html/_modules/sqlobject/inheritance/tests/test_destroy_cascade.html docs/html/_modules/sqlobject/inheritance/tests/test_foreignKey.html docs/html/_modules/sqlobject/inheritance/tests/test_indexes.html docs/html/_modules/sqlobject/inheritance/tests/test_inheritance.html docs/html/_modules/sqlobject/inheritance/tests/test_inheritance_tree.html docs/html/_modules/sqlobject/manager/command.html docs/html/_modules/sqlobject/maxdb/maxdbconnection.html docs/html/_modules/sqlobject/mssql/mssqlconnection.html docs/html/_modules/sqlobject/mysql/mysqlconnection.html 
docs/html/_modules/sqlobject/postgres/pgconnection.html docs/html/_modules/sqlobject/rdbhost/rdbhostconnection.html docs/html/_modules/sqlobject/sqlite/sqliteconnection.html docs/html/_modules/sqlobject/sybase/sybaseconnection.html docs/html/_modules/sqlobject/tests/dbtest.html docs/html/_modules/sqlobject/tests/test_ForeignKey.html docs/html/_modules/sqlobject/tests/test_NoneValuedResultItem.html docs/html/_modules/sqlobject/tests/test_SQLMultipleJoin.html docs/html/_modules/sqlobject/tests/test_SQLRelatedJoin.html docs/html/_modules/sqlobject/tests/test_SingleJoin.html docs/html/_modules/sqlobject/tests/test_aggregates.html docs/html/_modules/sqlobject/tests/test_aliases.html docs/html/_modules/sqlobject/tests/test_asdict.html docs/html/_modules/sqlobject/tests/test_auto.html docs/html/_modules/sqlobject/tests/test_basic.html docs/html/_modules/sqlobject/tests/test_blob.html docs/html/_modules/sqlobject/tests/test_boundattributes.html docs/html/_modules/sqlobject/tests/test_cache.html docs/html/_modules/sqlobject/tests/test_class_hash.html docs/html/_modules/sqlobject/tests/test_columns_order.html docs/html/_modules/sqlobject/tests/test_combining_joins.html docs/html/_modules/sqlobject/tests/test_comparison.html docs/html/_modules/sqlobject/tests/test_complex_sorting.html docs/html/_modules/sqlobject/tests/test_constraints.html docs/html/_modules/sqlobject/tests/test_converters.html docs/html/_modules/sqlobject/tests/test_create_drop.html docs/html/_modules/sqlobject/tests/test_csvexport.html docs/html/_modules/sqlobject/tests/test_cyclic_reference.html docs/html/_modules/sqlobject/tests/test_datetime.html docs/html/_modules/sqlobject/tests/test_decimal.html docs/html/_modules/sqlobject/tests/test_declarative.html docs/html/_modules/sqlobject/tests/test_default_style.html docs/html/_modules/sqlobject/tests/test_delete.html docs/html/_modules/sqlobject/tests/test_distinct.html docs/html/_modules/sqlobject/tests/test_empty.html docs/html/_modules/sqlobject/tests/test_enum.html docs/html/_modules/sqlobject/tests/test_events.html docs/html/_modules/sqlobject/tests/test_exceptions.html docs/html/_modules/sqlobject/tests/test_expire.html docs/html/_modules/sqlobject/tests/test_groupBy.html docs/html/_modules/sqlobject/tests/test_identity.html docs/html/_modules/sqlobject/tests/test_indexes.html docs/html/_modules/sqlobject/tests/test_inheritance.html docs/html/_modules/sqlobject/tests/test_joins.html docs/html/_modules/sqlobject/tests/test_joins_conditional.html docs/html/_modules/sqlobject/tests/test_jsonbcol.html docs/html/_modules/sqlobject/tests/test_jsoncol.html docs/html/_modules/sqlobject/tests/test_lazy.html docs/html/_modules/sqlobject/tests/test_md5.html docs/html/_modules/sqlobject/tests/test_mysql.html docs/html/_modules/sqlobject/tests/test_new_joins.html docs/html/_modules/sqlobject/tests/test_parse_uri.html docs/html/_modules/sqlobject/tests/test_paste.html docs/html/_modules/sqlobject/tests/test_perConnection.html docs/html/_modules/sqlobject/tests/test_pickle.html docs/html/_modules/sqlobject/tests/test_picklecol.html docs/html/_modules/sqlobject/tests/test_postgres.html docs/html/_modules/sqlobject/tests/test_reparent_sqlmeta.html docs/html/_modules/sqlobject/tests/test_schema.html docs/html/_modules/sqlobject/tests/test_select.html docs/html/_modules/sqlobject/tests/test_select_through.html docs/html/_modules/sqlobject/tests/test_setters.html docs/html/_modules/sqlobject/tests/test_slice.html docs/html/_modules/sqlobject/tests/test_sorting.html 
docs/html/_modules/sqlobject/tests/test_sqlbuilder.html docs/html/_modules/sqlobject/tests/test_sqlbuilder_dbspecific.html docs/html/_modules/sqlobject/tests/test_sqlbuilder_importproxy.html docs/html/_modules/sqlobject/tests/test_sqlbuilder_joins_instances.html docs/html/_modules/sqlobject/tests/test_sqlite.html docs/html/_modules/sqlobject/tests/test_sqlmeta_idName.html docs/html/_modules/sqlobject/tests/test_sqlobject_admin.html docs/html/_modules/sqlobject/tests/test_string_id.html docs/html/_modules/sqlobject/tests/test_style.html docs/html/_modules/sqlobject/tests/test_subqueries.html docs/html/_modules/sqlobject/tests/test_transactions.html docs/html/_modules/sqlobject/tests/test_unicode.html docs/html/_modules/sqlobject/tests/test_uuidcol.html docs/html/_modules/sqlobject/tests/test_validation.html docs/html/_modules/sqlobject/tests/test_views.html docs/html/_modules/sqlobject/util/csvexport.html docs/html/_modules/sqlobject/util/csvimport.html docs/html/_modules/sqlobject/util/moduleloader.html docs/html/_modules/sqlobject/versioning/test/test_version.html docs/html/_sources/Authors.rst.txt docs/html/_sources/DeveloperGuide.rst.txt docs/html/_sources/FAQ.rst.txt docs/html/_sources/Inheritance.rst.txt docs/html/_sources/News.rst.txt docs/html/_sources/News1.rst.txt docs/html/_sources/News2.rst.txt docs/html/_sources/News3.rst.txt docs/html/_sources/News4.rst.txt docs/html/_sources/News5.rst.txt docs/html/_sources/Python3.rst.txt docs/html/_sources/SQLBuilder.rst.txt docs/html/_sources/SQLObject.rst.txt docs/html/_sources/SelectResults.rst.txt docs/html/_sources/TODO.rst.txt docs/html/_sources/Versioning.rst.txt docs/html/_sources/Views.rst.txt docs/html/_sources/community.rst.txt docs/html/_sources/download.rst.txt docs/html/_sources/index.rst.txt docs/html/_sources/links.rst.txt docs/html/_sources/sqlobject-admin.rst.txt docs/html/_sources/api/modules.rst.txt docs/html/_sources/api/sqlobject.boundattributes.rst.txt docs/html/_sources/api/sqlobject.cache.rst.txt docs/html/_sources/api/sqlobject.classregistry.rst.txt docs/html/_sources/api/sqlobject.col.rst.txt docs/html/_sources/api/sqlobject.compat.rst.txt docs/html/_sources/api/sqlobject.conftest.rst.txt docs/html/_sources/api/sqlobject.constraints.rst.txt docs/html/_sources/api/sqlobject.converters.rst.txt docs/html/_sources/api/sqlobject.dbconnection.rst.txt docs/html/_sources/api/sqlobject.dberrors.rst.txt docs/html/_sources/api/sqlobject.declarative.rst.txt docs/html/_sources/api/sqlobject.events.rst.txt docs/html/_sources/api/sqlobject.firebird.firebirdconnection.rst.txt docs/html/_sources/api/sqlobject.firebird.rst.txt docs/html/_sources/api/sqlobject.include.hashcol.rst.txt docs/html/_sources/api/sqlobject.include.rst.txt docs/html/_sources/api/sqlobject.include.tests.rst.txt docs/html/_sources/api/sqlobject.include.tests.test_hashcol.rst.txt docs/html/_sources/api/sqlobject.index.rst.txt docs/html/_sources/api/sqlobject.inheritance.iteration.rst.txt docs/html/_sources/api/sqlobject.inheritance.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_aggregates.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_asdict.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_deep_inheritance.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_destroy_cascade.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_foreignKey.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_indexes.rst.txt 
docs/html/_sources/api/sqlobject.inheritance.tests.test_inheritance.rst.txt docs/html/_sources/api/sqlobject.inheritance.tests.test_inheritance_tree.rst.txt docs/html/_sources/api/sqlobject.joins.rst.txt docs/html/_sources/api/sqlobject.main.rst.txt docs/html/_sources/api/sqlobject.manager.command.rst.txt docs/html/_sources/api/sqlobject.manager.rst.txt docs/html/_sources/api/sqlobject.maxdb.maxdbconnection.rst.txt docs/html/_sources/api/sqlobject.maxdb.rst.txt docs/html/_sources/api/sqlobject.mssql.mssqlconnection.rst.txt docs/html/_sources/api/sqlobject.mssql.rst.txt docs/html/_sources/api/sqlobject.mysql.mysqlconnection.rst.txt docs/html/_sources/api/sqlobject.mysql.rst.txt docs/html/_sources/api/sqlobject.postgres.pgconnection.rst.txt docs/html/_sources/api/sqlobject.postgres.rst.txt docs/html/_sources/api/sqlobject.rdbhost.rdbhostconnection.rst.txt docs/html/_sources/api/sqlobject.rdbhost.rst.txt docs/html/_sources/api/sqlobject.rst.txt docs/html/_sources/api/sqlobject.sqlbuilder.rst.txt docs/html/_sources/api/sqlobject.sqlite.rst.txt docs/html/_sources/api/sqlobject.sqlite.sqliteconnection.rst.txt docs/html/_sources/api/sqlobject.sresults.rst.txt docs/html/_sources/api/sqlobject.styles.rst.txt docs/html/_sources/api/sqlobject.sybase.rst.txt docs/html/_sources/api/sqlobject.sybase.sybaseconnection.rst.txt docs/html/_sources/api/sqlobject.tests.dbtest.rst.txt docs/html/_sources/api/sqlobject.tests.rst.txt docs/html/_sources/api/sqlobject.tests.test_ForeignKey.rst.txt docs/html/_sources/api/sqlobject.tests.test_NoneValuedResultItem.rst.txt docs/html/_sources/api/sqlobject.tests.test_SQLMultipleJoin.rst.txt docs/html/_sources/api/sqlobject.tests.test_SQLRelatedJoin.rst.txt docs/html/_sources/api/sqlobject.tests.test_SingleJoin.rst.txt docs/html/_sources/api/sqlobject.tests.test_aggregates.rst.txt docs/html/_sources/api/sqlobject.tests.test_aliases.rst.txt docs/html/_sources/api/sqlobject.tests.test_asdict.rst.txt docs/html/_sources/api/sqlobject.tests.test_auto.rst.txt docs/html/_sources/api/sqlobject.tests.test_basic.rst.txt docs/html/_sources/api/sqlobject.tests.test_blob.rst.txt docs/html/_sources/api/sqlobject.tests.test_boundattributes.rst.txt docs/html/_sources/api/sqlobject.tests.test_cache.rst.txt docs/html/_sources/api/sqlobject.tests.test_class_hash.rst.txt docs/html/_sources/api/sqlobject.tests.test_columns_order.rst.txt docs/html/_sources/api/sqlobject.tests.test_combining_joins.rst.txt docs/html/_sources/api/sqlobject.tests.test_comparison.rst.txt docs/html/_sources/api/sqlobject.tests.test_complex_sorting.rst.txt docs/html/_sources/api/sqlobject.tests.test_constraints.rst.txt docs/html/_sources/api/sqlobject.tests.test_converters.rst.txt docs/html/_sources/api/sqlobject.tests.test_create_drop.rst.txt docs/html/_sources/api/sqlobject.tests.test_csvexport.rst.txt docs/html/_sources/api/sqlobject.tests.test_cyclic_reference.rst.txt docs/html/_sources/api/sqlobject.tests.test_datetime.rst.txt docs/html/_sources/api/sqlobject.tests.test_decimal.rst.txt docs/html/_sources/api/sqlobject.tests.test_declarative.rst.txt docs/html/_sources/api/sqlobject.tests.test_default_style.rst.txt docs/html/_sources/api/sqlobject.tests.test_delete.rst.txt docs/html/_sources/api/sqlobject.tests.test_distinct.rst.txt docs/html/_sources/api/sqlobject.tests.test_empty.rst.txt docs/html/_sources/api/sqlobject.tests.test_enum.rst.txt docs/html/_sources/api/sqlobject.tests.test_events.rst.txt docs/html/_sources/api/sqlobject.tests.test_exceptions.rst.txt 
docs/html/_sources/api/sqlobject.tests.test_expire.rst.txt docs/html/_sources/api/sqlobject.tests.test_groupBy.rst.txt docs/html/_sources/api/sqlobject.tests.test_identity.rst.txt docs/html/_sources/api/sqlobject.tests.test_indexes.rst.txt docs/html/_sources/api/sqlobject.tests.test_inheritance.rst.txt docs/html/_sources/api/sqlobject.tests.test_joins.rst.txt docs/html/_sources/api/sqlobject.tests.test_joins_conditional.rst.txt docs/html/_sources/api/sqlobject.tests.test_jsonbcol.rst.txt docs/html/_sources/api/sqlobject.tests.test_jsoncol.rst.txt docs/html/_sources/api/sqlobject.tests.test_lazy.rst.txt docs/html/_sources/api/sqlobject.tests.test_md5.rst.txt docs/html/_sources/api/sqlobject.tests.test_mysql.rst.txt docs/html/_sources/api/sqlobject.tests.test_new_joins.rst.txt docs/html/_sources/api/sqlobject.tests.test_parse_uri.rst.txt docs/html/_sources/api/sqlobject.tests.test_paste.rst.txt docs/html/_sources/api/sqlobject.tests.test_perConnection.rst.txt docs/html/_sources/api/sqlobject.tests.test_pickle.rst.txt docs/html/_sources/api/sqlobject.tests.test_picklecol.rst.txt docs/html/_sources/api/sqlobject.tests.test_postgres.rst.txt docs/html/_sources/api/sqlobject.tests.test_reparent_sqlmeta.rst.txt docs/html/_sources/api/sqlobject.tests.test_schema.rst.txt docs/html/_sources/api/sqlobject.tests.test_select.rst.txt docs/html/_sources/api/sqlobject.tests.test_select_through.rst.txt docs/html/_sources/api/sqlobject.tests.test_setters.rst.txt docs/html/_sources/api/sqlobject.tests.test_slice.rst.txt docs/html/_sources/api/sqlobject.tests.test_sorting.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlbuilder.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlbuilder_dbspecific.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlbuilder_importproxy.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlbuilder_joins_instances.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlite.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlmeta_idName.rst.txt docs/html/_sources/api/sqlobject.tests.test_sqlobject_admin.rst.txt docs/html/_sources/api/sqlobject.tests.test_string_id.rst.txt docs/html/_sources/api/sqlobject.tests.test_style.rst.txt docs/html/_sources/api/sqlobject.tests.test_subqueries.rst.txt docs/html/_sources/api/sqlobject.tests.test_transactions.rst.txt docs/html/_sources/api/sqlobject.tests.test_unicode.rst.txt docs/html/_sources/api/sqlobject.tests.test_uuidcol.rst.txt docs/html/_sources/api/sqlobject.tests.test_validation.rst.txt docs/html/_sources/api/sqlobject.tests.test_views.rst.txt docs/html/_sources/api/sqlobject.util.csvexport.rst.txt docs/html/_sources/api/sqlobject.util.csvimport.rst.txt docs/html/_sources/api/sqlobject.util.moduleloader.rst.txt docs/html/_sources/api/sqlobject.util.rst.txt docs/html/_sources/api/sqlobject.util.threadinglocal.rst.txt docs/html/_sources/api/sqlobject.versioning.rst.txt docs/html/_sources/api/sqlobject.versioning.test.rst.txt docs/html/_sources/api/sqlobject.versioning.test.test_version.rst.txt docs/html/_sources/api/sqlobject.views.rst.txt docs/html/_sources/api/sqlobject.wsgi_middleware.rst.txt docs/html/_static/ajax-loader.gif docs/html/_static/background_b01.png docs/html/_static/basic.css docs/html/_static/bizstyle.css docs/html/_static/bizstyle.js docs/html/_static/classic.css docs/html/_static/comment-bright.png docs/html/_static/comment-close.png docs/html/_static/comment.png docs/html/_static/css3-mediaqueries.js docs/html/_static/css3-mediaqueries_src.js docs/html/_static/default.css 
docs/html/_static/doctools.js docs/html/_static/down-pressed.png docs/html/_static/down.png docs/html/_static/file.png docs/html/_static/jquery-1.11.1.js docs/html/_static/jquery-3.1.0.js docs/html/_static/jquery.js docs/html/_static/minus.png docs/html/_static/plus.png docs/html/_static/pygments.css docs/html/_static/searchtools.js docs/html/_static/sidebar.js docs/html/_static/traditional.css docs/html/_static/underscore-1.3.1.js docs/html/_static/underscore.js docs/html/_static/up-pressed.png docs/html/_static/up.png docs/html/_static/websupport.js docs/html/api/modules.html docs/html/api/sqlobject.boundattributes.html docs/html/api/sqlobject.cache.html docs/html/api/sqlobject.classregistry.html docs/html/api/sqlobject.col.html docs/html/api/sqlobject.compat.html docs/html/api/sqlobject.conftest.html docs/html/api/sqlobject.constraints.html docs/html/api/sqlobject.converters.html docs/html/api/sqlobject.dbconnection.html docs/html/api/sqlobject.dberrors.html docs/html/api/sqlobject.declarative.html docs/html/api/sqlobject.events.html docs/html/api/sqlobject.firebird.firebirdconnection.html docs/html/api/sqlobject.firebird.html docs/html/api/sqlobject.html docs/html/api/sqlobject.include.hashcol.html docs/html/api/sqlobject.include.html docs/html/api/sqlobject.include.tests.html docs/html/api/sqlobject.include.tests.test_hashcol.html docs/html/api/sqlobject.index.html docs/html/api/sqlobject.inheritance.html docs/html/api/sqlobject.inheritance.iteration.html docs/html/api/sqlobject.inheritance.tests.html docs/html/api/sqlobject.inheritance.tests.test_aggregates.html docs/html/api/sqlobject.inheritance.tests.test_asdict.html docs/html/api/sqlobject.inheritance.tests.test_deep_inheritance.html docs/html/api/sqlobject.inheritance.tests.test_destroy_cascade.html docs/html/api/sqlobject.inheritance.tests.test_foreignKey.html docs/html/api/sqlobject.inheritance.tests.test_indexes.html docs/html/api/sqlobject.inheritance.tests.test_inheritance.html docs/html/api/sqlobject.inheritance.tests.test_inheritance_tree.html docs/html/api/sqlobject.joins.html docs/html/api/sqlobject.main.html docs/html/api/sqlobject.manager.command.html docs/html/api/sqlobject.manager.html docs/html/api/sqlobject.maxdb.html docs/html/api/sqlobject.maxdb.maxdbconnection.html docs/html/api/sqlobject.mssql.html docs/html/api/sqlobject.mssql.mssqlconnection.html docs/html/api/sqlobject.mysql.html docs/html/api/sqlobject.mysql.mysqlconnection.html docs/html/api/sqlobject.postgres.html docs/html/api/sqlobject.postgres.pgconnection.html docs/html/api/sqlobject.rdbhost.html docs/html/api/sqlobject.rdbhost.rdbhostconnection.html docs/html/api/sqlobject.sqlbuilder.html docs/html/api/sqlobject.sqlite.html docs/html/api/sqlobject.sqlite.sqliteconnection.html docs/html/api/sqlobject.sresults.html docs/html/api/sqlobject.styles.html docs/html/api/sqlobject.sybase.html docs/html/api/sqlobject.sybase.sybaseconnection.html docs/html/api/sqlobject.tests.dbtest.html docs/html/api/sqlobject.tests.html docs/html/api/sqlobject.tests.test_ForeignKey.html docs/html/api/sqlobject.tests.test_NoneValuedResultItem.html docs/html/api/sqlobject.tests.test_SQLMultipleJoin.html docs/html/api/sqlobject.tests.test_SQLRelatedJoin.html docs/html/api/sqlobject.tests.test_SingleJoin.html docs/html/api/sqlobject.tests.test_aggregates.html docs/html/api/sqlobject.tests.test_aliases.html docs/html/api/sqlobject.tests.test_asdict.html docs/html/api/sqlobject.tests.test_auto.html docs/html/api/sqlobject.tests.test_basic.html 
docs/html/api/sqlobject.tests.test_blob.html docs/html/api/sqlobject.tests.test_boundattributes.html docs/html/api/sqlobject.tests.test_cache.html docs/html/api/sqlobject.tests.test_class_hash.html docs/html/api/sqlobject.tests.test_columns_order.html docs/html/api/sqlobject.tests.test_combining_joins.html docs/html/api/sqlobject.tests.test_comparison.html docs/html/api/sqlobject.tests.test_complex_sorting.html docs/html/api/sqlobject.tests.test_constraints.html docs/html/api/sqlobject.tests.test_converters.html docs/html/api/sqlobject.tests.test_create_drop.html docs/html/api/sqlobject.tests.test_csvexport.html docs/html/api/sqlobject.tests.test_cyclic_reference.html docs/html/api/sqlobject.tests.test_datetime.html docs/html/api/sqlobject.tests.test_decimal.html docs/html/api/sqlobject.tests.test_declarative.html docs/html/api/sqlobject.tests.test_default_style.html docs/html/api/sqlobject.tests.test_delete.html docs/html/api/sqlobject.tests.test_distinct.html docs/html/api/sqlobject.tests.test_empty.html docs/html/api/sqlobject.tests.test_enum.html docs/html/api/sqlobject.tests.test_events.html docs/html/api/sqlobject.tests.test_exceptions.html docs/html/api/sqlobject.tests.test_expire.html docs/html/api/sqlobject.tests.test_groupBy.html docs/html/api/sqlobject.tests.test_identity.html docs/html/api/sqlobject.tests.test_indexes.html docs/html/api/sqlobject.tests.test_inheritance.html docs/html/api/sqlobject.tests.test_joins.html docs/html/api/sqlobject.tests.test_joins_conditional.html docs/html/api/sqlobject.tests.test_jsonbcol.html docs/html/api/sqlobject.tests.test_jsoncol.html docs/html/api/sqlobject.tests.test_lazy.html docs/html/api/sqlobject.tests.test_md5.html docs/html/api/sqlobject.tests.test_mysql.html docs/html/api/sqlobject.tests.test_new_joins.html docs/html/api/sqlobject.tests.test_parse_uri.html docs/html/api/sqlobject.tests.test_paste.html docs/html/api/sqlobject.tests.test_perConnection.html docs/html/api/sqlobject.tests.test_pickle.html docs/html/api/sqlobject.tests.test_picklecol.html docs/html/api/sqlobject.tests.test_postgres.html docs/html/api/sqlobject.tests.test_reparent_sqlmeta.html docs/html/api/sqlobject.tests.test_schema.html docs/html/api/sqlobject.tests.test_select.html docs/html/api/sqlobject.tests.test_select_through.html docs/html/api/sqlobject.tests.test_setters.html docs/html/api/sqlobject.tests.test_slice.html docs/html/api/sqlobject.tests.test_sorting.html docs/html/api/sqlobject.tests.test_sqlbuilder.html docs/html/api/sqlobject.tests.test_sqlbuilder_dbspecific.html docs/html/api/sqlobject.tests.test_sqlbuilder_importproxy.html docs/html/api/sqlobject.tests.test_sqlbuilder_joins_instances.html docs/html/api/sqlobject.tests.test_sqlite.html docs/html/api/sqlobject.tests.test_sqlmeta_idName.html docs/html/api/sqlobject.tests.test_sqlobject_admin.html docs/html/api/sqlobject.tests.test_string_id.html docs/html/api/sqlobject.tests.test_style.html docs/html/api/sqlobject.tests.test_subqueries.html docs/html/api/sqlobject.tests.test_transactions.html docs/html/api/sqlobject.tests.test_unicode.html docs/html/api/sqlobject.tests.test_uuidcol.html docs/html/api/sqlobject.tests.test_validation.html docs/html/api/sqlobject.tests.test_views.html docs/html/api/sqlobject.util.csvexport.html docs/html/api/sqlobject.util.csvimport.html docs/html/api/sqlobject.util.html docs/html/api/sqlobject.util.moduleloader.html docs/html/api/sqlobject.util.threadinglocal.html docs/html/api/sqlobject.versioning.html docs/html/api/sqlobject.versioning.test.html 
docs/html/api/sqlobject.versioning.test.test_version.html docs/html/api/sqlobject.views.html docs/html/api/sqlobject.wsgi_middleware.html docs/presentation-2004-11/sqlobject-and-database-programming.html docs/presentation-2004-11/ui/bodybg.gif docs/presentation-2004-11/ui/custom.css docs/presentation-2004-11/ui/framing.css docs/presentation-2004-11/ui/opera.css docs/presentation-2004-11/ui/pretty.css docs/presentation-2004-11/ui/print.css docs/presentation-2004-11/ui/s5-core.css docs/presentation-2004-11/ui/slides.css docs/presentation-2004-11/ui/slides.js scripts/sqlobject-admin scripts/sqlobject-convertOldURI sqlobject/.coveragerc sqlobject/__init__.py sqlobject/__version__.py sqlobject/boundattributes.py sqlobject/cache.py sqlobject/classregistry.py sqlobject/col.py sqlobject/compat.py sqlobject/conftest.py sqlobject/constraints.py sqlobject/converters.py sqlobject/dbconnection.py sqlobject/dberrors.py sqlobject/declarative.py sqlobject/events.py sqlobject/index.py sqlobject/joins.py sqlobject/main.py sqlobject/sqlbuilder.py sqlobject/sresults.py sqlobject/styles.py sqlobject/views.py sqlobject/wsgi_middleware.py sqlobject/../LICENSE sqlobject/../docs/Authors.rst sqlobject/../docs/DeveloperGuide.rst sqlobject/../docs/FAQ.rst sqlobject/../docs/Inheritance.rst sqlobject/../docs/News.rst sqlobject/../docs/News1.rst sqlobject/../docs/News2.rst sqlobject/../docs/News3.rst sqlobject/../docs/News4.rst sqlobject/../docs/News5.rst sqlobject/../docs/Python3.rst sqlobject/../docs/SQLBuilder.rst sqlobject/../docs/SQLObject.rst sqlobject/../docs/SelectResults.rst sqlobject/../docs/TODO.rst sqlobject/../docs/Versioning.rst sqlobject/../docs/Views.rst sqlobject/../docs/community.rst sqlobject/../docs/download.rst sqlobject/../docs/index.rst sqlobject/../docs/links.rst sqlobject/../docs/sqlobject-admin.rst sqlobject/../docs/html/Authors.html sqlobject/../docs/html/DeveloperGuide.html sqlobject/../docs/html/FAQ.html sqlobject/../docs/html/Inheritance.html sqlobject/../docs/html/News.html sqlobject/../docs/html/News1.html sqlobject/../docs/html/News2.html sqlobject/../docs/html/News3.html sqlobject/../docs/html/News4.html sqlobject/../docs/html/News5.html sqlobject/../docs/html/Python3.html sqlobject/../docs/html/SQLBuilder.html sqlobject/../docs/html/SQLObject.html sqlobject/../docs/html/SelectResults.html sqlobject/../docs/html/TODO.html sqlobject/../docs/html/Versioning.html sqlobject/../docs/html/Views.html sqlobject/../docs/html/community.html sqlobject/../docs/html/download.html sqlobject/../docs/html/genindex.html sqlobject/../docs/html/index.html sqlobject/../docs/html/links.html sqlobject/../docs/html/py-modindex.html sqlobject/../docs/html/search.html sqlobject/../docs/html/searchindex.js sqlobject/../docs/html/sqlobject-admin.html sqlobject/../docs/html/_modules/builtins.html sqlobject/../docs/html/_modules/index.html sqlobject/../docs/html/_modules/optparse.html sqlobject/../docs/html/_modules/_pytest/python.html sqlobject/../docs/html/_modules/pydispatch/dispatcher.html sqlobject/../docs/html/_modules/sqlobject/boundattributes.html sqlobject/../docs/html/_modules/sqlobject/cache.html sqlobject/../docs/html/_modules/sqlobject/classregistry.html sqlobject/../docs/html/_modules/sqlobject/col.html sqlobject/../docs/html/_modules/sqlobject/compat.html sqlobject/../docs/html/_modules/sqlobject/conftest.html sqlobject/../docs/html/_modules/sqlobject/constraints.html sqlobject/../docs/html/_modules/sqlobject/converters.html sqlobject/../docs/html/_modules/sqlobject/dbconnection.html 
sqlobject/../docs/html/_modules/sqlobject/dberrors.html sqlobject/../docs/html/_modules/sqlobject/declarative.html sqlobject/../docs/html/_modules/sqlobject/events.html sqlobject/../docs/html/_modules/sqlobject/firebird.html sqlobject/../docs/html/_modules/sqlobject/index.html sqlobject/../docs/html/_modules/sqlobject/inheritance.html sqlobject/../docs/html/_modules/sqlobject/joins.html sqlobject/../docs/html/_modules/sqlobject/main.html sqlobject/../docs/html/_modules/sqlobject/maxdb.html sqlobject/../docs/html/_modules/sqlobject/mssql.html sqlobject/../docs/html/_modules/sqlobject/mysql.html sqlobject/../docs/html/_modules/sqlobject/postgres.html sqlobject/../docs/html/_modules/sqlobject/rdbhost.html sqlobject/../docs/html/_modules/sqlobject/sqlbuilder.html sqlobject/../docs/html/_modules/sqlobject/sqlite.html sqlobject/../docs/html/_modules/sqlobject/sresults.html sqlobject/../docs/html/_modules/sqlobject/styles.html sqlobject/../docs/html/_modules/sqlobject/sybase.html sqlobject/../docs/html/_modules/sqlobject/versioning.html sqlobject/../docs/html/_modules/sqlobject/views.html sqlobject/../docs/html/_modules/sqlobject/wsgi_middleware.html sqlobject/../docs/html/_modules/sqlobject/firebird/firebirdconnection.html sqlobject/../docs/html/_modules/sqlobject/include/hashcol.html sqlobject/../docs/html/_modules/sqlobject/include/tests/test_hashcol.html sqlobject/../docs/html/_modules/sqlobject/inheritance/iteration.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_aggregates.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_asdict.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_deep_inheritance.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_destroy_cascade.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_foreignKey.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_indexes.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_inheritance.html sqlobject/../docs/html/_modules/sqlobject/inheritance/tests/test_inheritance_tree.html sqlobject/../docs/html/_modules/sqlobject/manager/command.html sqlobject/../docs/html/_modules/sqlobject/maxdb/maxdbconnection.html sqlobject/../docs/html/_modules/sqlobject/mssql/mssqlconnection.html sqlobject/../docs/html/_modules/sqlobject/mysql/mysqlconnection.html sqlobject/../docs/html/_modules/sqlobject/postgres/pgconnection.html sqlobject/../docs/html/_modules/sqlobject/rdbhost/rdbhostconnection.html sqlobject/../docs/html/_modules/sqlobject/sqlite/sqliteconnection.html sqlobject/../docs/html/_modules/sqlobject/sybase/sybaseconnection.html sqlobject/../docs/html/_modules/sqlobject/tests/dbtest.html sqlobject/../docs/html/_modules/sqlobject/tests/test_ForeignKey.html sqlobject/../docs/html/_modules/sqlobject/tests/test_NoneValuedResultItem.html sqlobject/../docs/html/_modules/sqlobject/tests/test_SQLMultipleJoin.html sqlobject/../docs/html/_modules/sqlobject/tests/test_SQLRelatedJoin.html sqlobject/../docs/html/_modules/sqlobject/tests/test_SingleJoin.html sqlobject/../docs/html/_modules/sqlobject/tests/test_aggregates.html sqlobject/../docs/html/_modules/sqlobject/tests/test_aliases.html sqlobject/../docs/html/_modules/sqlobject/tests/test_asdict.html sqlobject/../docs/html/_modules/sqlobject/tests/test_auto.html sqlobject/../docs/html/_modules/sqlobject/tests/test_basic.html sqlobject/../docs/html/_modules/sqlobject/tests/test_blob.html sqlobject/../docs/html/_modules/sqlobject/tests/test_boundattributes.html 
sqlobject/../docs/html/_modules/sqlobject/tests/test_cache.html sqlobject/../docs/html/_modules/sqlobject/tests/test_class_hash.html sqlobject/../docs/html/_modules/sqlobject/tests/test_columns_order.html sqlobject/../docs/html/_modules/sqlobject/tests/test_combining_joins.html sqlobject/../docs/html/_modules/sqlobject/tests/test_comparison.html sqlobject/../docs/html/_modules/sqlobject/tests/test_complex_sorting.html sqlobject/../docs/html/_modules/sqlobject/tests/test_constraints.html sqlobject/../docs/html/_modules/sqlobject/tests/test_converters.html sqlobject/../docs/html/_modules/sqlobject/tests/test_create_drop.html sqlobject/../docs/html/_modules/sqlobject/tests/test_csvexport.html sqlobject/../docs/html/_modules/sqlobject/tests/test_cyclic_reference.html sqlobject/../docs/html/_modules/sqlobject/tests/test_datetime.html sqlobject/../docs/html/_modules/sqlobject/tests/test_decimal.html sqlobject/../docs/html/_modules/sqlobject/tests/test_declarative.html sqlobject/../docs/html/_modules/sqlobject/tests/test_default_style.html sqlobject/../docs/html/_modules/sqlobject/tests/test_delete.html sqlobject/../docs/html/_modules/sqlobject/tests/test_distinct.html sqlobject/../docs/html/_modules/sqlobject/tests/test_empty.html sqlobject/../docs/html/_modules/sqlobject/tests/test_enum.html sqlobject/../docs/html/_modules/sqlobject/tests/test_events.html sqlobject/../docs/html/_modules/sqlobject/tests/test_exceptions.html sqlobject/../docs/html/_modules/sqlobject/tests/test_expire.html sqlobject/../docs/html/_modules/sqlobject/tests/test_groupBy.html sqlobject/../docs/html/_modules/sqlobject/tests/test_identity.html sqlobject/../docs/html/_modules/sqlobject/tests/test_indexes.html sqlobject/../docs/html/_modules/sqlobject/tests/test_inheritance.html sqlobject/../docs/html/_modules/sqlobject/tests/test_joins.html sqlobject/../docs/html/_modules/sqlobject/tests/test_joins_conditional.html sqlobject/../docs/html/_modules/sqlobject/tests/test_jsonbcol.html sqlobject/../docs/html/_modules/sqlobject/tests/test_jsoncol.html sqlobject/../docs/html/_modules/sqlobject/tests/test_lazy.html sqlobject/../docs/html/_modules/sqlobject/tests/test_md5.html sqlobject/../docs/html/_modules/sqlobject/tests/test_mysql.html sqlobject/../docs/html/_modules/sqlobject/tests/test_new_joins.html sqlobject/../docs/html/_modules/sqlobject/tests/test_parse_uri.html sqlobject/../docs/html/_modules/sqlobject/tests/test_paste.html sqlobject/../docs/html/_modules/sqlobject/tests/test_perConnection.html sqlobject/../docs/html/_modules/sqlobject/tests/test_pickle.html sqlobject/../docs/html/_modules/sqlobject/tests/test_picklecol.html sqlobject/../docs/html/_modules/sqlobject/tests/test_postgres.html sqlobject/../docs/html/_modules/sqlobject/tests/test_reparent_sqlmeta.html sqlobject/../docs/html/_modules/sqlobject/tests/test_schema.html sqlobject/../docs/html/_modules/sqlobject/tests/test_select.html sqlobject/../docs/html/_modules/sqlobject/tests/test_select_through.html sqlobject/../docs/html/_modules/sqlobject/tests/test_setters.html sqlobject/../docs/html/_modules/sqlobject/tests/test_slice.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sorting.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlbuilder.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlbuilder_dbspecific.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlbuilder_importproxy.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlbuilder_joins_instances.html 
sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlite.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlmeta_idName.html sqlobject/../docs/html/_modules/sqlobject/tests/test_sqlobject_admin.html sqlobject/../docs/html/_modules/sqlobject/tests/test_string_id.html sqlobject/../docs/html/_modules/sqlobject/tests/test_style.html sqlobject/../docs/html/_modules/sqlobject/tests/test_subqueries.html sqlobject/../docs/html/_modules/sqlobject/tests/test_transactions.html sqlobject/../docs/html/_modules/sqlobject/tests/test_unicode.html sqlobject/../docs/html/_modules/sqlobject/tests/test_uuidcol.html sqlobject/../docs/html/_modules/sqlobject/tests/test_validation.html sqlobject/../docs/html/_modules/sqlobject/tests/test_views.html sqlobject/../docs/html/_modules/sqlobject/util/csvexport.html sqlobject/../docs/html/_modules/sqlobject/util/csvimport.html sqlobject/../docs/html/_modules/sqlobject/util/moduleloader.html sqlobject/../docs/html/_modules/sqlobject/versioning/test/test_version.html sqlobject/../docs/html/_sources/Authors.rst.txt sqlobject/../docs/html/_sources/DeveloperGuide.rst.txt sqlobject/../docs/html/_sources/FAQ.rst.txt sqlobject/../docs/html/_sources/Inheritance.rst.txt sqlobject/../docs/html/_sources/News.rst.txt sqlobject/../docs/html/_sources/News1.rst.txt sqlobject/../docs/html/_sources/News2.rst.txt sqlobject/../docs/html/_sources/News3.rst.txt sqlobject/../docs/html/_sources/News4.rst.txt sqlobject/../docs/html/_sources/News5.rst.txt sqlobject/../docs/html/_sources/Python3.rst.txt sqlobject/../docs/html/_sources/SQLBuilder.rst.txt sqlobject/../docs/html/_sources/SQLObject.rst.txt sqlobject/../docs/html/_sources/SelectResults.rst.txt sqlobject/../docs/html/_sources/TODO.rst.txt sqlobject/../docs/html/_sources/Versioning.rst.txt sqlobject/../docs/html/_sources/Views.rst.txt sqlobject/../docs/html/_sources/community.rst.txt sqlobject/../docs/html/_sources/download.rst.txt sqlobject/../docs/html/_sources/index.rst.txt sqlobject/../docs/html/_sources/links.rst.txt sqlobject/../docs/html/_sources/sqlobject-admin.rst.txt sqlobject/../docs/html/_sources/api/modules.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.boundattributes.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.cache.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.classregistry.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.col.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.compat.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.conftest.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.constraints.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.converters.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.dbconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.dberrors.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.declarative.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.events.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.firebird.firebirdconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.firebird.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.include.hashcol.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.include.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.include.tests.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.include.tests.test_hashcol.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.index.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.iteration.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.rst.txt 
sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_aggregates.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_asdict.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_deep_inheritance.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_destroy_cascade.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_foreignKey.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_indexes.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_inheritance.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.inheritance.tests.test_inheritance_tree.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.joins.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.main.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.manager.command.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.manager.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.maxdb.maxdbconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.maxdb.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.mssql.mssqlconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.mssql.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.mysql.mysqlconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.mysql.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.postgres.pgconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.postgres.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.rdbhost.rdbhostconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.rdbhost.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.sqlbuilder.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.sqlite.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.sqlite.sqliteconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.sresults.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.styles.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.sybase.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.sybase.sybaseconnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.dbtest.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_ForeignKey.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_NoneValuedResultItem.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_SQLMultipleJoin.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_SQLRelatedJoin.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_SingleJoin.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_aggregates.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_aliases.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_asdict.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_auto.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_basic.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_blob.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_boundattributes.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_cache.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_class_hash.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_columns_order.rst.txt 
sqlobject/../docs/html/_sources/api/sqlobject.tests.test_combining_joins.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_comparison.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_complex_sorting.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_constraints.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_converters.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_create_drop.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_csvexport.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_cyclic_reference.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_datetime.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_decimal.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_declarative.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_default_style.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_delete.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_distinct.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_empty.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_enum.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_events.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_exceptions.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_expire.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_groupBy.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_identity.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_indexes.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_inheritance.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_joins.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_joins_conditional.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_jsonbcol.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_jsoncol.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_lazy.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_md5.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_mysql.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_new_joins.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_parse_uri.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_paste.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_perConnection.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_pickle.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_picklecol.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_postgres.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_reparent_sqlmeta.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_schema.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_select.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_select_through.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_setters.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_slice.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sorting.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlbuilder.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlbuilder_dbspecific.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlbuilder_importproxy.rst.txt 
sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlbuilder_joins_instances.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlite.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlmeta_idName.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_sqlobject_admin.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_string_id.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_style.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_subqueries.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_transactions.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_unicode.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_uuidcol.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_validation.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.tests.test_views.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.util.csvexport.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.util.csvimport.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.util.moduleloader.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.util.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.util.threadinglocal.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.versioning.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.versioning.test.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.versioning.test.test_version.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.views.rst.txt sqlobject/../docs/html/_sources/api/sqlobject.wsgi_middleware.rst.txt sqlobject/../docs/html/_static/ajax-loader.gif sqlobject/../docs/html/_static/background_b01.png sqlobject/../docs/html/_static/basic.css sqlobject/../docs/html/_static/bizstyle.css sqlobject/../docs/html/_static/bizstyle.js sqlobject/../docs/html/_static/classic.css sqlobject/../docs/html/_static/comment-bright.png sqlobject/../docs/html/_static/comment-close.png sqlobject/../docs/html/_static/comment.png sqlobject/../docs/html/_static/css3-mediaqueries.js sqlobject/../docs/html/_static/css3-mediaqueries_src.js sqlobject/../docs/html/_static/default.css sqlobject/../docs/html/_static/doctools.js sqlobject/../docs/html/_static/down-pressed.png sqlobject/../docs/html/_static/down.png sqlobject/../docs/html/_static/file.png sqlobject/../docs/html/_static/jquery-1.11.1.js sqlobject/../docs/html/_static/jquery-3.1.0.js sqlobject/../docs/html/_static/jquery.js sqlobject/../docs/html/_static/minus.png sqlobject/../docs/html/_static/plus.png sqlobject/../docs/html/_static/pygments.css sqlobject/../docs/html/_static/searchtools.js sqlobject/../docs/html/_static/sidebar.js sqlobject/../docs/html/_static/traditional.css sqlobject/../docs/html/_static/underscore-1.3.1.js sqlobject/../docs/html/_static/underscore.js sqlobject/../docs/html/_static/up-pressed.png sqlobject/../docs/html/_static/up.png sqlobject/../docs/html/_static/websupport.js sqlobject/../docs/html/api/modules.html sqlobject/../docs/html/api/sqlobject.boundattributes.html sqlobject/../docs/html/api/sqlobject.cache.html sqlobject/../docs/html/api/sqlobject.classregistry.html sqlobject/../docs/html/api/sqlobject.col.html sqlobject/../docs/html/api/sqlobject.compat.html sqlobject/../docs/html/api/sqlobject.conftest.html sqlobject/../docs/html/api/sqlobject.constraints.html sqlobject/../docs/html/api/sqlobject.converters.html sqlobject/../docs/html/api/sqlobject.dbconnection.html sqlobject/../docs/html/api/sqlobject.dberrors.html 
sqlobject/../docs/html/api/sqlobject.declarative.html sqlobject/../docs/html/api/sqlobject.events.html sqlobject/../docs/html/api/sqlobject.firebird.firebirdconnection.html sqlobject/../docs/html/api/sqlobject.firebird.html sqlobject/../docs/html/api/sqlobject.html sqlobject/../docs/html/api/sqlobject.include.hashcol.html sqlobject/../docs/html/api/sqlobject.include.html sqlobject/../docs/html/api/sqlobject.include.tests.html sqlobject/../docs/html/api/sqlobject.include.tests.test_hashcol.html sqlobject/../docs/html/api/sqlobject.index.html sqlobject/../docs/html/api/sqlobject.inheritance.html sqlobject/../docs/html/api/sqlobject.inheritance.iteration.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_aggregates.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_asdict.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_deep_inheritance.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_destroy_cascade.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_foreignKey.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_indexes.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_inheritance.html sqlobject/../docs/html/api/sqlobject.inheritance.tests.test_inheritance_tree.html sqlobject/../docs/html/api/sqlobject.joins.html sqlobject/../docs/html/api/sqlobject.main.html sqlobject/../docs/html/api/sqlobject.manager.command.html sqlobject/../docs/html/api/sqlobject.manager.html sqlobject/../docs/html/api/sqlobject.maxdb.html sqlobject/../docs/html/api/sqlobject.maxdb.maxdbconnection.html sqlobject/../docs/html/api/sqlobject.mssql.html sqlobject/../docs/html/api/sqlobject.mssql.mssqlconnection.html sqlobject/../docs/html/api/sqlobject.mysql.html sqlobject/../docs/html/api/sqlobject.mysql.mysqlconnection.html sqlobject/../docs/html/api/sqlobject.postgres.html sqlobject/../docs/html/api/sqlobject.postgres.pgconnection.html sqlobject/../docs/html/api/sqlobject.rdbhost.html sqlobject/../docs/html/api/sqlobject.rdbhost.rdbhostconnection.html sqlobject/../docs/html/api/sqlobject.sqlbuilder.html sqlobject/../docs/html/api/sqlobject.sqlite.html sqlobject/../docs/html/api/sqlobject.sqlite.sqliteconnection.html sqlobject/../docs/html/api/sqlobject.sresults.html sqlobject/../docs/html/api/sqlobject.styles.html sqlobject/../docs/html/api/sqlobject.sybase.html sqlobject/../docs/html/api/sqlobject.sybase.sybaseconnection.html sqlobject/../docs/html/api/sqlobject.tests.dbtest.html sqlobject/../docs/html/api/sqlobject.tests.html sqlobject/../docs/html/api/sqlobject.tests.test_ForeignKey.html sqlobject/../docs/html/api/sqlobject.tests.test_NoneValuedResultItem.html sqlobject/../docs/html/api/sqlobject.tests.test_SQLMultipleJoin.html sqlobject/../docs/html/api/sqlobject.tests.test_SQLRelatedJoin.html sqlobject/../docs/html/api/sqlobject.tests.test_SingleJoin.html sqlobject/../docs/html/api/sqlobject.tests.test_aggregates.html sqlobject/../docs/html/api/sqlobject.tests.test_aliases.html sqlobject/../docs/html/api/sqlobject.tests.test_asdict.html sqlobject/../docs/html/api/sqlobject.tests.test_auto.html sqlobject/../docs/html/api/sqlobject.tests.test_basic.html sqlobject/../docs/html/api/sqlobject.tests.test_blob.html sqlobject/../docs/html/api/sqlobject.tests.test_boundattributes.html sqlobject/../docs/html/api/sqlobject.tests.test_cache.html sqlobject/../docs/html/api/sqlobject.tests.test_class_hash.html 
sqlobject/../docs/html/api/sqlobject.tests.test_columns_order.html sqlobject/../docs/html/api/sqlobject.tests.test_combining_joins.html sqlobject/../docs/html/api/sqlobject.tests.test_comparison.html sqlobject/../docs/html/api/sqlobject.tests.test_complex_sorting.html sqlobject/../docs/html/api/sqlobject.tests.test_constraints.html sqlobject/../docs/html/api/sqlobject.tests.test_converters.html sqlobject/../docs/html/api/sqlobject.tests.test_create_drop.html sqlobject/../docs/html/api/sqlobject.tests.test_csvexport.html sqlobject/../docs/html/api/sqlobject.tests.test_cyclic_reference.html sqlobject/../docs/html/api/sqlobject.tests.test_datetime.html sqlobject/../docs/html/api/sqlobject.tests.test_decimal.html sqlobject/../docs/html/api/sqlobject.tests.test_declarative.html sqlobject/../docs/html/api/sqlobject.tests.test_default_style.html sqlobject/../docs/html/api/sqlobject.tests.test_delete.html sqlobject/../docs/html/api/sqlobject.tests.test_distinct.html sqlobject/../docs/html/api/sqlobject.tests.test_empty.html sqlobject/../docs/html/api/sqlobject.tests.test_enum.html sqlobject/../docs/html/api/sqlobject.tests.test_events.html sqlobject/../docs/html/api/sqlobject.tests.test_exceptions.html sqlobject/../docs/html/api/sqlobject.tests.test_expire.html sqlobject/../docs/html/api/sqlobject.tests.test_groupBy.html sqlobject/../docs/html/api/sqlobject.tests.test_identity.html sqlobject/../docs/html/api/sqlobject.tests.test_indexes.html sqlobject/../docs/html/api/sqlobject.tests.test_inheritance.html sqlobject/../docs/html/api/sqlobject.tests.test_joins.html sqlobject/../docs/html/api/sqlobject.tests.test_joins_conditional.html sqlobject/../docs/html/api/sqlobject.tests.test_jsonbcol.html sqlobject/../docs/html/api/sqlobject.tests.test_jsoncol.html sqlobject/../docs/html/api/sqlobject.tests.test_lazy.html sqlobject/../docs/html/api/sqlobject.tests.test_md5.html sqlobject/../docs/html/api/sqlobject.tests.test_mysql.html sqlobject/../docs/html/api/sqlobject.tests.test_new_joins.html sqlobject/../docs/html/api/sqlobject.tests.test_parse_uri.html sqlobject/../docs/html/api/sqlobject.tests.test_paste.html sqlobject/../docs/html/api/sqlobject.tests.test_perConnection.html sqlobject/../docs/html/api/sqlobject.tests.test_pickle.html sqlobject/../docs/html/api/sqlobject.tests.test_picklecol.html sqlobject/../docs/html/api/sqlobject.tests.test_postgres.html sqlobject/../docs/html/api/sqlobject.tests.test_reparent_sqlmeta.html sqlobject/../docs/html/api/sqlobject.tests.test_schema.html sqlobject/../docs/html/api/sqlobject.tests.test_select.html sqlobject/../docs/html/api/sqlobject.tests.test_select_through.html sqlobject/../docs/html/api/sqlobject.tests.test_setters.html sqlobject/../docs/html/api/sqlobject.tests.test_slice.html sqlobject/../docs/html/api/sqlobject.tests.test_sorting.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlbuilder.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlbuilder_dbspecific.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlbuilder_importproxy.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlbuilder_joins_instances.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlite.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlmeta_idName.html sqlobject/../docs/html/api/sqlobject.tests.test_sqlobject_admin.html sqlobject/../docs/html/api/sqlobject.tests.test_string_id.html sqlobject/../docs/html/api/sqlobject.tests.test_style.html sqlobject/../docs/html/api/sqlobject.tests.test_subqueries.html 
sqlobject/../docs/html/api/sqlobject.tests.test_transactions.html sqlobject/../docs/html/api/sqlobject.tests.test_unicode.html sqlobject/../docs/html/api/sqlobject.tests.test_uuidcol.html sqlobject/../docs/html/api/sqlobject.tests.test_validation.html sqlobject/../docs/html/api/sqlobject.tests.test_views.html sqlobject/../docs/html/api/sqlobject.util.csvexport.html sqlobject/../docs/html/api/sqlobject.util.csvimport.html sqlobject/../docs/html/api/sqlobject.util.html sqlobject/../docs/html/api/sqlobject.util.moduleloader.html sqlobject/../docs/html/api/sqlobject.util.threadinglocal.html sqlobject/../docs/html/api/sqlobject.versioning.html sqlobject/../docs/html/api/sqlobject.versioning.test.html sqlobject/../docs/html/api/sqlobject.versioning.test.test_version.html sqlobject/../docs/html/api/sqlobject.views.html sqlobject/../docs/html/api/sqlobject.wsgi_middleware.html sqlobject/firebird/__init__.py sqlobject/firebird/firebirdconnection.py sqlobject/include/__init__.py sqlobject/include/hashcol.py sqlobject/include/tests/__init__.py sqlobject/include/tests/test_hashcol.py sqlobject/inheritance/__init__.py sqlobject/inheritance/iteration.py sqlobject/inheritance/tests/__init__.py sqlobject/inheritance/tests/test_aggregates.py sqlobject/inheritance/tests/test_asdict.py sqlobject/inheritance/tests/test_deep_inheritance.py sqlobject/inheritance/tests/test_destroy_cascade.py sqlobject/inheritance/tests/test_foreignKey.py sqlobject/inheritance/tests/test_indexes.py sqlobject/inheritance/tests/test_inheritance.py sqlobject/inheritance/tests/test_inheritance_tree.py sqlobject/manager/__init__.py sqlobject/manager/command.py sqlobject/maxdb/__init__.py sqlobject/maxdb/maxdbconnection.py sqlobject/maxdb/readme.txt sqlobject/mssql/__init__.py sqlobject/mssql/mssqlconnection.py sqlobject/mysql/__init__.py sqlobject/mysql/mysqlconnection.py sqlobject/postgres/__init__.py sqlobject/postgres/pgconnection.py sqlobject/rdbhost/__init__.py sqlobject/rdbhost/rdbhostconnection.py sqlobject/sqlite/__init__.py sqlobject/sqlite/sqliteconnection.py sqlobject/sybase/__init__.py sqlobject/sybase/sybaseconnection.py sqlobject/tests/__init__.py sqlobject/tests/dbtest.py sqlobject/tests/test_ForeignKey.py sqlobject/tests/test_NoneValuedResultItem.py sqlobject/tests/test_SQLMultipleJoin.py sqlobject/tests/test_SQLRelatedJoin.py sqlobject/tests/test_SingleJoin.py sqlobject/tests/test_aggregates.py sqlobject/tests/test_aliases.py sqlobject/tests/test_asdict.py sqlobject/tests/test_auto.py sqlobject/tests/test_basic.py sqlobject/tests/test_blob.py sqlobject/tests/test_boundattributes.py sqlobject/tests/test_cache.py sqlobject/tests/test_class_hash.py sqlobject/tests/test_columns_order.py sqlobject/tests/test_combining_joins.py sqlobject/tests/test_comparison.py sqlobject/tests/test_complex_sorting.py sqlobject/tests/test_constraints.py sqlobject/tests/test_converters.py sqlobject/tests/test_create_drop.py sqlobject/tests/test_csvexport.py sqlobject/tests/test_csvimport.py sqlobject/tests/test_cyclic_reference.py sqlobject/tests/test_datetime.py sqlobject/tests/test_decimal.py sqlobject/tests/test_declarative.py sqlobject/tests/test_default_style.py sqlobject/tests/test_delete.py sqlobject/tests/test_distinct.py sqlobject/tests/test_empty.py sqlobject/tests/test_enum.py sqlobject/tests/test_events.py sqlobject/tests/test_exceptions.py sqlobject/tests/test_expire.py sqlobject/tests/test_groupBy.py sqlobject/tests/test_identity.py sqlobject/tests/test_indexes.py sqlobject/tests/test_inheritance.py 
sqlobject/tests/test_joins.py sqlobject/tests/test_joins_conditional.py sqlobject/tests/test_jsonbcol.py sqlobject/tests/test_jsoncol.py sqlobject/tests/test_lazy.py sqlobject/tests/test_md5.py sqlobject/tests/test_mysql.py sqlobject/tests/test_new_joins.py sqlobject/tests/test_parse_uri.py sqlobject/tests/test_paste.py sqlobject/tests/test_perConnection.py sqlobject/tests/test_pickle.py sqlobject/tests/test_picklecol.py sqlobject/tests/test_postgres.py sqlobject/tests/test_reparent_sqlmeta.py sqlobject/tests/test_schema.py sqlobject/tests/test_select.py sqlobject/tests/test_select_through.py sqlobject/tests/test_setters.py sqlobject/tests/test_slice.py sqlobject/tests/test_sorting.py sqlobject/tests/test_sqlbuilder.py sqlobject/tests/test_sqlbuilder_dbspecific.py sqlobject/tests/test_sqlbuilder_importproxy.py sqlobject/tests/test_sqlbuilder_joins_instances.py sqlobject/tests/test_sqlite.py sqlobject/tests/test_sqlmeta_idName.py sqlobject/tests/test_sqlobject_admin.py sqlobject/tests/test_string_id.py sqlobject/tests/test_style.py sqlobject/tests/test_subqueries.py sqlobject/tests/test_transactions.py sqlobject/tests/test_unicode.py sqlobject/tests/test_uuidcol.py sqlobject/tests/test_validation.py sqlobject/tests/test_views.py sqlobject/util/__init__.py sqlobject/util/csvexport.py sqlobject/util/csvimport.py sqlobject/util/moduleloader.py sqlobject/util/threadinglocal.py sqlobject/versioning/__init__.py sqlobject/versioning/test/__init__.py sqlobject/versioning/test/test_version.pySQLObject-3.4.0/SQLObject.egg-info/top_level.txt0000644000175000017500000000001213141371613020657 0ustar phdphd00000000000000sqlobject SQLObject-3.4.0/SQLObject.egg-info/requires.txt0000644000175000017500000000102313141371613020530 0ustar phdphd00000000000000FormEncode!=1.3.0,>=1.1.1 PyDispatcher>=2.0.4 [adodbapi] adodbapi [fdb] fdb [firebirdsql] firebirdsql [kinterbasdb] kinterbasdb [mysql] MySQLdb [mysql-connector] mysql-connector [odbc] pyodbc [oursql] oursql [postgres] psycopg2 [postgresql] psycopg2 [psycopg] psycopg2 [psycopg1] psycopg1 [psycopg2] psycopg2 [py-postgresql] py-postgresql [pygresql] pygresql [pymssql] pymssql [pymysql] pymysql [pyodbc] pyodbc [pypostgresql] py-postgresql [pypyodbc] pypyodbc [sapdb] sapdb [sqlite] pysqlite [sybase] Sybase SQLObject-3.4.0/SQLObject.egg-info/PKG-INFO0000644000175000017500000000350613141371613017235 0ustar phdphd00000000000000Metadata-Version: 1.1 Name: SQLObject Version: 3.4.0 Summary: Object-Relational Manager, aka database wrapper Home-page: http://sqlobject.org/ Author: Oleg Broytman Author-email: phd@phdru.name License: LGPL Download-URL: https://pypi.python.org/pypi/SQLObject/3.4.0 Description: SQLObject is a popular *Object Relational Manager* for providing an object interface to your database, with tables as classes, rows as instances, and columns as attributes. SQLObject includes a Python-object-based query language that makes SQL more abstract, and provides substantial database independence for applications. Supports MySQL, PostgreSQL, SQLite, Firebird, Sybase, MSSQL and MaxDB (SAPDB). Python 2.7 or 3.4+ is required. For development see the projects at `SourceForge `_ and `GitHub `_. .. 
image:: https://travis-ci.org/sqlobject/sqlobject.svg?branch=master :target: https://travis-ci.org/sqlobject/sqlobject Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL) Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Topic :: Database Classifier: Topic :: Database :: Front-Ends Classifier: Topic :: Software Development :: Libraries :: Python Modules Requires: FormEncode Requires: PyDispatcher SQLObject-3.4.0/setup.py0000755000175000017500000001655313113104234014373 0ustar phdphd00000000000000#!/usr/bin/env python import sys from imp import load_source from os.path import abspath, dirname, join try: from setuptools import setup is_setuptools = True except ImportError: from distutils.core import setup is_setuptools = False versionpath = join(abspath(dirname(__file__)), "sqlobject", "__version__.py") load_source("sqlobject_version", versionpath) from sqlobject_version import version # noqa: ignore flake8 E402 subpackages = ['firebird', 'include', 'include.tests', 'inheritance', 'inheritance.tests', 'manager', 'maxdb', 'mysql', 'mssql', 'postgres', 'rdbhost', 'sqlite', 'sybase', 'tests', 'util', 'versioning', 'versioning.test'] kw = {} if is_setuptools: kw['entry_points'] = """ [paste.filter_app_factory] main = sqlobject.wsgi_middleware:make_middleware """ if (sys.version_info[:2] == (2, 7)): PY2 = True elif (sys.version_info[0] == 3) and (sys.version_info[:2] >= (3, 4)): PY2 = False else: raise ImportError("SQLObject requires Python 2.7 or 3.4+") kw['install_requires'] = install_requires = [] if PY2: install_requires.append("FormEncode>=1.1.1,!=1.3.0") else: install_requires.append("FormEncode>=1.3.1") install_requires.append("PyDispatcher>=2.0.4") kw['extras_require'] = extras_require = { # Firebird/Interbase 'fdb': ['fdb'], 'firebirdsql': ['firebirdsql'], 'kinterbasdb': ['kinterbasdb'], # MS SQL 'adodbapi': ['adodbapi'], 'pymssql': ['pymssql'], # MySQL 'mysql-connector': ['mysql-connector'], 'pymysql': ['pymysql'], # ODBC 'odbc': ['pyodbc'], 'pyodbc': ['pyodbc'], 'pypyodbc': ['pypyodbc'], # PostgreSQL 'psycopg1': ['psycopg1'], 'psycopg2': ['psycopg2'], 'psycopg': ['psycopg2'], 'postgres': ['psycopg2'], 'postgresql': ['psycopg2'], 'pygresql': ['pygresql'], 'pypostgresql': ['py-postgresql'], 'py-postgresql': ['py-postgresql'], # 'sapdb': ['sapdb'], 'sqlite': ['pysqlite'], 'sybase': ['Sybase'], } if PY2: extras_require['mysql'] = ['MySQLdb'] extras_require['oursql'] = ['oursql'] else: extras_require['mysql'] = ['mysqlclient'] setup(name="SQLObject", version=version, description="Object-Relational Manager, aka database wrapper", long_description="""\ SQLObject is a popular *Object Relational Manager* for providing an object interface to your database, with tables as classes, rows as instances, and columns as attributes. SQLObject includes a Python-object-based query language that makes SQL more abstract, and provides substantial database independence for applications. Supports MySQL, PostgreSQL, SQLite, Firebird, Sybase, MSSQL and MaxDB (SAPDB). Python 2.7 or 3.4+ is required. 
For development see the projects at `SourceForge `_ and `GitHub `_. .. image:: https://travis-ci.org/sqlobject/sqlobject.svg?branch=master :target: https://travis-ci.org/sqlobject/sqlobject """, classifiers=[ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: " "GNU Library or Lesser General Public License (LGPL)", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Database", "Topic :: Database :: Front-Ends", "Topic :: Software Development :: Libraries :: Python Modules", ], author="Ian Bicking", author_email="ianb@colorstudy.com", maintainer="Oleg Broytman", maintainer_email="phd@phdru.name", url="http://sqlobject.org/", download_url="https://pypi.python.org/pypi/SQLObject/%s" % version, license="LGPL", packages=["sqlobject"] + ['sqlobject.%s' % package for package in subpackages], scripts=["scripts/sqlobject-admin", "scripts/sqlobject-convertOldURI"], package_data={"sqlobject": [ "../LICENSE", "../docs/*.rst", "../docs/html/*", "../docs/html/_sources/*", "../docs/html/_sources/api/*", "../docs/html/_modules/*", "../docs/html/_modules/sqlobject/*", "../docs/html/_modules/sqlobject/mysql/*", "../docs/html/_modules/sqlobject/postgres/*", "../docs/html/_modules/sqlobject/manager/*", "../docs/html/_modules/sqlobject/inheritance/*", "../docs/html/_modules/sqlobject/inheritance/tests/*", "../docs/html/_modules/sqlobject/mssql/*", "../docs/html/_modules/sqlobject/tests/*", "../docs/html/_modules/sqlobject/rdbhost/*", "../docs/html/_modules/sqlobject/versioning/*", "../docs/html/_modules/sqlobject/versioning/test/*", "../docs/html/_modules/sqlobject/util/*", "../docs/html/_modules/sqlobject/maxdb/*", "../docs/html/_modules/sqlobject/firebird/*", "../docs/html/_modules/sqlobject/sybase/*", "../docs/html/_modules/sqlobject/sqlite/*", "../docs/html/_modules/sqlobject/include/*", "../docs/html/_modules/sqlobject/include/tests/*", "../docs/html/_modules/pydispatch/*", "../docs/html/_modules/_pytest/*", "../docs/html/api/*", "../docs/html/_static/*", ], "sqlobject.maxdb": ["readme.txt"], }, requires=['FormEncode', 'PyDispatcher'], **kw ) # Send announce to: # sqlobject-discuss@lists.sourceforge.net # python-announce@python.org # python-list@python.org # db-sig@python.org # Email tempate: """ @@ INTRO What's new in SQLObject ======================= @@ CHANGES For a more complete list, please see the news: http://sqlobject.org/docs/News.html What is SQLObject ================= SQLObject is an object-relational mapper. Your database tables are described as classes, and rows are instances of those classes. SQLObject is meant to be easy to use and quick to get started with. It currently supports MySQL through the `MySQLdb` package, PostgreSQL through the `psycopg` package, SQLite, Firebird, MaxDB (SAP DB), MS SQL Sybase and Rdbhost. Python 2.7 or 3.4+ is required. 
Where is SQLObject ================== Site: http://sqlobject.org Mailing list: https://lists.sourceforge.net/mailman/listinfo/sqlobject-discuss Download: https://pypi.python.org/pypi/SQLObject/@@ News and changes: http://sqlobject.org/docs/News.html -- Ian Bicking / ianb@colorstudy.com / http://blog.ianbicking.org """ # noqa: preserve space after two dashes SQLObject-3.4.0/docs/0000755000175000017500000000000013141371614013605 5ustar phdphd00000000000000SQLObject-3.4.0/docs/Makefile0000644000175000017500000001562412760105144015254 0ustar phdphd00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # User-friendly check for sphinx-build ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) endif # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest coverage gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " xml to make Docutils-native XML files" @echo " pseudoxml to make pseudoxml-XML files for display purposes" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" @echo " coverage to run coverage check of the documentation (if enabled)" clean: rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." 
pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/SQLObject.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/SQLObject.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/SQLObject" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/SQLObject" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." latexpdfja: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through platex and dvipdfmx..." $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." 
coverage: $(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage @echo "Testing of coverage in the sources finished, look at the " \ "results in $(BUILDDIR)/coverage/python.txt." xml: $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml @echo @echo "Build finished. The XML files are in $(BUILDDIR)/xml." pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." SQLObject-3.4.0/docs/TODO.rst0000644000175000017500000001004113103631236015075 0ustar phdphd00000000000000TODO ---- * PyPy. * Quote table/column names that are reserved keywords (order => "order", values => `values` for MySQL). * RelatedJoin.hasOther(otherObject[.id]) * createParamsPre/Post:: class MyTable(SQLObject): class sqlmeta: createParamsPre = 'TEMPORARY IF NOT EXISTS' createParamsPre = {temporary: True, ifNotExists: True, 'postgres': 'LOCAL'} createParamsPost = 'ENGINE InnoDB' createParamsPost = {'mysql': 'ENGINE InnoDB', 'postgres': 'WITH OIDS'} * SQLObject.fastInsert(). * IntervalCol * TimedeltaCol * Cached join results. * Invert tests isinstance(obj, (tuple, list)) to not isinstance(obj, basestr) to allow any iterable. * Always use .lazyIter(). * Optimize Iteration.next() - use cursor.fetchmany(). * Generators instead of loops (fetchall => fetchone). * Cache columns in sqlmeta.getColumns(); reset the cache on add/del Column/Join. * Make ConnectionHub a context manager instead of .doInTransaction(). * Make version_info a namedtuple. * Expression columns - in SELECT but not in INSERT/UPDATE. Something like this:: class MyClass(SQLObject): function1 = ExpressionCol(func.my_function(MyClass.q.col1)) function2 = ExpressionCol('sum(col2)') * A hierarchy of exceptions. SQLObject should translate exceptions from low-level drivers to a consistent set of high-level exceptions. * Memcache. * Refactor ``DBConnection`` to use parameterized queries instead of generating query strings. * PREPARE/EXECUTE. * Protect all .encode(), catch UnicodeEncode exceptions and reraise Invalid. * More kinds of joins, and more powerful join results (closer to how `select` works). * Better joins - automatic joins in .select() based on ForeignKey/MultipleJoin/RelatedJoin. * Deprecate, then remove connectionForOldURI. * Switch from setuptools to distribute. * Support PyODBC driver for all backends. * `dbms `_ is a DB API wrapper for DB API drivers for IBM DB2, Firebird, MSSQL Server, MySQL, Oracle, PostgreSQL, SQLite and ODBC. * dict API: use getitem interface for column access instead of getattr; reserve getattr for internal attributes only; this helps to avoid collisions with internal attributes. * Or move column values access to a separate namespace, e.g. .c: row.c.column. * More documentation. * RSS 2.0 and Atom news feeds. * Use DBUtils_, especially SolidConnection. .. _DBUtils: https://pypi.python.org/pypi/DBUtils * ``_fromDatabase`` currently doesn't support IDs that don't fit into the normal naming scheme. It should do so. You can still use ``_idName`` with ``_fromDatabase``. * More databases supported. There has been interest and some work in the progress for Oracle. IWBN to have Informix and DB2 drivers. * Better transaction support -- right now you can use transactions for the database, but objects aren't transaction-aware, so non-database persistence won't be able to be rolled back. * Optimistic locking and other techniques to handle concurrency. * Profile of SQLObject performance to identify bottlenecks. 
* Increase hooks with FormEncode validation and form generation package, so SQLObject classes (read: schemas) can be published for editing more directly and easily. (First step: get Schema-generating method into sqlmeta class) * Merge SQLObject.create*, .create*SQL methods with DBPI.create* methods. * Made SQLObject unicode-based instead of just unicode-aware. All internal processing should be done with unicode strings, conversion to/from ascii strings should happen for non-unicode DB API drivers. * Allow to override ConsoleWriter/LogWriter classes and makeDebugWriter function. * Type annotations and mypy tests. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/links.rst0000644000175000017500000000324413044165436015467 0ustar phdphd00000000000000SQLObject Links =============== .. contents:: If you have a link you'd like added to this page, please submit a `pull requests at GitHub `_ or a bug report with the link and title at `bug tracker `_. Articles and Documentation -------------------------- * `Connecting databases to Python with SQLObject `_; an article at DeveloperWorks. * `DB migration using sqlobject-admin how-to `_. * `Using raw SQL with SQLObject and keeping the object-y goodness `_. * `Example of using SQLObject with web.py under mod_wsgi `_. Open Source Projects -------------------- * `Catwalk `_ is a web-based SQLObject browser and object editor (included in TurboGears 1.0). * `Ultra Gleeper `_, a Recommendation Engine for Web Pages. * `Guten `_; an application for browsing, reading, and managing books from `Project Gutenberg `_. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/FAQ.rst0000644000175000017500000005073613043410204014747 0ustar phdphd00000000000000+++++++++++++ SQLObject FAQ +++++++++++++ .. contents:: SQLExpression ------------- In `SomeTable.select(SomeTable.q.Foo > 30)` why doesn't the inner parameter, `SomeTable.q.Foo > 30`, get evaluated to some boolean value? `q` is an object that returns special attributes of type `sqlbuilder.SQLExpression`. SQLExpression is a special class that overrides almost all Python magic methods and upon any operation instead of evaluating it constructs another instance of SQLExpression that remembers what operation it has to do. This is similar to a symbolic algebra. Example: SQLExpression("foo") > 30 produces SQLExpression("foo", ">", 30) (well, it really produces SQLExpression(SQLExpression("foo")...)) How does the select(...) method know what to do? ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In short, select() recursively evaluates the top-most SQLExpression to a string: SQLExpression("foo", ">", 30) => "foo > 30" and passes the result as a string to the SQL backend. The longer but more detailed and correct explanation is that select() produces an instance of SelectResults_ class that upon being iterated over produces an instance of Iteration class that upon calling its next() method (it is iterator!) constructs the SQL query string, passes it to the backend, fetches the results, wraps every row as SQLObject instance and passes them back to the user. 
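To see this symbolic behaviour for yourself, you can render a ``q`` expression to
a SQL string with ``sqlrepr`` from ``sqlobject.sqlbuilder``.  This is a minimal
sketch -- the ``SomeTable`` class and its ``foo`` column are invented for
illustration, and the exact spelling/parenthesization of the generated SQL may
vary between backends and SQLObject versions::

    from sqlobject import IntCol, SQLObject, connectionForURI, sqlhub
    from sqlobject.sqlbuilder import sqlrepr

    sqlhub.processConnection = connectionForURI('sqlite:/:memory:')

    class SomeTable(SQLObject):
        foo = IntCol()

    expr = SomeTable.q.foo > 30     # no SQL is executed; an SQLExpression is built
    print(sqlrepr(expr, 'sqlite'))  # roughly: ((some_table.foo) > (30))
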
.. _SelectResults: SelectResults.html For the details of the implementation see sqlobject/main.py for SQLObject, sqlobject/sqlbuilder.py for SQLExpression, sqlobject/dbconnection.py for DBConnection class (that constructs the query strings) and Iteration class, and different subdirectories of sqlobject for concrete implementations of connection classes - different backends require different query strings. Why there is no __len__? ------------------------ There are reasons why there is no __len__ method, though many people think having those make them feel more integrated into Python. One is that len(foo) is expected to be fast, but issuing a COUNT query can be slow. Worse, often this causes the database to do essentially redundant work when the actual query is performed (generally taking the len of a sequence is followed by accessing items from that sequence). Another is that list(foo) implicitly tries to do a len first, as an optimization (because len is expected to be cheap -- see previous point). Worse, it swallows *all* exceptions that occur during that call to __len__, so if it fails (e.g. there's a typo somewhere in the query), the original cause is silently discarded, and instead you're left with mysterious errors like "current transaction is aborted, commands ignored until end of transaction block" for no apparent reason. How can I do a LEFT JOIN? ------------------------- The short: you can't. You don't need to. That's a relational way of thinking, not an object way of thinking. But it's okay! It's not hard to do the same thing, even if it's not with the same query. For these examples, imagine you have a bunch of customers, with contacts. Not all customers have a contact, some have several. The left join would look like:: SELECT customer.id, customer.first_name, customer.last_name, contact.id, contact.address FROM customer LEFT JOIN contact ON contact.customer_id = customer.id Simple ~~~~~~ :: for customer in Customer.select(): print customer.firstName, customer.lastName for contact in customer.contacts: print ' ', contact.phoneNumber The effect is the same as the left join -- you get all the customers, and you get all their contacts. The problem, however, is that you will be executing more queries -- a query for each customer to fetch the contacts -- where with the left join you'd only do one query. The actual amount of information returned from the database will be the same. There's a good chance that this won't be significantly slower. I'd advise doing it this way unless you hit an actual performance problem. Efficient ~~~~~~~~~ Lets say you really don't want to do all those queries. Okay, fine:: custContacts = {} for contact in Contact.select(): custContacts.setdefault(contact.customerID, []).append(contact) for customer in Customer.select(): print customer.firstName, customer.lastName for contact in custContacts.get(customer.id, []): print ' ', contact.phoneNumber This way there will only be at most two queries. It's a little more crude, but this is an optimization, and optimizations often look less than pretty. 
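If you want to run the approach above end to end, here is a minimal,
self-contained sketch; the ``Customer``/``Contact`` definitions, the column
names and the in-memory SQLite URI are assumptions made for illustration::

    from __future__ import print_function

    from sqlobject import (SQLObject, StringCol, ForeignKey, MultipleJoin,
                           connectionForURI, sqlhub)

    sqlhub.processConnection = connectionForURI('sqlite:/:memory:')

    class Customer(SQLObject):
        firstName = StringCol()
        lastName = StringCol()
        contacts = MultipleJoin('Contact')

    class Contact(SQLObject):
        customer = ForeignKey('Customer')
        phoneNumber = StringCol()

    Customer.createTable()
    Contact.createTable()

    # One query for the contacts, grouped by customer id in a dict...
    custContacts = {}
    for contact in Contact.select():
        custContacts.setdefault(contact.customerID, []).append(contact)

    # ...and one query for the customers; no per-customer queries are issued.
    for customer in Customer.select():
        print(customer.firstName, customer.lastName)
        for contact in custContacts.get(customer.id, []):
            print(' ', contact.phoneNumber)
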
But, say you don't want to get everyone, just some group of people (presumably a large enough group that you still need this optimization):: query = Customer.q.firstName.startswith('J') custContacts = {} for contact in Contact.select(AND(Contact.q.customerID == Customer.q.id, query)): custContacts.setdefault(contact.customerID, []).append(contact) for customer in Customer.select(query): print customer.firstName, customer.lastName for contact in custContacts.get(customer.id, []): print ' ', contact.phoneNumber SQL-wise ~~~~~~~~ Use LEFTJOIN() from SQLBuilder_. How can I join a table with itself? ----------------------------------- Use Alias from SQLBuilder_. See example_. .. _SQLBuilder: SQLBuilder.html .. _example: SQLObject.html#how-can-i-join-a-table-with-itself How can I define my own intermediate table in my Many-to-Many relationship? --------------------------------------------------------------------------- .. note:: In User and Role, SQLRelatedJoin is used with createRelatedTable=False so the intermediate table is not created automatically. We also set the intermediate table name with intermediateTable='user_roles'. UserRoles is the definition of our intermediate table. UserRoles creates a unique index to make sure we don't have duplicate data in the database. We also added an extra field called active which has a boolean value. The active column might be used to activate/deactivate a given role for a user in this example. Another common field to add in this an intermediate table might be a sort field. If you want to get a list of rows from the intermediate table directly add a MultipleJoin to User or Role class. We'll expand on the User and Role example and define our own UserRoles class which will be the intermediate table for the User and Role Many-to-Many relationship. Example:: >>> class User(SQLObject): ... class sqlmeta: ... table = "user_table" ... username = StringCol(alternateID=True, length=20) ... roles = SQLRelatedJoin('Role', ... intermediateTable='user_roles', ... createRelatedTable=False) >>> class Role(SQLObject): ... name = StringCol(alternateID=True, length=20) ... users = SQLRelatedJoin('User', ... intermediateTable='user_roles', ... createRelatedTable=False) >>> class UserRoles(SQLObject): ... class sqlmeta: ... table = "user_roles" ... user = ForeignKey('User', notNull=True, cascade=True) ... role = ForeignKey('Role', notNull=True, cascade=True) ... active = BoolCol(notNull=True, default=False) ... unique = index.DatabaseIndex(user, role, unique=True) How Does Inheritance Work? -------------------------- SQLObject is not intended to represent every Python inheritance structure in an RDBMS -- rather it is intended to represent RDBMS structures as Python objects. So lots of things you can do in Python you can't do with SQLObject classes. However, some form of inheritance is possible. One way of using this is to create local conventions. Perhaps:: class SiteSQLObject(SQLObject): _connection = DBConnection.MySQLConnection(user='test', db='test') _style = MixedCaseStyle() # And maybe you want a list of the columns, to autogenerate # forms from: def columns(self): return [col.name for col in self._columns] Since SQLObject doesn't have a firm introspection mechanism (at least not yet) the example shows the beginnings of a bit of ad hoc introspection (in this case exposing the ``_columns`` attribute in a more pleasing/public interface). However, this doesn't relate to *database* inheritance at all, since we didn't define any columns. What if we do? 
::

    class Person(SQLObject):
        firstName = StringCol()
        lastName = StringCol()

    class Employee(Person):
        position = StringCol()

Unfortunately, the resultant schema probably doesn't look like what you
might have wanted::

    CREATE TABLE person (
        id INT PRIMARY KEY,
        first_name TEXT,
        last_name TEXT
    );

    CREATE TABLE employee (
        id INT PRIMARY KEY,
        first_name TEXT,
        last_name TEXT,
        position TEXT
    );

All the columns from ``person`` are just repeated in the ``employee``
table.  What's more, an ID for a Person is distinct from an ID for an
employee, so for instance you must choose ``ForeignKey("Person")`` or
``ForeignKey("Employee")``, you can't have a foreign key that sometimes
refers to one, and sometimes refers to the other.

Altogether, not very useful.  You probably want a ``person`` table, and
then an ``employee`` table with a one-to-one relation between the two.
Of course, you can have that, just create the appropriate
classes/tables -- but it will appear as two distinct classes, and you'd
have to do something like ``Person(1).employee.position``.  Of course,
you can always create the necessary shortcuts, like::

    class Person(SQLObject):
        firstName = StringCol()
        lastName = StringCol()

        def _get_employee(self):
            value = Employee.selectBy(person=self)
            if value:
                return value[0]
            else:
                raise AttributeError('%r is not an employee' % self)

        def _get_isEmployee(self):
            value = Employee.selectBy(person=self)
            # turn into a bool:
            return not not value

        def _set_isEmployee(self, value):
            if value:
                # Make sure we are an employee...
                if not self.isEmployee:
                    Employee(person=self, position=None)
            else:
                if self.isEmployee:
                    self.employee.destroySelf()

        def _get_position(self):
            return self.employee.position

        def _set_position(self, value):
            self.employee.position = value

    class Employee(SQLObject):
        person = ForeignKey('Person')
        position = StringCol()

There is also another kind of inheritance.  See Inheritance.html_

.. _Inheritance.html: Inheritance.html

Composite/Compound Attributes
-----------------------------

A composite attribute is an attribute formed from two columns.  For
example::

    CREATE TABLE invoice_item (
        id INT PRIMARY KEY,
        amount NUMERIC(10, 2),
        currency CHAR(3)
    );

Now, you'll probably want to deal with one amount/currency value,
instead of two columns.  SQLObject doesn't directly support this, but
it's easy (and encouraged) to do it on your own::

    class InvoiceItem(SQLObject):
        amount = CurrencyCol()
        currency = StringCol(length=3)

        def _get_price(self):
            return Price(self.amount, self.currency)

        def _set_price(self, price):
            self.amount = price.amount
            self.currency = price.currency

    class Price(object):

        def __init__(self, amount, currency):
            self._amount = amount
            self._currency = currency

        def _get_amount(self):
            return self._amount
        amount = property(_get_amount)

        def _get_currency(self):
            return self._currency
        currency = property(_get_currency)

        def __repr__(self):
            return '<Price: %s %s>' % (self.amount, self.currency)

You'll note we go to some trouble to make sure that ``Price`` is an
immutable object.  This is important, because if ``Price`` wasn't and
someone changed an attribute, the containing ``InvoiceItem`` instance
wouldn't detect the change and update the database.  (Also, since
``Price`` doesn't subclass ``SQLObject``, we have to be explicit about
creating properties.)  Some people refer to this sort of class as a
*Value Object*, one that can be used much like an integer or string.
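For completeness, here is a small usage sketch of the composite attribute
above; it assumes a connection has been configured and
``InvoiceItem.createTable()`` has been run, and the concrete values are
invented::

    item = InvoiceItem(amount=100, currency='USD')
    print(item.price)               # a Price value object, built by _get_price

    item.price = Price(120, 'EUR')  # _set_price writes both underlying columns
    print(item.price)               # reflects the new amount and currency
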
You could also use a mutable composite class::

    class Address(SQLObject):
        street = StringCol()
        city = StringCol()
        state = StringCol(length=2)
        latitude = FloatCol()
        longitude = FloatCol()

        def _init(self, id):
            SQLObject._init(self, id)
            self._coords = SOCoords(self)

        def _get_coords(self):
            return self._coords

    class SOCoords(object):

        def __init__(self, so):
            self._so = so

        def _get_latitude(self):
            return self._so.latitude
        def _set_latitude(self, value):
            self._so.latitude = value
        latitude = property(_get_latitude, _set_latitude)

        def _get_longitude(self):
            return self._so.longitude
        def _set_longitude(self, value):
            self._so.longitude = value
        longitude = property(_get_longitude, _set_longitude)

Pretty much a proxy, really, but ``SOCoords`` could contain other
logic, could interact with non-SQLObject-based latitude/longitude
values, or could be used among several objects that have
latitude/longitude columns.

Non-Integer IDs
---------------

Yes, you can use non-integer IDs.

If you use non-integer IDs, you will not be able to use automatic
``CREATE TABLE`` generation (i.e., ``createTable``); SQLObject can
create tables with int or str IDs.  You also will have to give your own
ID values when creating an object, like::

    color = Something(id="blue", r=0, b=100, g=0)

IDs are, and always will be, considered immutable.  Right now that is
not enforced; you can assign to the ``id`` attribute.  But if you do
you'll just mess everything up.  This will probably be taken away
sometime to avoid possibly confusing bugs (actually, assigning to
``id`` is almost certain to cause confusing bugs).

If you are concerned about enforcing the type of IDs (which can be a
problem even with integer IDs) you may want to do this::

    class Color(SQLObject):

        def _init(self, id, connection=None):
            id = str(id)
            SQLObject._init(self, id, connection)

Instead of ``str()`` you may use ``int()`` or whatever else you want.
This will be resolved in a future version when ID column types can be
declared like other columns.  Additionally you can set idType=str in
your SQLObject class.

Binary Values
-------------

Binary values can be difficult to store in databases, as SQL doesn't
have a widely-implemented way to express binaries as literals, and
support differs from database to database.

The module sqlobject.col defines validators and column classes that to
some extent support binary values.  There is BLOBCol, which extends
StringCol and allows storing binary values; currently it works only
with PostgreSQL and MySQL.  PickleCol extends BLOBCol and allows
storing any object in the column; the column, naturally, pickles the
object upon assignment and unpickles it upon retrieving the data from
the DB.

Another possible way to keep binary data in a database is by using
encoding.  Base 64 is a good encoding, reasonably compact but also
safe.  As an example, imagine you want to store images in the
database::

    import base64

    class Image(SQLObject):
        data = StringCol()
        height = IntCol()
        width = IntCol()

        def _set_data(self, value):
            self._SO_set_data(base64.b64encode(value).decode('ascii'))

        def _get_data(self):
            return base64.b64decode(self._SO_get_data())

Reloading Modules
-----------------

If you've tried to reload a module that defines SQLObject subclasses,
you've probably encountered various odd errors.  The short answer: you
can't reload these modules.

The long answer: reloading modules in Python doesn't work very well.
Reloading actually means *re-running* the module.  Every ``class``
statement creates a class -- but your old classes don't disappear.
When you reload a module, new classes are created, and they take over
the names in the module.  SQLObject, however, doesn't search the names
in a module to find a class.  When you say ``ForeignKey('SomeClass')``,
SQLObject looks for any SQLObject subclass anywhere with the name
``SomeClass``.  This is to avoid problems with circular imports and
circular dependencies, as tables have forward- and back-references, and
other circular dependencies.  SQLObject resolves these dependencies
lazily.

But when you reload a module, suddenly there will be two SQLObject
classes in the process with the same name.  SQLObject doesn't know that
one of them is obsolete.  And even if it did, it doesn't know every
other place in the system that has a reference to that obsolete class.

For this reason and several others, reloading modules is highly
error-prone and difficult to support.

Python Keywords
---------------

If you have a table column that is a Python keyword, you should know
that the Python attribute doesn't have to match the name of the column.
See `Irregular Naming`_ in the documentation.

.. _Irregular Naming: SQLObject.html#irregular-naming

Lazy Updates and Insert
-----------------------

`Lazy updates `_ allow you to defer sending ``UPDATES`` until you
synchronize the object.  However, there is no way to do a lazy insert;
as soon as you create an instance the ``INSERT`` is executed.

The reason for this limit is that each object needs a database ID, and
in many databases you cannot obtain an ID until you create a row.

Mutually referencing tables
---------------------------

How can I create mutually referencing tables?  For the code::

    class Person(SQLObject):
        role = ForeignKey("Role")

    class Role(SQLObject):
        person = ForeignKey("Person")

    Person.createTable()
    Role.createTable()

Postgres raises ProgrammingError: ERROR: relation "role" does not exist.

The correct way is to delay constraint creation until all tables are
created::

    class Person(SQLObject):
        role = ForeignKey("Role")

    class Role(SQLObject):
        person = ForeignKey("Person")

    constraints = Person.createTable(applyConstraints=False)
    constraints += Role.createTable(applyConstraints=False)
    for constraint in constraints:
        connection.query(constraint)

What about GROUP BY, UNION, etc?
--------------------------------

In short - not every query can be represented in SQLObject.

SQLObject's objects are instances of "table" classes::

    class MyTable(SQLObject):
        ...

    my_table_row = MyTable.get(id)

Now my_table_row is an instance of the MyTable class and represents a
row in the my_table table.  But for a statement with GROUP BY like
this::

    SELECT my_column, COUNT(*) FROM my_table GROUP BY my_column;

there is no table, there is no corresponding "table" class, and
SQLObject cannot return a list of meaningful objects.  You can use the
lower-level machinery available in SQLBuilder_.

How to do mass-insertion?
-------------------------

Mass-insertion using the high-level API in SQLObject is slow.  There
are many reasons for that.  First, on creation SQLObject instances pass
all values through validators/converters, which is convenient but takes
time.  Second, after an INSERT query SQLObject executes a SELECT query
to get back autogenerated values (id and timestamps).  Third, there is
caching and cache maintenance.  Most of this is unnecessary for
mass-insertion, hence the high-level API is unsuitable.  A less
convenient (no validators) but much faster API is Insert_ from
SQLBuilder_.

.. _Insert: SQLBuilder.html#insert

How can I specify the MySQL engine to use, or tweak other SQL-engine specific features?
--------------------------------------------------------------------------------------- You can *ALTER* the table just after creation using the ``sqlmeta`` attribute ``createSQL``, for example:: class SomeObject(SQLObject): class sqlmeta: createSQL = { 'mysql' : 'ALTER TABLE some_object ENGINE InnoDB' } # your columns here Maybe you want to specify the charset too? No problem:: class SomeObject(SQLObject): class sqlmeta: createSQL = { 'mysql' : [ 'ALTER TABLE some_object ENGINE InnoDB', '''ALTER TABLE some_object CHARACTER SET utf8 COLLATE utf8_estonian_ci'''] } .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/interface.py0000644000175000017500000002672012461505074016131 0ustar phdphd00000000000000""" This is a not-very-formal outline of the interface that SQLObject provides. While its in the form of a formal interface, it doesn't use any interface system. """ class Interface(object): pass class ISQLObject(Interface): sqlmeta = """ A class or instance representing internal state and methods for introspecting this class. ``MyClass.sqlmeta`` is a class, and ``myInstance.sqlmeta`` is an instance of this class. So every instance gets its own instance of the metadata. This object follows the ``Isqlmeta`` interface. """ # classmethod def get(id, connection=None): """ Returns the object with the given `id`. If `connection` is given, then get the object from the given connection (otherwise use the default or configured connection) It raises ``SQLObjectNotFound`` if no row exists with that ID. """ # classmethod def selectBy(connection=None, **attrs): """ Performs a ``SELECT`` in the given `connection` (or default connection). Each of the keyword arguments should be a column, and the equality comparisons will be ``ANDed`` together to produce the result. """ # classmethod def dropTable(ifExists=False, dropJoinTables=True, cascade=False, connection=None): """ Drops this table from the database. If ``ifExists`` is true, then it is not an error if the table doesn't exist. Join tables (mapping tables for many-to-many joins) are dropped if this class comes alphabetically before the other join class, and if ``dropJoinTables`` is true. ``cascade`` is passed to the connection, and if true tries to drop tables that depend on this table. """ # classmethod def createTable(ifNotExists=False, createJoinTables=True, createIndexes=True, connection=None): """ Creates the table. If ``ifNotExists`` is true, then it is not an error if the table already exists. Join tables (mapping tables for many-to-many joins) are created if this class comes alphabetically before the other join class, and if ``createJoinTables`` is true. If ``createIndexes`` is true, indexes are also created. """ # classmethod def createTableSQL(createJoinTables=True, connection=None, createIndexes=True): """ Returns the SQL that would be sent with the analogous call to ``Class.createTable(...)`` """ def sync(): """ This will refetch the data from the database, putting it in sync with the database (in case another process has modified the database since this object was fetched). It will raise ``SQLObjectNotFound`` if the row has been deleted. This will call ``self.syncUpdate()`` if ``lazyUpdates`` are on. 
""" def syncUpdate(): """ If ``.sqlmeta.lazyUpdates`` is true, then this method must be called to push accumulated updates to the server. """ def expire(): """ This will remove all the column information from the object. The next time this information is used, a ``SELECT`` will be made to re-fetch the data. This is like a lazy ``.sync()``. """ def set(**attrs): """ This sets many column attributes at once. ``obj.set(a=1, b=2)`` is equivalent to ``obj.a=1; obj.b=2``, except that it will be grouped into one ``UPDATE`` """ def destroySelf(): """ Deletes this row from the database. This is called on instances, not on the class. The object still persists, because objects cannot be deleted from the Python process (they can only be forgotten about, at which time they are garbage collected). The object becomes obsolete, and further activity on it will raise errors. """ def sqlrepr(obj, connection=None): """ Returns the SQL representation of the given object, for the configured database connection. """ class Isqlmeta(Interface): table = """ The name of the table in the database. This is derived from ``style`` and the class name if no explicit name is given. """ idName = """ The name of the primary key column in the database. This is derived from ``style`` if no explicit name is given. """ idType = """ A function that coerces/normalizes IDs when setting IDs. This is ``int`` by default (all IDs are normalized to integers). """ style = """ An instance of the ``IStyle`` interface. This maps Python identifiers to database names. """ lazyUpdate = """ A boolean (default false). If true, then setting attributes on instances (or using ``inst.set(...)`` will not send ``UPDATE`` queries immediately (you must call ``inst.syncUpdates()`` or ``inst.sync()`` first). """ defaultOrder = """ When selecting objects and not giving an explicit order, this attribute indicates the default ordering. It is like this value is passed to ``.select()`` and related methods; see those method's documentation for details. """ cacheValues = """ A boolean (default true). If true, then the values in the row are cached as long as the instance is kept (and ``inst.expire()`` is not called). If false, then every attribute access causes a ``SELECT`` (which is absurdly inefficient). """ registry = """ Because SQLObject uses strings to relate classes, and these strings do not respect module names, name clashes will occur if you put different systems together. This string value serves as a namespace for classes. """ fromDatabase = """ A boolean (default false). If true, then on class creation the database will be queried for the table's columns, and any missing columns (possible all columns) will be added automatically. """ columns = """ A dictionary of ``{columnName: anSOColInstance}``. You can get information on the columns via this read-only attribute. """ columnList = """ A list of the values in ``columns``. Sometimes a stable, ordered version of the columns is necessary; this is used for that. """ columnDefinitions = """ A dictionary like ``columns``, but contains the original column definitions (which are not class-specific, and have no logic). """ joins = """ A list of all the Join objects for this class. """ indexes = """ A list of all the indexes for this class. """ # Instance attributes expired = """ A boolean. If true, then the next time this object's column attributes are accessed a query will be run. """ # Methods def addColumn(columnDef, changeSchema=False, connection=None): """ Adds the described column to the table. 
If ``changeSchema`` is true, then an ``ALTER TABLE`` query is called to change the database. Attributes given in the body of the SQLObject subclass are collected and become calls to this method. """ def delColumn(column, changeSchema=False, connection=None): """ Removes the given column (either the definition from ``columnDefinition`` or the SOCol object from ``columns``). If ``changeSchema`` is true, then an ``ALTER TABLE`` query is made. """ def addColumnsFromDatabase(connection=None): """ Adds all the columns from the database that are not already defined. If the ``fromDatabase`` attribute is true, then this is called on class instantiation. """ def addJoin(joinDef): """ Adds a join to the class. """ def delJoin(joinDef): """ Removes a join from the class. """ def addIndex(indexDef): """ Adds the index to the class. """ def asDict(): """ Returns the SQLObject instance as a dictionary (column names as keys, column values as values). Use like:: ASQLObjectClass(a=1, b=2).asDict() Which should return ``{'a': 1, 'b': 2}``. Note: this is a *copy* of the object's columns; changing the dictionary will not effect the object it was created from. """ class ICol(Interface): def __init__(name=None, **kw): """ Creates a column definition. This is an object that describes a column, basically just holding the keywords for later creating an ``SOCol`` (or subclass) instance. Subclasses of ``Col`` (which implement this interface) typically create the related subclass of ``SOCol``. """ name = """ The name of the column. If this is not given in the constructor, ``SQLObject`` will set this attribute from the variable name this object is assigned to. """ class ISOCol(Interface): """ This is a column description that is bound to a single class. This cannot be shared by subclasses, so a new instance is created for every new class (in cases where classes share columns). These objects are created by ``Col`` instances, you do not create them directly. """ name = """ The name of the attribute that points to this column. This is the Python name of the column. """ columnDef = """ The ``Col`` object that created this instance. """ immutable = """ Boolean, default false. If true, then this column cannot be modified. It cannot even be modified at construction, rendering the table read-only. This will probably change in the future, as it renders the option rather useless. """ cascade = """ If a foreign key, then this indicates if deletes in that foreign table should cascade into this table. This can be true (deletes cascade), false (the default, they do not cascade), or ``'null'`` (this column is set to ``NULL`` if the foreign key is deleted). """ constraints = """ A list of ... @@? """ notNone = """ Boolean, default false. It true, then ``None`` (aka ``NULL``) is not allowed in this column. Also the ``notNull`` attribute can be used. """ foreignKey = """ If not None, then this column points to another table. The attribute is the name (a string) of that table/class. """ dbName = """ The name of this column in the database. """ alternateID = """ Boolean, default false. If true, then this column is assumed to be unique, and you can fetch individual rows based on this column's value. This will add a method ``byAttributeName`` to the parent SQLObject subclass. """ unique = """ Boolean, default false. If this column is unique; effects the database ``CREATE`` statement, and is implied by ``alternateID=True``. """ validator = """ A IValidator object. 
All setting of this column goes through the ``fromPython`` method of the validator. All getting of this column from the database goes through ``toPython``. """ default = """ A value that holds the default value for this column. If the default value passed in is a callable, then that value is called to return a default (a typical example being ``DateTime.now``). """ sqlType = """ The SQL type of the column, overriding the default column type. """ SQLObject-3.4.0/docs/Authors.rst0000644000175000017500000000317212752476767016015 0ustar phdphd00000000000000Authors ======= SQLObject was originally written by Ian Bicking . Contributions have been made by: * Frank Barknecht * Bud P. Bruegger * David M. Cook * Luke Opperman * James Ralston * Sidnei da Silva * Brad Bollenbach * Daniel Savard, Xsoli Inc * alexander smishlajev * Yaroslav Samchuk * Runar Petursson * J Paulo Fernandes Farias * Shahms King * David Turner, The Open Planning Project * Dan Pascu * Diez B. Roggisch * Christopher Singley * David Keeney * Daniel Fetchinson * Neil Muller * Petr Jakes * Ken Lalonde * Andrew Ziem * Andrew Trusty * Ian Cordasco * Lukasz Dobrzanski * Gregor Horvath * Nathan Edwards * Lutz Steinborn * Oleg Broytman .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/News4.rst0000644000175000017500000002257312752476767015376 0ustar phdphd00000000000000++++ News ++++ .. contents:: Contents: :backlinks: none .. _start: SQLObject 0.15.1 ================ Released 22 Mar 2011. * A bug was fixed in MSSQLConnection. * A minor bug was fixed in sqlbuilder.Union. SQLObject 0.15.0 ================ Released 6 Dec 2010. Features & Interface -------------------- * Major API change: all signals are sent with the instance (or the class) as the first parameter. The following signals were changed: RowCreateSignal, RowCreatedSignal, DeleteColumnSignal. * Major API change: post-processing functions for all signals are called with the instance as the first parameter. The following signals were changed: RowUpdatedSignal, RowDestroySignal, RowDestroyedSignal. SQLObject 0.14.2 ================ * A minor bug was fixed in sqlbuilder.Union. SQLObject 0.14.1 ================ Released 15 Oct 2010. * A bugfix was ported from `SQLObject 0.13.1`_. SQLObject 0.14.0 ================ Released 10 Oct 2010. Features & Interface -------------------- * The lists of columns/indices/joins are now sorted according to the order of declaration. * ``validator2`` was added to all columns; it is inserted at the beginning of the list of validators, i.e. its ``from_python()`` method is called first, ``to_python()`` is called last, after all validators in the list. * SQLiteConnection's parameter ``use_table_info`` became boolean with default value True; this means the default schema parser is now based on ``PRAGMA table_info()``. * Major API change: attribute ``dirty`` was moved to sqlmeta. SQLObject 0.13.1 ================ Released 15 Oct 2010. * A bug was fixed in a subtle case when a per-instance connection is not passed to validators. SQLObject 0.13.0 ================ Released 11 Aug 2010. Features & Interface -------------------- * SQLObject instances that don't have a per-instance connection can be pickled and unpickled. 
* Validators became stricter: StringCol and UnicodeCol now accept only str, unicode or an instance of a class that implements __unicode__ (but not __str__ because every object has a __str__ method); BoolCol accepts only bool or int or an instance of a class that implements __nonzero__; IntCol accepts int, long or an instance of a class that implements __int__ or __long__; FloatCol accepts float, int, long or an instance of a class that implements __float__, __int__ or __long__. * Added a connection class for rdbhost.com (commercial Postgres-over-Web service). Small Features -------------- * Added TimedeltaCol; currently it's only implemented on PostgreSQL as an INTERVAL type. * Do not pollute the base sqlmeta class to allow Style to set idName. In the case of inherited idName inherited value takes precedence; to allow Style to set idName reset inherited idName to None. * Better handling of circular dependencies in sqlobject-admin - do not include the class in the list of other classes. * Renamed db_encoding to dbEncoding in UnicodeStringValidator. * A new parameter ``sslmode`` was added to PostgresConnection. * Removed SQLValidator - its attemptConvert was never called because in FormEncode it's named attempt_convert. SQLObject 0.12.5 ================ * ``backend`` parameter was renamed to ``driver``. To preserve backward compatibility ``backend`` is still recognized and will be recognized for some time. SQLObject 0.12.4 ================ Released 5 May 2010. * Bugs were fixed in calling from_python(). * A bugfix was ported from `SQLObject 0.11.6`_. SQLObject 0.12.3 ================ Released 11 Apr 2010. * A bugfix ported from `SQLObject 0.11.5`_. SQLObject 0.12.2 ================ Released 4 Mar 2010. * Bugfixes ported from `SQLObject 0.11.4`_. SQLObject 0.12.1 ================ Released 8 Jan 2010. * Fixed three bugs in PostgresConnection. * A number of changes ported from `SQLObject 0.11.3`_. SQLObject 0.12 ============== Released 20 Oct 2009. Features & Interface -------------------- * .selectBy(), .deleteBy() and .by*() methods pass all values through .from_python(), not only unicode. * The user can choose a DB API driver for SQLite by using a ``backend`` parameter in DB URI or SQLiteConnection that can be a comma-separated list of backend names. Possible backends are: ``pysqlite2`` (alias ``sqlite2``), ``sqlite3``, ``sqlite`` (alias ``sqlite1``). Default is to test pysqlite2, sqlite3 and sqlite in that order. * The user can choose a DB API driver for PostgreSQL by using a ``backend`` parameter in DB URI or PostgresConnection that can be a comma-separated list of backend names. Possible backends are: ``psycopg2``, ``psycopg1``, ``psycopg`` (tries psycopg2 and psycopg1), ``pygresql``. Default is ``psycopg``. WARNING: API change! PostgresConnection's parameter ``usePygresql`` is now replaced with ``backend=pygresql``. * The user can choose a DB API driver for MSSQL by using a ``backend`` parameter in DB URI or MSSQLConnection that can be a comma-separated list of backend names. Possible backends are: ``adodb`` (alias ``adodbapi``) and ``pymssql``. Default is to test adodbapi and pymssql in that order. * alternateMethodName is defined for all unique fields, not only alternateID; this makes SQLObject create .by*() methods for all unique fields. * SET client_encoding for PostgreSQL to the value of ``charset`` parameter in DB URI or PostgresConnection. * TimestampCol() can be instantiated without any defaults, in this case default will be None (good default for TIMESTAMP columns in MySQL). 
Small Features -------------- * Imported DB API drivers are stored as connection instance variables, not in global variables; this allows to use different DB API drivers at the same time; for example, PySQLite2 and sqlite3. * Removed all deprecated attributes and functions. * Removed the last remained string exceptions. SQLObject 0.11.6 ================ Released 5 May 2010. * A bug was fixed in SQLiteConnection.columnsFromSchema(): pass None as size/precision to DecimalCol; DecimalCol doesn't allow default values (to force user to pass meaningful values); but columnsFromSchema() doesn't implement proper parsing of column details. SQLObject 0.11.5 ================ Released 11 Apr 2010. * Fixed a bug in replacing _connection in a case when no previous _connection has been set. SQLObject 0.11.4 ================ Released 4 Mar 2010. * Fixed a bug in inheritance - if creation of the row failed and if the connection is not a transaction and is in autocommit mode - remove parent row(s). * Do not set _perConnection flag if get() or _init() is passed the same connection; this is often the case with select(). SQLObject 0.11.3 ================ Released 8 Jan 2010. * Fixed a bug in col.py and dbconnection.py - if dbEncoding is None suppose it's 'ascii'. * Fixed a bug in FirebirdConnection. * The cache culling algorithm was enhanced to eliminate memory leaks by removing references to dead objects; tested on a website that runs around 4 million requests a day. SQLObject 0.11.2 ================ Released 30 Sep 2009. * Fixed a bug in logging to console - convert unicode to str. * Fixed an obscure bug in ConnectionHub triggered by an SQLObject class whose instances can be coerced to boolean False. SQLObject 0.11.1 ================ Released 20 Sep 2009. * Fixed a bug: Sybase tables with identity column fire two identity_inserts. * Fixed a bug: q.startswith(), q.contains() and q.endswith() escape (with a backslash) all special characters (backslashes, underscores and percent signs). SQLObject 0.11.0 ================ Released 12 Aug 2009. Features & Interface -------------------- * Dropped support for Python 2.3. The minimal version of Python for SQLObject is 2.4 now. * Dropped support for PostgreSQL 7.2. The minimal supported version of PostgreSQL is 7.3 now. * New magic attribute ``j`` similar to ``q`` was added that automagically does join with the other table in MultipleJoin or RelatedJoin. * SQLObject can now create and drop a database in MySQL, PostgreSQL, SQLite and Firebird/Interbase. * Added some support for schemas in PostgreSQL. * Added DecimalStringCol - similar to DecimalCol but stores data as strings to work around problems in some drivers and type affinity problem in SQLite. * Added sqlobject.include.hashcol.HashCol - a column type that automatically hashes anything going into it, and returns out an object that hashes anything being compared to itself. Basically, it's good for really simple one-way password fields, and it even supports the assignment of None to indicate no password set. By default, it uses the md5 library for hashing, but this can be changed in a HashCol definition. * RowDestroyedSignal and RowUpdatedSignal were added. Minor features -------------- * Use reversed() in manager/command.py instead of .__reversed__(). * Minor change in logging to console - logger no longer stores the output file, it gets the file from module sys every time by name; this means logging will use new sys.stdout (or stderr) in case the user changed them. 
* Changed the order of testing of SQLite modules - look for external PySQLite2 before sqlite3. `Older news`__ .. __: News3.html .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/News5.rst0000644000175000017500000003347312752476767015400 0ustar phdphd00000000000000++++ News ++++ .. contents:: Contents: :backlinks: none .. _start: SQLObject 2.2.1 =============== Released 10 May 2016. * Fix a bug in sqlbuilder.CONCAT (inherit from SQLExpression). SQLObject 2.2.0 =============== Released 8 May 2016. Features & Interface -------------------- * Add function col.use_microseconds(True/False). Default is to use microseconds (True). * For MSSQL use datetime2(6) and time(6) columns. * Columns for ForeignKey are created using idType of the referenced table. Minor features -------------- * Add sqlbuilder.CONCAT to generate concatenation command (either using function CONCAT() or '||' operator). * Minor refactoring to pave the way to Python 3 was merged from `SQLObject 1.6.7`_. * Document MSSQL server versions -- merged from `SQLObject 1.7.6`_. Bugfixes -------- * Fix a bug: mxDateTime doesn't support microseconds; %s in mxDateTime format means ticks. Tests ----- * Speedup SQLite connections in tests -- merged from `SQLObject 1.7.6`_. * Added new test helper setupCyclicClasses to setup classes with mutual references. SQLObject 2.1.2 =============== Released 15 Mar 2015. * Using fdb adapter for Firebird was ported from `SQLObject 1.6.6`_. SQLObject 2.1.1 =============== Released 10 Mar 2015. * Minor fix in PostgresConnection was ported from `SQLObject 1.6.5`_. SQLObject 2.1.0 =============== Released 6 Jan 2015. Minor features -------------- * In queries generated with SQLObject's tables columns are sorted in the order they are declared in the table. * In queries generated with sqlbuilder's Insert/Update, if values are passed using dictionaries, columns are sorted alphabetically. * Tables in SELECT...FROM clause are sorted alphabetically. * MySQLConnection, PostgresConnection and SQLiteConnection have got a new method listDatabases() that lists databases in the connection and returns a list of names. * MySQLConnection, PostgresConnection and SQLiteConnection have got a new method listTables() that returns a list of table names in the database. SQLObject 2.0.0 =============== Released 20 Dec 2014. Features & Interface -------------------- * DateTimeCol and TimeCol can read and write values with microseconds. WARNING: microseconds are supported by MariaDB since version 5.3.0 and by MySQL since version 5.6.4, and even these versions require special handling: columns to store microseconds have to be declared with precision 6: TIME(6), DATETIME(6), TIMESTAMP(6). SQLObject does the right thing when creating a new database but existing databases have to be changed: run something like ``ALTER TABLE name MODIFY COLUMN col TIME(6)`` for every column that you want to store microseconds. For MSSQL use datetime2(6) and time(6) columns. They are available since MS SQL Server 2008. WARNING: backward compatibility problem! Date/Time columns created with microseconds cannot be read back from SQLite databases (and perhaps other backends) with versions of SQLObject older than 1.7. 
Minor features -------------- * PostgresConnection, when used with fromDatabase=True, sets alternateID for unique columns. Development ----------- * Development was switched from Subversion to git. Documentation ------------- * Old news were restored back to version 0.2.1. * News.txt was split into 5 small files. SQLObject 1.7.6 =============== * Minor refactoring to pave the way to Python 3 was merged from `SQLObject 1.6.7`_. * Document MSSQL server versions -- merged from `SQLObject 1.6.7`_. * Speedup SQLite connections in tests -- merged from `SQLObject 1.6.7`_. SQLObject 1.7.5 =============== Released 15 Mar 2015. * Using fdb adapter for Firebird was ported from `SQLObject 1.6.6`_. SQLObject 1.7.4 =============== Released 10 Mar 2015. * Minor fix in PostgresConnection was ported from `SQLObject 1.6.5`_. SQLObject 1.7.3 =============== Released 18 Dec 2014. * Documentation updates and setup.py change were ported from `SQLObject 1.5.6`_. SQLObject 1.7.2 =============== Released 14 Dec 2014. * Fix a bug: zero-pad microseconds on the right, not on the left; 0.0835 seconds means 83500 microseconds. SQLObject 1.7.1 =============== Released 11 Dec 2014. * Documentation updates and setuptools change were ported from `SQLObject 1.5.5`_. SQLObject 1.7.0 =============== Released 8 Dec 2014. Features & Interface -------------------- * Python 2.5 is no longer supported. The minimal supported version is Python 2.6. * DateTimeCol and TimeCol can read values with microseconds (created by SQLObject 2.0) but do not write microseconds back. Minor features -------------- * Upgrade ez_setup to 2.2. Bugfixes -------- * Thre bugfixes were ported from `SQLObject 1.5.2`_, `SQLObject 1.5.3`_ and `SQLObject 1.5.4`_. SQLObject 1.6.7 =============== * Minor refactoring to pave the way to Python 3: replace calls like ``unicode(data, encoding)`` with ``data.decode(encoding)``. * Document MSSQL server versions. * Minor fix in HashCol. * Speedup SQLite connections in tests. SQLObject 1.6.6 =============== Released 15 Mar 2015. * Use fdb adapter for Firebird. SQLObject 1.6.5 =============== Released 10 Mar 2015. * Minor fix in PostgresConnection: close the cursor and connection in _createOrDropDatabase even after an error. SQLObject 1.6.4 =============== Released 18 Dec 2014. * Documentation updates and setup.py change were ported from `SQLObject 1.5.6`_. SQLObject 1.6.3 =============== Released 11 Dec 2014. * Documentation updates and setuptools change were ported from `SQLObject 1.5.5`_. SQLObject 1.6.2 =============== Released 8 Dec 2014. * A bugfix was ported from `SQLObject 1.5.4`_. SQLObject 1.6.1 =============== Released 26 Oct 2014. * A bugfix was ported from `SQLObject 1.5.3`_. SQLObject 1.6.0 =============== Released 15 May 2014. Features & Interface -------------------- * Python 2.4 is no longer supported. The minimal supported version is Python 2.5. * Support for Python 2.5 is declared obsolete and will be removed in the next release. Minor features -------------- * Upgrade ez_setup to 1.4.2. Bugfixes -------- * A bugfix was ported from `SQLObject 1.5.2`_. Development ----------- * Development switched from Subvesion to git. SQLObject 1.5.6 =============== Released 18 Dec 2014. * Extend setup.py: include docs and tests into the egg. SQLObject 1.5.5 =============== Released 11 Dec 2014. * Documentation update: change URLs for development with git, add Travis CI build status image. * Extend sdist: include everything into source distribution. SQLObject 1.5.4 =============== Released 8 Dec 2014. 
* Fix a minor bug in MSSQLConnection: do not override callable server_version with a non-callable. SQLObject 1.5.3 =============== Released 26 Oct 2014. * Allow unicode in .orderBy(u'-column'). Development ----------- * Development switched from Subvesion to git. SQLObject 1.5.2 =============== Released 13 Apr 2014. * Adapt duplicate error message strings for SQLite 3.8. SQLObject 1.5.1 =============== Released 15 Dec 2013. * SQLiteConnection.close() now closes and reopens a connection to in-memory database. SQLObject 1.5.0 =============== Released 5 Oct 2013. Features & Interface -------------------- * Helpers for class Outer were changed to lookup columns in table's declarations. * Support for Python 2.4 is declared obsolete and will be removed in the next release. Minor features -------------- * When a PostgresConnection raises an exception the instance has code/error attributes copied from psycopg2's pgcode/pgerror attributes. * Encode unicode enum values to str. * Removed setDeprecationLevel from the list of public functions. * A number of fixes for tests. Bugfixes -------- * A bug was fixed in DBConnection.close(); close() doesn't raise an UnboundLocalError if connection pool is empty. * Fixed parameters for pymssql. Documentation ------------- * GNU LGPL text was added as docs/LICENSE file. * Old FSF address was changed to the new one. SQLObject 1.4.1 =============== Released 26 May 2013. * A few bugfixes were ported from `SQLObject 1.3.3`_. SQLObject 1.4.0 =============== Released 18 May 2013. Features & Interface -------------------- * Support for PostgreSQL 8.1 is dropped. The minimal supported version of PostgreSQL is 8.2 now. * Optimization in PostgresConnection: use INSERT...RETURNING id to get the autoincremented id in one query instead of two (INSERT + SELECT id). * Changed the way to get if the table has identity in MS SQL. * NCHAR/NVARCHAR and N''-quoted strings for MS SQL. SQLObject 1.3.3 =============== Released 26 May 2013. * Fixed bugs in pickling and unpickling (remove/restore a weak proxy to self, fixed cache handling). * Added an example of using SQLObject with web.py by Rhubarb Sin to the links page. SQLObject 1.3.2 =============== Released 20 Oct 2012. * A number of bugfixes were ported from `SQLObject 1.2.4`_. SQLObject 1.3.1 =============== Released 25 May 2012. * Two bugfixes were ported from `SQLObject 1.2.3`_. SQLObject 1.3.0 =============== Released 31 Mar 2012. Features & Interface -------------------- * PostgresConnection performs translation of exceptions to standard SQLObject's hierarchy of exceptions. * Major update of FirebirdConnection: introspection was completely rewritten and extended; ``charset`` was renamed to ``dbEncoding``; a longstanding bug was fixed - pass port to connect(). SQLObject 1.2.4 =============== Released 20 Oct 2012. * Fixed a bug in sqlbuilder.Select.filter - removed comparison with SQLTrueClause. * Neil Muller fixed a number of tests. SQLObject 1.2.3 =============== Released 25 May 2012. * Fixed a minor bug in PostgreSQL introspection: VIEWs don't have PRIMARY KEYs - use sqlmeta.idName as the key. * Fixed a bug in cache handling while unpickling. SQLObject 1.2.2 =============== Released 1 Mar 2012. * A bugfix was ported from `SQLObject 1.1.5`_. SQLObject 1.2.1 =============== Released 4 Dec 2011. * A bugfix was ported from `SQLObject 1.1.4`_. SQLObject 1.2.0 =============== Released 20 Nov 2011. Features & Interface -------------------- * Strings are treated specially in Select to allow Select(['id, 'name'], where='value = 42'). 
Update allows a string in WHERE. * ForeignKey('Table', refColumn='refcol_id') to allow ForeignKey to point to a non-id column; the referred column must be a unique integer column. * delColumn now accepts a ForeignKey's name without 'ID'. * Support for PostgreSQL 7.* is dropped. The minimal supported version of PostgreSQL is 8.1 now. * Quoting rules changed for PostgreSQL: SQLObject uses E'' escape string if the string contains characters escaped with backslash. * A bug caused by psycopg2 recently added a new boolean not callable autocommit attribute was fixed. * sqlobject.__doc__ and main.__doc__ no longer contain version number. Use sqlobject.version or version_info. SQLObject 1.1.5 =============== Released 1 Mar 2012. * A bug was fixed in SQLiteConnection - clear _threadPool on close(). SQLObject 1.1.4 =============== Released 4 Dec 2011. * A bug was fixed in handling ``modulo`` operator - SQLite implements only ``%``, MySQL - only ``MOD()``, PostgreSQL implements both. SQLObject 1.1.3 =============== Released 30 Aug 2011. * A bugfix was ported from `SQLObject 1.0.3`_. SQLObject 1.1.2 =============== Released 8 Aug 2011. * A bugfix was ported from `SQLObject 1.0.2`_. SQLObject 1.1.1 =============== Released 1 Jul 2011. * Parsing sqlobject.__doc__ for version number is declared obsolete. Use sqlobject.version or version_info. * Documented sqlmeta.dbEncoding and connection.dbEncoding. SQLObject 1.1 ============= Released 20 Jun 2011. Features & Interface -------------------- * SelectResults (returned from .select()) is allowed in IN(column, list). * A different workaround is used in SQLiteConnection to prevent PySQLite from converting strings to unicode - in the case of a registered text conversion function PySQLite silently converts empty strings to Nones; now SQLObject uses text_factory instead and properly returns empty strings. * It is now possible to declare one encoding for all UnicodeCol's per table (as sqlmeta.dbEncoding) or per connection (as connection.dbEncoding). Default (if dbEncoding is found neither in column nor in table nor in connection) is 'utf-8'. Source code and internals ------------------------- * Decorators @classmethod and @staticmethod are used everywhere. * All 'mydict.has_key(name)' checks were replaced with 'name in mydict'. SQLObject 1.0.3 =============== Released 30 Aug 2011. * Fixed a bug with Postgres - add quotes in "SET client_encoding" query. SQLObject 1.0.2 =============== Released 8 Aug 2011. * A bug was fixed in SelectResults slicing that prevented to slice a slice (my_results[:20][1:5]). SQLObject 1.0.1 =============== Released 30 May 2011. * A syntax incompatibility was fixed in SQLiteConnection that prevented SQLObject to be used with Python 2.4. SQLObject 1.0.0 =============== Released 28 Mar 2011. Features & Interface -------------------- * Major API change: DB URI parser was changed to use urllib.split*() and unquote(). This means any username/password/path are allowed in DB URIs if they are properly %-encoded, and DB URIs are automatically unquoted. * A new module ``__version__.py`` was added. New variables ``version`` (string) and ``version_info`` (5-tuple: major, minor, micro, release level, serial) are imported into ``sqlobject`` namespace. * In SQLite, id columns are made AUTOINCREMENT. * Parameter ``backend`` in DB URI is no longer supported, use parameter ``driver``. `Older news`__ .. __: News4.html .. 
image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/SQLBuilder.rst0000644000175000017500000002176612752476767016347 0ustar phdphd00000000000000`````````` SQLBuilder `````````` .. contents:: A number of variables from SQLBuilder are included with ``from sqlobject import *`` -- see the `relevant SQLObject documentation`_ for more. Its functionality is also available through the special ``q`` attribute of `SQLObject` classes. .. _`relevant SQLObject documentation`: SQLObject.html#exported-symbols SQLExpression ============= SQLExpression uses clever overriding of operators to make Python expressions build SQL expressions -- so long as you start with a Magic Object that knows how to fake it. With SQLObject, you get a Magic Object by accessing the ``q`` attribute of a table class -- this gives you an object that represents the field. All of this is probably easier to grasp in an example:: >>> from sqlobject.sqlbuilder import * >>> person = table.person # person is now equivalent to the Person.q object from the SQLObject # documentation >>> person person >>> person.first_name person.first_name >>> person.first_name == 'John' person.first_name = 'John' >>> name = 'John' >>> person.first_name != name person.first_name != 'John' >>> AND(person.first_name == 'John', person.last_name == 'Doe') (person.first_name = 'John' AND person.last_name = 'Doe') Most of the operators work properly: <, >, <=, >=, !=, ==, +, -, /, \*, \*\*, %. However, ``and``, ``or``, and ``not`` **do not work**. You can use &, \|, and ~ instead -- but be aware that these have the same precedence as multiplication. So:: # This isn't what you want: >> person.first_name == 'John' & person.last_name == 'Doe' (person.first_name = ('John' AND person.last_name)) = 'Doe') # This is: >> (person.first_name == 'John') & (person.last_name == 'Doe') ((person.first_name = 'John') AND (person.last_name == 'Doe')) SQLBuilder also contains the functions ``AND``, ``OR``, and ``NOT`` which also work -- I find these easier to work with. ``AND`` and ``OR`` can take any number of arguments. You can also use ``.startswith()`` and ``.endswith()`` on an SQL expression -- these will translate to appropriate ``LIKE`` statements and all ``%`` quoting is handled for you, so you can ignore that implementation detail. There is also a ``LIKE`` function, where you can pass your string, with ``%`` for the wildcard, as usual. If you want to access an SQL function, use the ``func`` variable, like:: >> person.created < func.NOW() To pass a constant, use the ``const`` variable which is actually an alias for func. SQL statements ============== SQLBuilder implements objects that execute SQL statements. SQLObject uses them internally in its `higher-level API`_, but users can use this mid-level API to execute SQL queries that are not supported by the high-level API. To use these objects first construct an instance of a statement object, then ask the connection to convert the instance to an SQL query and finally ask the connection to execute the query and return the results. 
For example, for ``Select`` class:: >>> from sqlobject.sqlbuilder import * >> select = Select(['name', 'AVG(salary)'], staticTables=['employees'], >> groupBy='name') # create an instance >> query = connection.sqlrepr(select) # Convert to SQL string: >> # SELECT name, AVG(salary) FROM employees GROUP BY name >> rows = connection.queryAll(query) # Execute the query >> # and get back the results as a list of rows >> # where every row is a sequence of length 2 (name and average salary) .. _`higher-level API`: SQLObject.html Select ~~~~~~ A class to build ``SELECT`` queries. Accepts a number of parameters, all parameters except `items` are optional. Use ``connection.queryAll(query)`` to execute the query and get back the results as a list of rows. `items`: A string, an SQLExpression or a sequence of strings or SQLExpression's, represents the list of columns. If there are q-values SQLExpression's ``Select`` derives a list of tables for SELECT query. `where`: A string or an SQLExpression, represents the ``WHERE`` clause. `groupBy`: A string or an SQLExpression, represents the ``GROUP BY`` clause. `having`: A string or an SQLExpression, represents the ``HAVING`` part of the ``GROUP BY`` clause. `orderBy`: A string or an SQLExpression, represents the ``ORDER BY`` clause. `join`: A (list of) JOINs (``LEFT JOIN``, etc.) `distinct`: A bool flag to turn on ``DISTINCT`` query. `start`, `end`: Integers. The way to calculate ``OFFSET`` and ``LIMIT``. `limit`: An integer. `limit`, if passed, overrides `end`. `reversed`: A bool flag to do ``ORDER BY`` in the reverse direction. `forUpdate`: A bool flag to turn on ``SELECT FOR UPDATE`` query. `staticTables`: A sequence of strings or SQLExpression's that name tables for ``FROM``. This parameter must be used if `items` is a list of strings from which Select cannot derive the list of tables. Insert ~~~~~~ A class to build ``INSERT`` queries. Accepts a number of parameters. Use ``connection.query(query)`` to execute the query. `table`: A string that names the table to ``INSERT`` into. Required. `valueList`: A list of (key, value) sequences or {key: value} dictionaries; keys are column names. Either `valueList` or `values` must be passed, but not both. Example:: >> insert = Insert('person', valueList=[('name', 'Test'), ('age', 42)]) # or >> insert = Insert('person', valueList=[{'name': 'Test'}, {'age': 42}]) >> query = connection.sqlrepr(insert) # Both generate the same query: # INSERT INTO person (name, age) VALUES ('Test', 42) >> connection.query(query) `values`: A dictionary {key: value}; keys are column names. Either `valueList` or `values` must be passed, but not both. Example:: >> insert = Insert('person', values={'name': 'Test', 'age': 42}) >> query = connection.sqlrepr(insert) # The query is the same # INSERT INTO person (name, age) VALUES ('Test', 42) >> connection.query(query) Instances of the class work fast and thus are suitable for mass-insertion. If one needs to populate a database with SQLObject running a lot of ``INSERT`` queries this class is the way to go. Update ~~~~~~ A class to build ``UPDATE`` queries. Accepts a number of parameters. Use ``connection.query(query)`` to execute the query. `table`: A string that names the table to ``UPDATE``. Required. `values`: A dictionary {key: value}; keys are column names. Required. `where`: An optional string or SQLExpression, represents the ``WHERE`` clause. 
Example::

    >> update = Update('person',
    >>     values={'name': 'Test', 'age': 42}, where='id=1')
    >> query = connection.sqlrepr(update)
    # UPDATE person SET name='Test', age=42 WHERE id=1
    >> connection.query(query)

Delete
~~~~~~

A class to build ``DELETE FROM`` queries. Accepts a number of parameters.
Use ``connection.query(query)`` to execute the query.

`table`:
    A string that names the table to ``DELETE FROM``. Required.

`where`:
    A string or an SQLExpression that represents the ``WHERE`` clause.
    Required; if you need to delete all rows, pass ``where=None``
    explicitly (this is a safety measure).

Example::

    >> delete = Delete('person', where='id=1')
    >> query = connection.sqlrepr(delete)
    # DELETE FROM person WHERE id=1
    >> connection.query(query)

Union
~~~~~

A class to build ``UNION`` queries. Accepts a number of parameters -
``Select`` queries. Use ``connection.queryAll(query)`` to execute the
query and get back the results.

Example::

    >> select1 = Select(['min', func.MIN(const.salary)],
    >>     staticTables=['employees'])
    >> select2 = Select(['max', func.MAX(const.salary)],
    >>     staticTables=['employees'])
    >> union = Union(select1, select2)
    >> query = connection.sqlrepr(union)
    # SELECT 'min', MIN(salary) FROM employees
    # UNION
    # SELECT 'max', MAX(salary) FROM employees
    >> rows = connection.queryAll(query)

Nested SQL statements (subqueries)
==================================

There are a few special operators that take SQL statements (subqueries)
as parameters. These are ``IN``, ``NOTIN``, ``EXISTS``, ``NOTEXISTS``,
``SOME``, ``ANY`` and ``ALL``.

Consider the following example: you are interested in removing records
from a table using deleteMany. However, the criterion for doing so
depends on another table. You would expect the following to work::

    >> PersonWorkplace.deleteMany(where=
        ((PersonWorkplace.q.WorkplaceID==Workplace.q.id) &
         (Workplace.q.id==SOME_ID)))

But this doesn't work, because you can't do a join in a deleteMany call.
To work around this issue, use ``IN``::

    >> PersonWorkplace.deleteMany(where=
        IN(PersonWorkplace.q.WorkplaceID,
           Select(Workplace.q.id, Workplace.q.id==SOME_ID)))

.. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10
    :target: https://sourceforge.net/projects/sqlobject
    :class: noborder
    :align: center
    :height: 15
    :width: 80
    :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads

SQLObject-3.4.0/docs/presentation-2004-11/0000755000175000017500000000000013141371614017122 5ustar phdphd00000000000000SQLObject-3.4.0/docs/presentation-2004-11/sqlobject-and-database-programming.html0000644000175000017500000004310210707101542026614 0ustar phdphd00000000000000 SQLObject and Database Programming in Python

SQLObject and Database Programming in Python

Ian Bicking

Imaginary Landscape

The Ongoing Example

Address book:

  • person:
    • first_name
    • last_name
    • email (many emails for one person)
  • email:
    • type
    • address

The DB API

Database programming in Python is based on the DB-API. Some supported databases:

  • MySQL
  • PostgreSQL (multiple drivers)
  • SQLite
  • Firebird (Interbase)
  • MAXDB (SAP DB)
  • Oracle, Sybase, MS SQL Server

DB API Example

Example:

import MySQLdb
conn = MySQLdb.connect(
    db='test', user='ianb')
cur = conn.cursor()
cur.execute(
    "SELECT id, first_name, last_name FROM person")
result = cur.fetchall()
for id, first_name, last_name in result:
    print "%2i %s, %s" % (id, last_name, first_name)

DB API

  • Import the database module (MySQLdb, psycopg, sqlite, etc)
  • Use module.connect(...) to create a connection.
  • Use connection.cursor() to get a cursor. Cursors do all the work.
  • Use cursor.execute(sql_query) to run something.
  • Use cursor.fetchall() to get results.

DB API problems

  1. Connections and cursors are tedious.
  2. The databases don't all work the same; multiple SQL dialects.
  3. Lots of time spent writing SQL.
  4. Database updates can affect all the SQL you've written.
  5. Unless you make abstractions...

Abstractions

  • Every significant database application has a database abstraction layer.
  • You write one whether you mean to or not.

Object-Relational Mappers

The table:
person
id first_name last_name
1 John Doe
2 Tom Jones
...

ORMs: Classes

The class:
person
id first_name last_name
1 John Doe
2 Tom Jones
...
class Person(SQLObject):
    # id is implicit
    first_name = StringCol()
    last_name = StringCol()

ORMs: Instances

An instance:
person
id first_name last_name
1 John Doe
2 Tom Jones
...
>>> john = Person.get(1)
>>> john.first_name
'John'

ORM: summary

  • Every table is a class
  • Every row is an instance
  • Columns become attributes

Usage

__connection__ = 'postgres://pgsql@localhost/test'
class Person(SQLObject):
    first_name = StringCol()
    last_name = StringCol()
    emails = MultipleJoin('Email')

Usage (classes)

class Email(SQLObject):
    person = ForeignKey('Person')
    type = EnumCol(enumValues=['home', 'work'])
    address = StringCol()

Usage (creating tables)

def createTables():
    for table in (Person, Email):
        table.createTable(ifNotExists=True)

Usage (instances)

Using your classes:
>>> john = Person(first_name='John', last_name='Doe')
>>> email = Email(person=john, type='home', 
...     address='john@work.com')
>>> john.emails
[]

Usage (instances)

>>> tom = Person(first_name='Tom', last_name='Jones')
>>> tom is Person.get(tom.id)
True
>>> list(Person.selectBy(first_name='John'))
[]

Defining Classes

  • The class name matches the table name
  • The attributes match the column names
  • (kind of...)

Defining Classes...

You can add extra methods:

class Person(SQLObject):
    ...
    def _get_name(self):
        return self.first_name + ' ' + self.last_name
>>> tom.name
'Tom Jones'

Automatic properties

  • _get_attr() methods are called whenever the obj.attr attribute is called
  • _set_attr() methods are called whenever the obj.attr = value statement is run
  • _set_attr() is optional (a short sketch follows)
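
For example, a minimal sketch of a matched getter/setter pair (it assumes the Person class with first_name and last_name columns from the earlier slides):

class Person(SQLObject):
    first_name = StringCol()
    last_name = StringCol()
    # called when person.name is read
    def _get_name(self):
        return self.first_name + ' ' + self.last_name
    # called when person.name = value is run; optional
    def _set_name(self, value):
        # sketch only: splitting on the first space is a simplification
        first, last = value.split(' ', 1)
        self.first_name = first
        self.last_name = last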

Defining classes...

You can override columns:

class Person(SQLObject):
    ...
    last_name_searchable = StringCol()
    def _set_last_name(self, value):
        self._SO_set_last_name(value)
        self.last_name_searchable = re.sub(r'[^a-zA-Z]', '', value).lower()

Defining classes...

You can fiddle with the naming:

class Person(SQLObject):
    _table = "people"
    first_name = StringCol(dbName="fname")
    ...

Connecting classes

Foreign Keys:

class Email(SQLObject):
    person = ForeignKey('Person')
    ...
Note we use a string for 'Person'.
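
A rough usage sketch (reprs shown approximately; it assumes the Email row created in the earlier Usage slides): the person attribute returns the related Person object, while personID holds the raw integer id.

>>> email = Email.get(1)
>>> email.person
<Person 1 first_name='John' last_name='Doe'>
>>> email.personID
1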

Backreferences

The other side of one-to-many:

class Person(SQLObject):
    ...
    emails = MultipleJoin('Email')

Many-to-many

Many to many relationships imply a "hidden" table:

class Address(SQLObject):
    people = RelatedJoin('Person')
    ...
class Person(SQLObject):
    addresses = RelatedJoin('Address')

Many-to-many...

The intermediate table created:

CREATE TABLE address_person (
  address_id INT NOT NULL,
  person_id INT NOT NULL
);
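
A rough usage sketch (reprs shown approximately; it assumes a Person instance john and an Address instance addr already exist): RelatedJoin also generates add/remove helper methods named after the joined class.

>>> john.addAddress(addr)
>>> john.addresses
[<Address 1 ...>]
>>> addr.people
[<Person 1 first_name='John' last_name='Doe'>]
>>> john.removeAddress(addr)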

Lazy classes

SQLObject can use reflection to figure out what columns your table has:

class Person(SQLObject):
    _fromDatabase = True
You can start with _fromDatabase = True and add in explicit columns, overriding defaults.

Instances

  • Use Person.get(id) to retrieve a row
  • Use Person(first_name=...) to insert a row (the new object is returned)
  • Use aPerson.destroySelf() to delete a row

Updating

  • aPerson.first_name = "new_name"
    UPDATE is executed immediately
  • Update multiple columns at once:
    aPerson.set(first_name="new_fname", last_name="new_lname")

Selecting

You can use Python expressions (kind of):

query = Person.q.first_name == "Joe"
Person.select(query)

Selecting...

  • Class.q.column_name creates a magical object that creates queries.
  • sqlrepr() turns these query objects into SQL (mostly hidden)
  • sqlrepr(Person.q.first_name == "Joe", 'mysql') creates the SQL person.first_name = 'Joe'

Selecting...

Complicated joins are possible:

Person.select((Person.q.id == Email.q.personID)
              & (Email.q.address.startswith('joe')))
Becomes:
SELECT person.id, person.first_name, person.last_name
FROM person, email
WHERE person.id = email.person_id
      AND email.address LIKE 'joe%'

SelectResult

Select results are more than just lists:
  • You can slice them, and LIMIT and OFFSET statements are added to your query
  • You can iterate through them, and avoid loading all objects into memory
  • You can do things like select_result.count() to run aggregate functions
  • To get a real list, you must do list(select_result) (see the sketch below)
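
A brief sketch of those points (it assumes the Person class from the earlier slides):

results = Person.select(Person.q.last_name == 'Jones')
results.count()       # issues SELECT COUNT(*); no rows are fetched
page = results[:10]   # adds LIMIT 10 to the underlying query
people = list(page)   # only now are the row objects actually fetched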

Caching

  • SQLObject caches everything
  • Maybe too much (bad if you have multiple updaters)
  • Needs to because there are no joins
  • Has potential to be faster than ad hoc queries
  • Thread safety or thread confusion?

Performance

  • Not good if you are dealing with lots and lots of rows
  • Inserts and selects substantially slower
  • But for many applications you'll pay that price sooner or later anyway

The problem with ORMs

ORMs live in the world between Object Oriented programmers (and programs) and Relational programmers (and programs). Neither will be happy.

ORM Mismatch

  • Python classes aren't tables
  • Python instances aren't rows

ORM Mismatch: Classes

  • Databases don't have inheritance
  • Even when they do, no one uses it
  • Some tables are too boring (e.g., intermediate tables for many-to-many relationships), or don't warrant the overhead

ORM Mismatch: Instances

  • Instances get garbage collected, rows are forever
  • Rows exist even when instances don't
  • Instances may still exist when the row is gone
  • In the relational calculus, rows don't have real identity

ORM Mismatch: Relations

  • Things like joins don't make sense for objects, but are central to relations
  • Views and stored procedures also don't make sense
  • Aggregate functions and set operations aren't natural in Python (the for loop is natural)

Solving the question by avoidance

  • SQLObject doesn't let you forget SQL
  • You still have to think relationally
  • But many relational concepts don't work either
  • Advantage: it's easy

Solving the Mismatch

SQLObject's answer: don't try too hard.
  • There exist database restrictions; not every database schema maps well. (But most are fine)
  • Sometimes you have to do things more slowly or more imperatively, compared to a purely relational technique
  • There exist object restrictions; not every object-oriented concept maps well; e.g., no inheritance
  • Inheritance isn't planned; this isn't transparent persistence

Some other alternatives

Several Python ORMs exist; contrasts:
  • SQLObject doesn't try to be too OO, so it works well with normal database schemas
  • SQLObject doesn't have a compile step
  • SQLObject isn't just a SQL wrapper
  • SQLObject is reasonably complete

Other alternatives...

See the Python Wiki at: python.org/moin
The page is HigherLevelDatabaseProgramming.

SQLObject-3.4.0/docs/DeveloperGuide.rst0000644000175000017500000002663713102700617017254 0ustar phdphd00000000000000+++++++++++++++++++++++++ SQLObject Developer Guide +++++++++++++++++++++++++ .. contents:: :backlinks: none .. _start: These are some notes on developing SQLObject. Development Installation ======================== First install `FormEncode `_:: $ git clone git://github.com/formencode/formencode.git $ cd formencode $ sudo python setup.py develop Then do the same for SQLObject:: $ git clone git clone git://github.com/sqlobject/sqlobject.git $ cd sqlobject $ sudo python setup.py develop Or rather fork it and clone your fork. To develop a feature or a bugfix create a separate branch, push it to your fork and create a pull request to the original repo. That way CI will be triggered to test your code. Voila! The packages are globally installed, but the files from the checkout were not copied into ``site-packages``. See `setuptools `_ for more. Architecture ============ There are three main kinds of objects in SQLObject: tables, columns and connections. Tables-related objects are in `sqlobject/main.py`_ module. There are two main classes: ``SQLObject`` and ``sqlmeta``; the latter is not a metaclass but a parent class for ``sqlmeta`` attribute in every class - the authors tried to move there all attributes and methods not directly related to columns to avoid cluttering table namespace. .. _`sqlobject/main.py`: sqlobject/main.py.html Connections are instances of ``DBConnection`` class (from `sqlobject/dbconnection.py`_) and its concrete descendants. ``DBConnection`` contains generic code for generating SQL, working with transactions and so on. Concrete connection classes (like ``PostgresConnection`` and ``SQLiteConnection``) provide backend-specific functionality. .. _`sqlobject/dbconnection.py`: sqlobject/dbconnection.py.html Columns, validators and converters ---------------------------------- Columns are instances of classes from `sqlobject/col.py`_. There are two classes for every column: one is for user to include into an instance of SQLObject, an instance of the other is automatically created by SQLObject's metaclass. The two classes are usually named ``Col`` and ``SOCol``; for example, ``BoolCol`` and ``SOBoolCol``. User-visible classes, descendants of ``Col``, seldom contain any code; the main code for a column is in ``SOCol`` descendants and in validators. .. _`sqlobject/col.py`: sqlobject/col.py.html Every column has a list of validators. Validators validate input data and convert input data to python data and back. Every validator must have methods ``from_python`` and ``to_python``. The former converts data from python to internal representation that will be converted by converters to SQL strings. The latter converts data from SQL data to python. Also please bear in mind that validators can receive ``None`` (for SQL ``NULL``) and ``SQLExpression`` (an object that represents SQLObject expressions); both objects must be passed unchanged by validators. Converters from `sqlobject/converters.py`_ aren't visible to users. They are used behind the scene to convert objects returned by validators to backend-specific SQL strings. The most elaborated converter is ``StringLikeConverter``. Yes, it converts strings to strings. It converts python strings to SQL strings using backend-specific quoting rules. .. _`sqlobject/converters.py`: sqlobject/converters.py.html Let look into ``BoolCol`` as an example. The very ``BoolCol`` doesn't have any code. 
``SOBoolCol`` has a method to create ``BoolValidator`` and methods to create backend-specific column type. ``BoolValidator`` has identical methods ``from_python`` and ``to_python``; the method passes ``None``, ``SQLExpression`` and bool values unchanged; int and objects that have method ``__nonzero__`` (``__bool__`` in Python 3) are converted to bool; other objects trigger validation error. Bool values that are returned by call to ``from_python`` will be converted to SQL strings by ``BoolConverter``; bool values from ``to_python`` (is is supposed they are originated from the backend via DB API driver) are passed to the application. Objects that are returned from ``from_python`` must be registered with converters. Another approach for ``from_python`` is to return an object that has ``__sqlrepr__`` method. Such objects convert to SQL strings themselves, converters are not used. Branch workflow =============== Initially ``SQLObject`` was being developed using ``Subversion``. Even after switching to git development process somewhat preserves the old workflow. The ``trunk``, called ``master`` in git, is the most advanced and the most unstable branch. It is where new features are applied. Bug fixes are applied to ``oldstable`` and ``stable`` branches and are merged upward -- from ``oldstable`` to ``stable`` and from ``stable`` to ``master``. Style Guide =========== Generally you should follow the recommendations in `PEP 8`_, the Python Style Guide. Some things to take particular note of: .. _PEP 8: http://www.python.org/dev/peps/pep-0008/ * With a few exceptions sources must be `flake8`_-clean (and hence pep8-clean). Please consider using pre-commit hook installed by running ``flake8 --install-hook``. .. _flake8: https://gitlab.com/pycqa/flake8 * **No tabs**. Not anywhere. Always indent with 4 spaces. * We don't stress too much on line length. But try to break lines up by grouping with parenthesis instead of with backslashes (if you can). Do asserts like:: assert some_condition(a, b), ( "Some condition failed, %r isn't right!" % a) * But if you are having problems with line length, maybe you should just break the expression up into multiple statements. * Blank lines between methods, unless they are very small and closely bound to each other. * *Never* use the form ``condition and trueValue or falseValue``. Break it out and use a variable. * Careful of namespace pollution. SQLObject does allow for ``from sqlobject import *`` so names should be fairly distinct, or they shouldn't be exported in ``sqlobject.__init__``. * We're very picky about whitespace. There's one and only one right way to do it. Good examples:: short = 3 longerVar = 4 if x == 4: do stuff func(arg1='a', arg2='b') func((a + b)*10) **Bad** examples:: short =3 longerVar=4 if x==4: do stuff func(arg1 = 'a', arg2 = 'b') func(a,b) func( a, b ) [ 1, 2, 3 ] To us, the poor use of whitespace seems lazy. We'll think less of your code (justified or not) for this very trivial reason. We will fix all your code for you if you don't do it yourself, because we can't bear to look at sloppy whitespace. * Use ``@@`` to mark something that is suboptimal, or where you have a concern that it's not right. Try to also date it and put your username there. * Docstrings are good. They should look like:: class AClass(object): """ doc string... """ Don't use single quotes ('''). Don't bother trying make the string less vertically compact. * Comments go right before the thing they are commenting on. * Methods never, ever, ever start with capital letters. 
Generally only classes are capitalized. But definitely never methods. * mixedCase is preferred. * Use ``cls`` to refer to a class. Use ``meta`` to refer to a metaclass (which also happens to be a class, but calling a metaclass ``cls`` will be confusing). * Use ``isinstance`` instead of comparing types. E.g.:: if isinstance(var, str): ... # Bad: if type(var) is StringType: ... * Never, ever use two leading underscores. This is annoyingly private. If name clashes are a concern, use name mangling instead (e.g., ``_SO_blahblah``). This is essentially the same thing as double-underscore, only it's transparent where double underscore obscures. * Module names should be unique in the package. Subpackages shouldn't share module names with sibling or parent packages. Sadly this isn't possible for ``__init__``, but it's otherwise easy enough. * Module names should be all lower case, and probably have no underscores (smushedwords). Testing ======= Tests are important. Tests keep everything from falling apart. All new additions should have tests. Testing uses pytest, an alternative to ``unittest``. It is available at http://pytest.org/ and https://pypi.python.org/pypi/pytest. Read its `getting started`_ document for more. .. _getting started: http://docs.pytest.org/en/latest/getting-started.html To actually run the test, you have to give it a database to connect to. You do so with the option ``-D``. You can either give a complete URI or one of several shortcuts like ``mysql`` (these shortcuts are defined in the top of ``tests/dbtest.py``). All the tests are modules in ``sqlobject/tests``. Each module tests one kind of feature, more or less. If you are testing a module, call the test module ``tests/test_modulename.py`` -- only modules that start with ``test_`` will be picked up by pytest. The "framework" for testing is in ``tests/dbtest``. There's a couple of important functions: ``setupClass(soClass)`` creates the tables for the class. It tries to avoid recreating tables if not necessary. ``supports(featureName)`` checks if the database backend supports the named feature. What backends support what is defined at the top of ``dbtest``. If you ``import *`` you'll also get pytest's version of raises_, an ``inserts`` function that can create instances for you, and a couple miscellaneous functions. .. _raises: http://docs.pytest.org/en/latest/assert.html#assertions-about-expected-exceptions If you submit a patch or implement a feature without a test, we'll be forced to write the test. That's no fun for us, to just be writing tests. So please, write tests; everything at least needs to be exercised, even if the tests are absolutely complete. We now use Travis CI and AppVeyor to run tests. See the statuses: .. image:: https://travis-ci.org/sqlobject/sqlobject.svg?branch=master :target: https://travis-ci.org/sqlobject/sqlobject .. image:: https://ci.appveyor.com/api/projects/status/github/sqlobject/sqlobject?branch=master :target: https://ci.appveyor.com/project/phdru/sqlobject To avoid triggering unnecessary test run at CI services add text `[skip ci] `_ or ``[ci skip]`` anywhere in your commit messages for commits that don't change code (documentation updates and such). We use `coverage.py `_ to measures code coverage by tests and upload the result for analyzis to `Coveralls `_ and `Codecov `_: .. image:: https://coveralls.io/repos/github/sqlobject/sqlobject/badge.svg?branch=master :target: https://coveralls.io/github/sqlobject/sqlobject?branch=master .. 
image:: https://codecov.io/gh/sqlobject/sqlobject/branch/master/graph/badge.svg :target: https://codecov.io/gh/sqlobject/sqlobject Documentation ============= Please write documentation. Documentation should live in the docs/ directory in reStructuredText format. We use Sphinx to convert docs to HTML. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/community.rst0000644000175000017500000000265013115624261016366 0ustar phdphd00000000000000SQLObject Community =================== SQLObject questions and discussion happens on the `sqlobject-discuss mailing list `_. Bugs should be submitted to the `GitHub issue tracker `_ or `bug tracker at SourceForge `_, and patches as `pull requests at GitHub `_ or `to the patch tracker at SF `_. Development takes place in the `git repositories `_. There are `development docs`_, i.e the docs from the development branch (master). If you are interested in contributing you should read the `Developer Guide `_. .. _`development docs`: /devel/ The `Author List `_ tries to list all the major contributors. Questions can also be asked and answered on `StackOverflow `_. One can also contribute to `community-editable recipe/documentation site `_. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/Python3.rst0000644000175000017500000000520312752476767015731 0ustar phdphd00000000000000++++++++++++++++++++++ SQLObject and Python 3 ++++++++++++++++++++++ .. contents:: Changes between Python 2 and Python 3 ------------------------------------- There are a few changes in the behaviour of SQLObject on Python 3, due to the changed stings / bytes handling introduced in Python 3.0. BLOBCol ~~~~~~~ In Python 3, BLOBCol now accepts and returns bytes, rather than strings as it did in Python 2. StringCol ~~~~~~~~~ In Python 3, StringCol now accepts arbitrary Unicode strings. UnicodeCol ~~~~~~~~~~ The dbEncoding parameter to UnicodeCol has no effect in Python 3 code. This is now handled by the underlying database layer and is no longer exposed via SQLObject. The parameter is still available for those writing Python 2 compatible code. Python 3 and MySQL ------------------ SQLObject is tested using mysqlclient_ as the database driver on Python 3. Note that the default encoding of MySQL databases is *latin1*, which can cause problems with general Unicode strings. We recommend specifying the character set as *utf8* when using MySQL to protect against these issues. .. _mysqlclient: https://pypi.python.org/pypi/mysqlclient Using databases created with SQLObject and Python 2 in Python 3 --------------------------------------------------------------- For most cases, things should just work as before. The only issues should around UnicodeCol, as how this is handled has changed. SQLite ~~~~~~ The Python 3 sqlite driver expects Unicode columns to be encoded using utf8. Columns created using the default encoding on Python 2 should work fine, but columns created with a different encoding set using the dbEncoding parameter may cause problems. Postgres ~~~~~~~~ Postgres' behaviour is similar to sqlite. 
Columns created using the default encoding on Python 2 should work fine, but columns created with a different encoding set using the dbEncoding may cause problems. MySQL ~~~~~ For MySQL, the results depend on whether the Python 2 database was using MySQLdb's Unicode mode or not. If a character set was specified for the database using the charset parameter, such as:: mysql:///localhost/test?charset=latin1 Things should work provided the same character set is specified when using Python 3. If a character set wasn't specified, then things may work if the character set is set to match the dbEncoding parameter used when defining the UnicodeCol. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/test.py0000755000175000017500000000044312752476767015170 0ustar phdphd00000000000000#!/usr/bin/env python import doctest import os import sys sys.path.insert( 0, os.path.dirname(os.path.dirname(os.path.abspath(__file__)))) def test(): for doc in ['SQLObject.rst']: doctest.testfile(doc, optionflags=doctest.ELLIPSIS) if __name__ == '__main__': test() SQLObject-3.4.0/docs/News1.rst0000644000175000017500000003516612752476767015375 0ustar phdphd00000000000000++++ News ++++ .. contents:: Contents: :backlinks: none .. _start: SQLObject 0.6.1 =============== Interface Changes ----------------- * The long broken and unused ``DBMConnection`` has been removed. * Added a connection parameter to all class methods (patch 974755) * Connection objects have a ``.module`` attribute, which points to the DB-API module. This is useful for getting access to the exception objects. Features -------- * New ``UnicodeCol()`` that converts to and from Unicode in the database. See docs_. .. _docs: SQLObject.html#subclasses-of-col * Added indexing (from Jeremy Fitzhardinge). See `the documentation`__ for more. .. __: SQLObject.html#indexes * All connections are explicitly closed, not just garbage collected. Many database drivers don't close database connections properly when the connection object is garbage collected. * New ``distinct`` option to selects, like ``MyClass.select(..., distinct=True)`` * You can now do ``MyClass.selectBy(joinedTable=joinedTableInstance)``, where before you had to do ``MyClass.selectBy(joinedTableID=joinedTableInstance.id)``. (From Dave Cook) SQLObject 0.6 ============= Interface Changes ----------------- * Lazy updates. Add ``_lazyUpdate=True`` to your class, and updates will only be written when you call ``obj.syncUpdate()`` or ``obj.sync()`` (``sync`` also refetches the data from the database, which ``syncUpdate`` does not do). When enabled, instances have a property ``dirty``, which indicates if they have pending updates. Inserts are still done immediately. * Separated database drivers (PostgresConnection, MySQLConnection, etc.) into separate packages. You can access the driver through URIs, like ``mysql://user:pass@host/dbname`` -- to set drivers after class creation you should use `sqlobject.connectionForURI()`. * The ``SQLObject`` package has been renamed to ``sqlobject``. This makes it similar to several other packages, and emphasizes the distinction between the ``sqlobject`` package and the ``SQLObject`` class. 
* Class instantiation now creates new rows (like `.new()` used to do), and the `.get()` method now retrieves objects that already have rows (like class instantiation used to do). * We're now using a Subversion repository instead of CVS. It is located at http://svn.colorstudy.com/trunk/SQLObject * If you pass ``forceDBName=True`` to the ``*Col`` constructors, then your column name doesn't have to be restricted to a-z, 0-9, and _. * ``*Col`` constructors now support cascade: ``cascade=None`` (default) means no constraint; ``cascade=True`` means that if the foreign key is deleted, the object will be deleted; ``cascade=False`` means that the delete will fail; ``cascade="null"`` means that the column will be set to NULL. The constraints are only implemented in the DBMS, not in SQLObject (i.e., they will not work in databases like MySQL and SQLite). * New ``_create(id, **kw)`` method that can be overridden to intercept and modify attempts to insert rows in the database. * You can specify ``_idType`` in your class, like ``_idType = str``. The default type is ``int``; i.e., IDs are coerced to integers. This is a temporary interface; a more general specifier for primary keys will be added later. * New classmethod ``createTableSQL()`` method for SQLObject classes, which returns the SQL that can be used to create the table. Analog to ``createTable()``. Bugs ---- * SQLite booleans fixed. * You can now use ``sqlite:/:memory:`` to store the database in memory. * Some bugs resolved when caching is turned off (SF 956847) SQLObject 0.5.3 =============== Bugs ---- * Python 2.2 booleans fixed (SF: 903488) * Longs (e.g., ``1L``) get converted properly (SF: 939965) SQLObject 0.5.2 =============== We're now using Subversion instead of CVS. The repository is located at svn://colorstudy.com/trunk/SQLObject Interface Changes ----------------- * If you commit or rollback a transaction, you must call ``trans.begin()`` to restart the transaction. Any database access on the transaction inbetween commit/rollback and being will result in an AssertionError. (It's also acceptable to create a new transaction object instead of reusing the old one, but objects in that transaction will be invalid) Bugs ---- * Using .select() would hold on to a connection, and also release it back to the connection pool. Very un-threadsafe and all-around bad. * Fixed bug which did not release connections after database (query) error. * When setting columns that use validators, the Pythonic (vs. database) representation wasn't being stored in the column. Now we roundtrip (through toPython and fromPython) the values when they get set. * PostgreConnection is back to using sequences for ID generation, instead of oids. Long explanation -- oids can be unindexed in some versions of Postgres, or not even exist. * When turning caching off and using transactions, got an attribute error on rollback. * Rollback or commit didn't find objects that were expired from the cache but still in memory. * Rollback or commit didn't free the connection object, so as you created more transactions it stole connections and didn't put them back in the pool. SQLObject 0.5.1 =============== Released: 12-Nov-2003 Interface Changes ----------------- * Select results no longer have a __len__ method (i.e., you can't do ``len(Person.select(Person.q.firstName=='Bob'))``). There is now a ``.count()`` method instead. ``__len__`` gets called implicitly in several circumstances, like ``list()``, which causes potentially expensive queries to ``COUNT(*)``. 
Bugs ---- * Objects retrieved from a join now respect the transaction context of the original instance. * ``.select().reversed()`` works. SQLObject 0.5 ============= Released: 1-Nov-2003 Features -------- * Firebird_ support. * Database-specific literal quoting (motivation: MySQL and Postgres use backslashes, Firebird and SQLite do not). * Generic conversion/validation can be added to columns. * BoolCol for portable boolean columns (BOOL on Postgres, INT on MySQL, etc.) * Non-integer IDs. (Automatic table creation is not supported for non-integer IDs) * Explicit IDs for new instances/rows (required for non-integer IDs). * Instances can be synced with the database (in case there have been updates to the object since it was first fetched). * Instances can be expired, so that they will be synced when they are next accessed. .. _Firebird: http://firebird.sourceforge.net/ Interface Changes ----------------- * `SQLBuilder.sqlRepr` renamed to `SQLBuilder.sqlrepr`, signature changed to ``sqlrepr(value, databaseName)`` to quote ``value``, where ``databaseName`` is one of ``"mysql"``, ``"postgres"``, ``"sqlite"``, ``"firebird"``. * ``sqlRepr`` magic method renamed to ``__sqlrepr__``, and takes new ``databaseName`` argument. * When using explicit booleans, use ``Col.TRUE`` and ``Col.FALSE`` for backward compatibility with Python 2.2. This is not required for ``BoolCol``, however (which converts all true values to TRUE and false values to FALSE) * SQLObject has a ``sqlrepr`` method, so you can construct queries with something like ``"WHERE last_name = %s" % Person.sqlrepr('Bob')`` Bugs ---- * Released all locks with ``finally:``, so that bugs won't cause frozen locks. * Tons of transaction fixes. Transactions pretty much work. * A class can have multiple foreign keys pointing to the same table (e.g., ``spouse = ForeignKey("Person"); supervisor = ForeignKey("Person")``) SQLObject 0.4 ============= Features -------- * You can specify columns in a new, preferred manner:: class SomeObject(SQLObject): someColumn = Col() Equivalent to:: class SomeObject(SQLObject): _columns = [Col('someColumn')] Ditto joins. * Cache objects have a clear method, which empties all objects. However, weak references to objects *are* maintained, so the integrity of the cache can be ensured. * SQLObject subclasses can be further subclassed, adding or removing column definitions (as well as changing settings like connection, style, etc). Each class still refers to a single concrete table in the database -- the class hierarchy is not represented in the database. * Each SQLObject subclass can have an associated style, as given in the `_style` attribute. This object is used to map between Python and database names (e.g., the column name for a Python attribute). Some samples are available in the `Style` module. * Postgres support for `_fromDatabase` (reading a table definition from the database, and creating a class from that). * Postgres id columns more permissive, you don't have to create a specially named sequence (or implicitly create that sequence through ``SERIAL``). lastoid is used instead. * MySQL uses ``localhost`` as the default host, and the empty string as the default password. * Added functions for use with queries: `ISNULL`, `ISNOTNULL`. ``==`` and ``!=`` can be used with None, and is translated into `ISNULL`, `ISNOTNULL`. * Classes can be part of a specific registry. Since classes are referred to by name in several places, the names have to be unique. 
This can be problematic, so you can add a class variable `_registry`, the value of which should be a string. Classes references are assumed to be inside that registry, and class names need only be unique among classes in that registry. * ``SomeClass.select()`` selects all, instead of using ``SomeClass.select('all')``. You can also use None instead of ``'all'``. * Trying to fetch non-existent objects raises `SQLObjectNotFound`, which is a subclass of the builtin exception `LookupError`. This may not be raised if `_cacheValues` is False and you use the ID to fetch an object (but alternateID fetches will raise the exception in either case). * Can order by descending order, with the `reversed` option to the `select` method, or by prefixing the column with a ``"-"``. * Ordering with joins works better -- you can order with multiple columns, as well as descending ordering. Col and Join ~~~~~~~~~~~~ * `Join` constructors have an argument `orderBy`, which is the name of a Python attribute to sort results by. If not given, the appropriate class's `_defaultOrder` will be used. None implies no sorting (and ``orderBy=None`` will override `_defaultOrder`). * `ForeignKey` class (subclass of `Col`), for somewhat easier/clearer declaration of foreign keys. * `Col` (and subclasses) can take a `sqlType` argument, which is used in table definitions. E.g., ``Col(sqlType="BOOLEAN")`` can be used to create a ``BOOLEAN`` column, even though no `BooleanCol` exists. * `alternateID` (a specifier for columns) implies ``NOT NULL``. Also implies ``UNIQUE``. * `unique` (a specifier for columns) added. * `DecimalCol` and `CurrencyCol` added. * `EnumCol` uses constraints on Postgres (if you use `createTable`). Bugs ---- * `DateTimeCol` uses ``TIMESTAMP`` for Postgres. Note that the Python type name is used for column names, not necessarily the SQL standard name. * Foreign key column names are slightly more permissive. They still need to end in ``id``, but it's case insensitive. * _defaultOrder should be the python attribute's name, not the database name. * SomeClass.q.colName uses proper Python attributes for colName, and proper database names when executed in the database. * SQLite select results back to being proper iterator. * SomeClass.q.colName now does proper translation to database names, using dbName, etc., instead of being entirely algorithm-driven. * Raise `TypeError` if you pass an unknown argument to the `new` method. * You can override the _get_* or _set_* version of a property without overriding the other. * Python 2.3 compatible. * Trying to use ``Col('id')`` or ``id = Col()`` will raise an exception, instead of just acting funky. * ``ForeignKey`` columns return None if the associated column is NULL in the database (used to just act weird). * Instantiating an object with an id of None will give an error, instead of just acting weird. Internal -------- * `Col` class separated into `Col` and `SOCol` (and same for all other `*Col` classes). `Col` defines a column, `SOCol` is that definition bound to a particular SQLObject class. * Instance variable ``_SO_columns`` holds the `SOCol` instances. SQLObject 0.3 ============= Features -------- * Table creation (SQL schema generation) via new class method `createTable`. And of course a `dropTable` method to go with. * Add and remove columns at runtime, optionally modifying the schema in the database (via ``ALTER``). 
(Does not work in SQLite) * New column classes (see `Col` module), indicates type * Classes can be created by parsing an already existant table (MySQL only). * Objects are not cached indefinitely. Cached objects are expired into a weak dictionary (it allows objects to be garbage collected if nowhere else in the program is using the object, but until it is collected it's still available to the cache). Some cache control, pass ``nocache=True`` to your connection object to eliminate as much caching as possible. See `Cache` module for a bit more. * New DBMConnection, implements a database-like backend without any database to speak of, including queries (so long as you use `SQLBuilder` and don't generate your where clauses manually). Actual SQL generation is done entirely by the database connection, allowing portability across very different backends. * Postgres table IDs should be created with type ``SERIAL`` (which implicitly creates a sequence). * New `_defaultOrder` class variable gives a default for the `orderBy` parameter to `select` queries. Bugs ---- * LIMIT/OFFSET (select result slicing) works in Postgres and SQLite. * ``tableExists`` method from DBConnection works in same. * mxDateTime not required (never should have been, always just an option). SQLObject 0.2.1 =============== Bugs ---- * Fixed caching of new objects Features -------- * SQLite_ support * Select statements are lazily generated, retrieve full rows for speed, and are slicable (`select docs`_). * `alternateID` option for `Col` objects -- select individual objects via UNIQUE columns, e.g., a username (`Col docs`_). .. _SQLite: http://sqlite.org/ .. _select docs: SQLObject.html#selecting-multiple-objects .. _Col docs: SQLObject.html#col-class-specifying-columns .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/SelectResults.rst0000644000175000017500000001534613043410220017135 0ustar phdphd00000000000000SelectResults: Using Queries ============================ .. contents:: Contents: Overview -------- SelectResults are returned from ``.select`` and ``.selectBy`` methods on SQLObject classes, and from ``SQLMultipleJoin``, and ``SQLRelatedJoin`` accessors on SQLObject instances. SelectResults are generators, which are lazily evaluated. The SQL is only executed when you iterate over the SelectResults, fetching rows one at a time. This way you can iterate over large results without keeping the entire result set in memory. You can also do things like ``.reversed()`` without fetching and reversing the entire result -- instead, SQLObject can change the SQL that is sent so you get equivalent results. .. note:: To retrieve the results all at once use the python idiom of calling ``list()`` on the generator to force execution and convert the results to a stored list. You can also slice SelectResults. This modifies the SQL query, so ``peeps[:10]`` will result in ``LIMIT 10`` being added to the end of the SQL query. If the slice cannot be performed in the SQL (e.g., peeps[:-10]), then the select is executed, and the slice is performed on the list of results. This will generally only happen when you use negative indexes. In certain cases, you may get a SelectResults instance with an object in it more than once, e.g., in some joins. 
If you don't want this, you can add the keyword argument ``MyClass.select(..., distinct=True)``, which results in a ``SELECT DISTINCT`` call. You can get the length of the result without fetching all the results by calling ``count`` on the result object, like ``MyClass.select().count()``. A ``COUNT(*)`` query is used -- the actual objects are not fetched from the database. Together with slicing, this makes batched queries easy to write:: start = 20 size = 10 query = Table.select() results = query[start:start+size] total = query.count() print "Showing page %i of %i" % (start/size + 1, total/size + 1) .. note:: There are several factors when considering the efficiency of this kind of batching, and it depends very much how the batching is being used. Consider a web application where you are showing an average of 100 results, 10 at a time, and the results are ordered by the date they were added to the database. While slicing will keep the database from returning all the results (and so save some communication time), the database will still have to scan through the entire result set to sort the items (so it knows which the first ten are), and depending on your query may need to scan through the entire table (depending on your use of indexes). Indexes are probably the most important way to improve importance in a case like this, and you may find caching to be more effective than slicing. In this case, caching would mean retrieving the *complete* results. You can use ``list(MyClass.select(...))`` to do this. You can save these results for some limited period of time, as the user looks through the results page by page. This means the first page in a search result will be slightly more expensive, but all later pages will be very cheap. Retrieval Methods ----------------- Iteration ~~~~~~~~~ As mentioned in the overview, the typical way to access the results is by treating it as a generator and iterating over it (in a loop, by converting to a list, etc). ``getOne(default=optional)`` ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In cases where your restrictions cause there to always be a single record in the result set, this method will return it or raise an exception: SQLObjectIntegrityError if more than one result is found, or SQLObjectNotFound if there are actually no results, unless you pass in a default like ``.getOne(None)``. Cloning Methods --------------- These methods return a modified copy of the SelectResults instance they are called on, so successive calls can chained, eg ``results = MyClass.selectBy(city='Boston').filter(MyClass.q.commute_distance>10).orderBy('vehicle_mileage')`` or used independently later on. ``orderBy(column)`` ~~~~~~~~~~~~~~~~~~~ Takes a string column name (optionally prefixed with '-' for DESCending) or a `SQLBuilder expression`_. ``limit(num)`` ~~~~~~~~~~~~~~ Only return first num many results. Equivalent to results[:num] slicing. ``lazyColumns(v)`` ~~~~~~~~~~~~~~~~~~ Only fetch the IDs for the results, the rest of the columns will be retrieved when attributes of the returned instances are accessed. ``reversed()`` ~~~~~~~~~~~~~~ Reverse-order. Alternative to calling orderBy with SQLBuilder.DESC or '-'. ``distinct()`` ~~~~~~~~~~~~~~ In SQL, SELECT DISTINCT, removing duplicate rows. ``filter(expression)`` ~~~~~~~~~~~~~~~~~~~~~~ Add additional expressions to restrict result set. Takes either a string static SQL expression valid in a WHERE clause, or a `SQLBuilder expression`_. ANDed with any previous expressions. .. 
_`SQLBuilder expression`: SQLBuilder.html Aggregate Methods ----------------- These return column values (strings, numbers, etc) not new SQLResults instances, by making the appropriate SQL query (the actual result rows are not retrieved). Any that take a column can also take a SQLBuilder column instance, e.g. ``MyClass.q.size``. ``count()`` ~~~~~~~~~~~ Returns the length of the result set, by a SQL ``SELECT COUNT(...)`` query. ``sum(column)`` ~~~~~~~~~~~~~~~ The sum of values for ``column`` in the result set. ``min(column)`` ~~~~~~~~~~~~~~~ The minimum value for ``column`` in the result set. ``max(column)`` ~~~~~~~~~~~~~~~ The maximum value for ``column`` in the result set. ``avg(column)`` ~~~~~~~~~~~~~~~ The average value for the ``column`` in the result set. Traversal to related SQLObject classes -------------------------------------- ``throughTo.join_name and throughTo.foreign_key_name`` ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ This accessor lets you retrieve the objects related to your SelectResults by either a join or foreign key relationship, in the same manner as the cloning methods above. For instance:: Schools.select(Schools.q.student_satisfaction>90).throughTo.teachers returns a SelectResults of Teachers of Schools with satisfied students, assuming Schools has a SQLMultipleJoin or SQLRelatedJoin attribute named ``teachers``. Similarily, with a self-joining foreign key named ``father``:: Person.select(Person.q.name=='Steve').throughTo.father.throughTo.father returns a SelectResults of Persons who are the paternal grandfather of someone named ``Steve``. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/News2.rst0000644000175000017500000003725012752476767015372 0ustar phdphd00000000000000++++ News ++++ .. contents:: Contents: :backlinks: none .. _start: SQLObject 0.8.7 =============== Released 10 Jan 2008. * A number of changes ported from `SQLObject 0.7.10`_. SQLObject 0.8.6 =============== Released 30 Oct 2007. * Removed SelectResults.__nonzero__, which was a design mistake. Raising an exception in __nonzero__() is inconsistent with other iterators (bool(iter([])) => True). * A number of changes ported from `SQLObject 0.7.9`_. SQLObject 0.8.5 =============== Released 25 July 2007. Bug Fixes --------- * Suppress the second RowUpdateSignal in .set() called from ._SO_setValue(). SQLObject 0.8.4 =============== Released 10 May 2007. Bug Fixes --------- * A number of bugfixes forward-ported from 0.7.7. SQLObject 0.8.3 =============== Released 3 May 2007. Bug Fixes --------- * A number of bugfixes forward-ported from 0.7.6. SQLObject 0.8.2 =============== Released 11 Apr 2007. Bug Fixes --------- * Fixed ConnectionHub.doInTransaction() - if the original connection was processConnection - reset processConnection, not threadConnection. SQLObject 0.8.1 =============== Released 19 Mar 2007. Bug Fixes --------- * ID columns are reverted back from INT UNSIGNED to INT for MySQL to be in accord with FOREIGN KEYs. * Fixed return value from Firebird/MaxdbConnection.createTable(). * Fixed and simplified DatabaseIndex.get(). * Fixed ConnectionHub.doInTransaction() - close low-level connection on commit() to prevent connections leaking. SQLObject 0.8.0 =============== Released 12 Feb 2007. 
Features & Interface -------------------- * It is now possible to create tables that reference each other. Constraints (in the DBMSes that support constraints) are added after the tables have been created. * Added ``createSQL`` as an option for sqlmeta. Here you can add related SQL you want executed by sqlobject-admin create after table creation. createSQL expects a string, list, or dictionary. If using a dictionary the key should be a dbName value (ex. 'postgres') and the value should be a string or list. Examples in sqlobject/tests/test_sqlobject_admin.py or at * Added method ``sqlhub.doInTransaction(callable, *args, **kwargs)``, to be used like:: sqlhub.doInTransaction(process_request, os.environ) This will run ``process_request(os.environ)``. The return value will be preserved. * Added method ``.getOne([default])`` to ``SelectResults`` (these are the objects returned by ``.select()`` and ``.selectBy()``). This returns a single object, when the query is expected to return only one object. The single argument is the value to return when zero results are found (more than one result is always an error). If no default is given, it is an error if no such object exists. * Added a WSGI middleware (in ``sqlobject.wsgi_middleware``) for configuring the database for the request. Also handles transactions. Available as ``egg:SQLObject`` in Paste Deploy configuration files. * New joins! ManyToMany and OneToMany; not fully documented yet, but still more sensible and smarter. * SELECT FOR UPDATE * New module dberrors.py - a hierarchy of exceptions. Translation of DB API module's exceptions to the new hierarchy is performed for SQLite and MySQL. * SQLiteConnection got a new keyword "factory" - a name or a reference to a factory function that returns a connection class; useful for implementing functions or aggregates. See test_select.py and test_sqlite_factory.py for examples. * SQLObject now disallows columns with names that collide with existing variables and methods, such as "_init", "expire", "set" and so on. Small Features -------------- * Configurable client character set (encoding) for MySQL. * Added a close option to .commit(), so you can close the transaction as you commit it. * DecimalValidator. * Added .expireAll() methods to sqlmeta and connection objects, to expire all instances in those cases. * String IDs. * FOREIGN KEY for MySQL. * Support for sqlite3 (a builtin module in Python 2.5). * SelectResults cannot be queried for truth value; in any case it was meaningless - the result was always True; now __nonzero__() raises NotImplementedError in case one tries bool(MyTable.select()) or "if MyTable.select():..." * With empty parameters AND() and OR() returns None. * Allows to use set/frozenset sets/Set/ImmutableSet sets as sequences passed to the IN operator. * ID columns are now INT UNSIGNED for MySQL. Bug Fixes --------- * Fixed problem with sqlite and threads; connections are no longer shared between threads for sqlite (except for :memory:). * The reference loop between SQLObject and SQLObjectState eliminated using weak references. * Another round of bugfixes for MySQL errors 2006 and 2013 (SERVER_GONE, SERVER_LOST). * Fixed a bug in MSSQLConnection caused by column names being unicode. * Fixed a bug in FirebirdConnection caused by column names having trailing spaces. * Order by several columns with inheritance. * Fixed aggregators and accumulators with inheritance. SQLObject 0.7.10 ================ Released 10 Jan 2008. 
* With PySQLite2 do not use encode()/decode() from PySQLite1 - always use base64 for BLOBs. * MySQLConnection doesn't convert query strings to unicode (but allows to pass unicode query strings if the user build ones). DB URI parameter sqlobject_encoding is no longer used. SQLObject 0.7.9 =============== Released 30 Oct 2007. Bug Fixes --------- * Remove 'limit' from SelectResults after setting start/end so .clone() never sees limit again. * Fixed a bug in sqlbuilder._LikeQuoted() - call sqlrepr() on the expression to escape single quotes if the expression is a string. * Fixed StringCol and UnicodeCol: use sqlType with MSSQL. * Fixed startswith/endswith/contains for UnicodeCol. Other Changes ------------- * Changed the default value for 'varchar' in BLOBColumns from 'auto' to False (so that the default type for the columns in MySQL is BLOB, not TEXT). * Changed the implementation type in BoolCol under MySQL from TINYINT to BOOL (which is a synonym for TINYINT(1)). SQLObject 0.7.8 =============== Released 25 July 2007. Bug Fixes --------- * Replaced calls to style.dbColumnToPythonAttr() in joins.py by name/dbName lookup in case the user named columns differently using dbName. * Minor correction in the tests: we fully support EnumCol in Postgres. * MySQLConnection now recognizes Enum, Double and Time columns when drawing the database scheme from DB. * Minor fix in FirebirdConnection.fromDatabase. * Fixed a bug with default field values for columns for Firebird connection. * Fixed a bug in col.createSQL(). * Fixed a bug in converting date/time for years < 1000 (time.strptime() requires exactly 4 digits for %Y, hence a year < 1000 must be 0-padded). Other Changes ------------- * Changed string quoting style for PostgreSQL and MySQL from \\' to ''. SQLObject 0.7.7 =============== Released 10 May 2007. Bug Fixes --------- * Fixed a bug in SQLRelatedJoin that ignored per-instance connection. * Fixed a bug in MySQL connection in case there is no charset in the DB URI. SQLObject 0.7.6 =============== Released 3 May 2007. Bug Fixes --------- * Fixed a longstanding bug with .select() ignoring 'limit' parameter. * Fixed a bug with absent comma in JOINs. * Fixed sqlbuilder - .startswith(), .endswith() and .contains() assumed their parameter must be a string; now you can pass an SQLExpression: Table.q.name.contains(func.upper('a')), for example. * Fixed a longstanding bug in sqlbuilder.Select() with groupBy being a sequence. * Fixed a bug with Aliases in JOINs. * Yet another patch to properly initialize MySQL connection encoding. * Fixed a minor comparison problem in test_decimal.py. * More documentation about orderBy. SQLObject 0.7.5 =============== Released 11 Apr 2007. Bug Fixes --------- * Fixed test_deep_inheritance.py - setup classes in the correct order (required for Postgres 8.0+ which is strict about referential integrity). * Fixed a bug in DateValidator caused by datetime being a subclass of date. SQLObject 0.7.4 =============== Released 19 Mar 2007. Small Features -------------- * For MySQLdb 1.2.2+ call ping(True) on the connection to allow autoreconnect after a timeout. Bug Fixes --------- * Another round of changes to create/drop the tables in the right order in the command-line client `sqlobject-admin`. * Fixed a bug in UnicodeField - allow comparison with None. SQLObject 0.7.3 =============== Released 30 Jan 2007. Bug Fixes --------- * Allow multiple MSSQL connections. * Psycopg1 requires port to be a string; psycopg2 requires port to be an int. 
* Fixed a bug in MSSQLConnection caused by column names being unicode. * Fixed a bug in FirebirdConnection caused by column names having trailing spaces. * Fixed a missed import in firebirdconnection.py. * Remove a leading slash in FirebirdConnection. * Fixed a bug in deep Inheritance tree. SQLObject 0.7.2 =============== Released 20 Nov 2006. Features & Interface -------------------- * sqlbuilder.Select now supports JOINs exactly like SQLObject.select. * destroySelf() removes the object from related joins. Bug Fixes --------- * Fixed a number of unicode-related problems with newer MySQLdb. * If the DB API driver returns timedelta instead of time (MySQLdb does this) it is converted to time; but if the timedelta has days an exception is raised. * Fixed a number of bugs in InheritableSQLObject related to foreign keys. * Fixed a bug in InheritableSQLObject related to the order of tableRegistry dictionary. * A bug fix that allows to use SQLObject with DateTime from Zope. Documentation Added ------------------- * Added "How can I define my own intermediate table in my Many-to-Many relationship?" to FAQ. SQLObject 0.7.1 =============== Released 25 Sep 2006. Features & Interface -------------------- * Added support for psycopg2_ .. _psycopg2: http://initd.org/projects/psycopg2 * Added support for MSSQL. * Added ``TimeCol``. * ``RelatedJoin`` and ``SQLRelatedJoin`` objects have a ``createRelatedTable`` keyword argument (default ``True``). If ``False``, then the related table won't be automatically created; instead you must manually create it (e.g., with explicit SQLObject classes for the joins). * Implemented ``RLIKE`` (regular expression LIKE). * Moved _idSequence to sqlmeta.idSequence. Small Features -------------- * Select over RelatedJoin. * SQLite foreign keys. * Postgres DB URIs with a non-default path to unix socket. * Allow the use of foreign keys in selects. * Implemented addColumn() for SQLite. * With PySQLite2 use encode()/decode() from PySQLite1 for BLOBCol if available; else use base64. Bug Fixes --------- * Fixed a longstanding problem with UnicodeCol - at last you can use unicode strings in .select() and .selectBy() queries. There are some limitations, though; see the description of the UnicodeCol_. .. _UnicodeCol: SQLObject.html#column-types * Cull patch (clear cache). * .destroySelf() inside a transaction. * Synchronize main connection cache during transaction commit. * Ordering joins with NULLs. * Fixed bugs with plain/non-plain setters. * Lots of other bug fixes. SQLObject 0.7.0 =============== Features & Interface -------------------- * Inheritance. See Inheritance.html_ .. _Inheritance.html: Inheritance.html * Date/time validators, converters, tests. * Both `mxDateTime `_ and `datetime `_ supported for ``DateTimeCol``. * Added ``BLOBCol``, for binary data. * Added ``PickleCol``, to transparently pickle and unpickle data from column. * New `documented reflection interface `_, using the new ``.sqlmeta`` class/instance. Most special attributes that started with ``_`` were moved into ``sqlmeta`` (with leading underscore removed). * New aggregate functions for select results, like ``cls.select().max(columnName)``: ``.max()``, ``.min()``, ``.avg()``. * ``ConnectionHub`` aka ``sqlhub`` (@@: Needs documentation) * Command-line client `sqlobject-admin `_. * ``StringCol`` has ``char_binary`` attribute, for explicit case handling in MySQL. * Various joins now supported (LEFT, RIGHT, STRAIGHT, INNER, OUTER, CROSS): see `documentation `_. Aliases for joining a table with itself. 
* Subqueries/subselects (`see docs `_). * Select results support ``.filter(extra_query)`` * ``SQLMultipleJoin`` and ``SQLRelatedJoin``, like ``MultipleJoin`` and ``RelatedJoin``, except return select results (@@: Document). * `SingleJoin `_. * Columns retain their order from the class definition to table creation. * SQLObject now depends on the `FormEncode `_ library, and internal conversion/validation is done through FormEncode (was previously using old fork of FormEncode). * Column instances can have attributes set on them (generally for annotating columns with extra data). Other Changes ------------- * When iterating over select results, a list is now immediately created with the full list of instances being selected. Before instances were created on demand, as select results were pulled out row-by-row. The previous lazy behavior is available with the method ``lazyIter``, used like ``for obj in MyClass.select().lazyIter(): ...``. * Test framework now uses `py.test `_. * SQLObject now uses a simpler metaclass (``sqlobject.declarative.DeclarativeMeta``). * autoCommit and queryIns ?? (@@: expand) * Deprecation (@@: document) * Use `setuptools `_ for packaging and installation. Small Features -------------- * ``IntValidator`` for testing ``IntCol`` inputs. * Base style (``sqlobject.styles.Style``) is now a usable no-op style. * SQLite in-memory databases allowed with ``sqlite:/:memory:`` * Keyword parameters allowed to ``connectionForURI`` (like ``debug=True``). * More parameters passed to MySQL connections (unix_socket, named_pipe, init_command, read_default_file, read_default_group, connect_time, compress, named_pipe, use_unicode, client_flag, local_infile). * ``DateTimeCol.now`` is a function for producing the current date, using whatever date/time module you are using (good for use as a default). * Inherited classes fetched more efficiently (fewer queries). * Decimal converter to create `decimal objects `_. * Repository rearranged (now in ``http://svn.colorstudy.com/SQLObject/trunk``). Bug Fixes --------- * Tables with no columns can work. Why would you have a table without a column? We do not know, we try only to serve. * Sybase ``_fromDatabase`` fixed. * Various fixes to support most recent ``MySQLdb`` adapter, and ``pysqlite`` adapters. * URI parsing improved, including Windows paths (for use with SQLite). * ``selectBy(column=None)`` creates ``IS NULL`` query. * ``selectBy(foreignKey=value)`` now supported (not just selecting by foreign key ID). * ``cascade='null'`` wasn't working properly (was cascading all deletes, not nullifying!). * Lots of other bug fixes. `Older news`__ .. __: News1.html .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/rebuild0000755000175000017500000000023512760105144015160 0ustar phdphd00000000000000#! /bin/sh PYTHONPATH=.. make html && find . -name \*.tmp -type f -delete && exec rsync -ahP --del --exclude=.buildinfo --exclude=objects.inv _build/html . SQLObject-3.4.0/docs/conf.py0000644000175000017500000002612513060574235015116 0ustar phdphd00000000000000# -*- coding: utf-8 -*- # # SQLObject documentation build configuration file, created by # sphinx-quickstart. # # This file is execfile()d with the current directory set to its # containing dir. 
# # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. from datetime import date import sys import os # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. # sys.path.insert(0, os.path.abspath('.')) sys.path.insert(0, os.path.abspath('..')) # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. # needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.viewcode', ] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. # source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'SQLObject' authors = u'Ian Bicking and contributors' copyright = u'2004-%d, %s' % (date.today().year, authors) # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # from sqlobject.__version__ import version as __version__ # noqa: E402 # The short X.Y version. version = '.'.join(__version__.split('.')[:2]) # The full version, including alpha/beta/rc tags. release = __version__ # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # # This is also used if you do content translation via gettext catalogs. # Usually you set "language" from the command line for these cases. language = 'en' # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: # today = '' # Else, today_fmt is used as the format for a strftime call. # today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all # documents. # default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. # add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). # add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. # show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. # modindex_common_prefix = [] # If true, keep warnings as "system message" paragraphs in the built documents. # keep_warnings = False # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'bizstyle' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. 
# html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. # html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". # html_title = None # A shorter title for the navigation bar. Default is the same as html_title. # html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. # html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. # html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # Add any extra paths that contain custom files (such as robots.txt or # .htaccess) here, relative to this directory. These files are copied # directly to the root of the documentation. # html_extra_path = [] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. # html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. # html_use_smartypants = True # Custom sidebar templates, maps document names to template names. # html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. # html_additional_pages = {} # If false, no module index is generated. # html_domain_indices = True # If false, no index is generated. # html_use_index = True # If true, the index is split into individual pages for each letter. # html_split_index = False # If true, links to the reST sources are added to the pages. # html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. # html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. # html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. # html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). # html_file_suffix = None # Language to be used for generating the HTML full-text search index. # Sphinx supports the following languages: # 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja' # 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr' # html_search_language = 'en' # A dictionary with options for the search language support, empty by default. # Now only 'ja' uses this config value # html_search_options = {'type': 'default'} # The name of a javascript file (relative to the configuration directory) that # implements a search results scorer. If empty, the default will be used. # html_search_scorer = 'scorer.js' # Output file base name for HTML help builder. htmlhelp_basename = 'SQLObjectdoc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). # 'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). # 'pointsize': '10pt', # Additional stuff for the LaTeX preamble. # 'preamble': '', # Latex figure (float) alignment # 'figure_align': 'htbp', } # Grouping the document tree into LaTeX files. 
List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ ('index', 'SQLObject.tex', u'SQLObject Documentation', authors, 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. # latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. # latex_use_parts = False # If true, show page references after internal links. # latex_show_pagerefs = False # If true, show URL addresses after external links. # latex_show_urls = False # Documents to append as an appendix to all manuals. # latex_appendices = [] # If false, no module index is generated. # latex_domain_indices = True # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'sqlobject', u'SQLObject Documentation', [authors], 1) ] # If true, show URL addresses after external links. # man_show_urls = False # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'SQLObject', u'SQLObject Documentation', authors, 'SQLObject', 'Object-relational mapper for Python.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. # texinfo_appendices = [] # If false, no module index is generated. # texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. # texinfo_show_urls = 'footnote' # If true, do not generate a @detailmenu in the "Top" node's menu. # texinfo_no_detailmenu = False # -- Options for Epub output ---------------------------------------------- # Bibliographic Dublin Core info. epub_title = u'SQLObject' epub_author = authors epub_publisher = authors epub_copyright = copyright # The basename for the epub file. It defaults to the project name. # epub_basename = u'SQLObject' # The HTML theme for the epub output. # Since the default themes are not optimized # for small screen space, using the same theme for HTML and epub output is # usually not wise. This defaults to 'epub', a theme designed to save visual # space. # epub_theme = 'epub' # The language of the text. It defaults to the language option # or 'en' if the language is not set. # epub_language = '' # The scheme of the identifier. Typical schemes are ISBN or URL. # epub_scheme = '' # The unique identifier of the text. This can be a ISBN number # or the project homepage. # epub_identifier = '' # A unique identification for the text. # epub_uid = '' # A tuple containing the cover image and cover page html template filenames. # epub_cover = () # A sequence of (type, uri, title) tuples for the guide element of content.opf. # epub_guide = () # HTML files that should be inserted before the pages created by sphinx. # The format is a list of tuples containing the path and title. # epub_pre_files = [] # HTML files shat should be inserted after the pages created by sphinx. # The format is a list of tuples containing the path and title. # epub_post_files = [] # A list of files that should not be packed into the epub file. epub_exclude_files = ['search.html'] # The depth of the table of contents in toc.ncx. # epub_tocdepth = 3 # Allow duplicate toc entries. 
# epub_tocdup = True # Choose between 'default' and 'includehidden'. # epub_tocscope = 'default' # Fix unsupported image types using the Pillow. # epub_fix_images = False # Scale large images. # epub_max_image_width = 0 # How to display URL addresses: 'footnote', 'no', or 'inline'. # epub_show_urls = 'inline' # If false, no index is generated. # epub_use_index = True SQLObject-3.4.0/docs/News.rst0000644000175000017500000001621613141371273015262 0ustar phdphd00000000000000++++ News ++++ .. contents:: Contents: :backlinks: none .. _start: SQLObject 3.4.0 =============== Released 5 Aug 2017. Features -------- * Python 2.6 is no longer supported. The minimal supported version is Python 2.7. Drivers (work in progress) -------------------------- * Encode binary values for py-postgresql driver. This fixes the last remaining problems with the driver. * Encode binary values for PyGreSQL driver using the same encoding as for py-postgresql driver. This fixes the last remaining problems with the driver. Our own encoding is needed because unescape_bytea(escape_bytea()) is not idempotent. See the comment for PQunescapeBytea at https://www.postgresql.org/docs/9.6/static/libpq-exec.html: This conversion is not exactly the inverse of PQescapeBytea, because the string is not expected to be "escaped" when received from PQgetvalue. In particular this means there is no need for string quoting considerations. * List all drivers in extras_require in setup.py. Minor features -------------- * Use base64.b64encode/b64decode instead of deprecated encodestring/decodestring. Tests ----- * Fix a bug with sqlite-memory: rollback transaction and close connection. The solution was found by Dr. Neil Muller. * Use remove-old-files.py from ppu to cleanup pip cache at Travis and AppVeyor. * Add test_csvimport.py more as an example how to use load_csv from sqlobject.util.csvimport. SQLObject 3.3.0 =============== Released 7 May 2017. Features -------- * Support for Python 2.6 is declared obsolete and will be removed in the next release. Minor features -------------- * Convert scripts repository to devscripts subdirectory. Some of thses scripts are version-dependent so it's better to have them in the main repo. * Test for __nonzero__ under Python 2, __bool__ under Python 3 in BoolCol. Drivers (work in progress) -------------------------- * Add support for PyODBC and PyPyODBC (pure-python ODBC DB API driver) for MySQL, PostgreSQL and MS SQL. Driver names are ``pyodbc``, ``pypyodbc`` or ``odbc`` (try ``pyodbc`` and ``pypyodbc``). There are some problems with pyodbc and many problems with pypyodbc. Documentation ------------- * Stop updating http://sqlobject.readthedocs.org/ - it's enough to have http://sqlobject.org/ Tests ----- * Run tests at Travis CI and AppVeyor with Python 3.6, x86 and x64. * Stop running tests at Travis with Python 2.6. * Stop running tests at AppVeyor with pymssql - too many timeouts and problems. SQLObject 3.2.0 =============== Released 11 Mar 2017. Minor features -------------- * Drop table name from ``VACUUM`` command in SQLiteConnection: SQLite doesn't vacuum a single table and SQLite 3.15 uses the supplied name as the name of the attached database to vacuum. * Remove ``driver`` keyword from RdbhostConnection as it allows one driver ``rdbhdb``. * Add ``driver`` keyword for FirebirdConnection. Allowed values are 'fdb', 'kinterbasdb' and 'pyfirebirdsql'. Default is to test 'fdb' and 'kinterbasdb' in that order. pyfirebirdsql is supported but has problems. * Add ``driver`` keyword for MySQLConnection. 
Allowed values are 'mysqldb', 'connector', 'oursql' and 'pymysql'. Default is to test for mysqldb only. * Add support for `MySQL Connector `_ (pure python; `binary packages `_ are not at PyPI and hence are hard to install and test). * Add support for `oursql `_ MySQL driver (only Python 2.6 and 2.7 until oursql author fixes Python 3 compatibility). * Add support for `PyMySQL `_ - pure python mysql interface). * Add parameter ``timeout`` for MSSQLConnection (usable only with pymssql driver); timeouts are in seconds. * Remove deprecated ez_setup.py. Drivers (work in progress) -------------------------- * Extend support for PyGreSQL driver. There are still some problems. * Add support for `py-postgresql `_ PostgreSQL driver. There are still problems with the driver. * Add support for `pyfirebirdsql `_.There are still problems with the driver. Bug fixes --------- * Fix MSSQLConnection.columnsFromSchema: remove `(` and `)` from default value. * Fix MSSQLConnection and SybaseConnection: insert default values into a table with just one IDENTITY column. * Remove excessive NULLs from ``CREATE TABLE`` for MSSQL/Sybase. * Fix concatenation operator for MSSQL/Sybase (it's ``+``, not ``||``). * Fix MSSQLConnection.server_version() under Py3 (decode version to str). Documentation ------------- * The docs are now generated with Sphinx. * Move ``docs/LICENSE`` to the top-level directory so that Github recognizes it. Tests ----- * Rename ``py.test`` -> ``pytest`` in tests and docs. * Great Renaming: fix ``pytest`` warnings by renaming ``TestXXX`` classes to ``SOTestXXX`` to prevent ``pytest`` to recognize them as test classes. * Fix ``pytest`` warnings by converting yield tests to plain calls: yield tests were deprecated in ``pytest``. * Tests are now run at CIs with Python 3.5. * Drop ``Circle CI``. * Run at Travis CI tests with Firebird backend (server version 2.5; drivers fdb and firebirdsql). There are problems with tests. * Run tests at AppVeyor for windows testing. Run tests with MS SQL, MySQL, Postgres and SQLite backends; use Python 2.7, 3.4 and 3.5, x86 and x64. There are problems with MS SQL and MySQL. SQLObject 3.1.0 =============== Released 16 Aug 2016. Features -------- * Add UuidCol. * Add JsonbCol. Only for PostgreSQL. Requires psycopg2 >= 2.5.4 and PostgreSQL >= 9.2. * Add JSONCol, a universal json column. * For Python >= 3.4 minimal FormEncode version is now 1.3.1. * If mxDateTime is in use, convert timedelta (returned by MySQL) to mxDateTime.Time. Documentation ------------- * Developer's Guide is extended to explain SQLObject architecture and how to create a new column type. * Fix URLs that can be found; remove missing links. * Rename reStructuredText files from \*.txt to \*.rst. Source code ----------- * Fix all `import *` using https://github.com/zestyping/star-destroyer. Tests ----- * Tests are now run at Circle CI. * Use pytest-cov for test coverage. Report test coverage via coveralls.io and codecov.io. * Install mxDateTime to run date/time tests with it. SQLObject 3.0.0 =============== Released 1 Jun 2016. Features -------- * Support for Python 2 and Python 3 with one codebase! (Python version >= 3.4 currently required.) Minor features -------------- * PyDispatcher (>=2.0.4) was made an external dependency. Development ----------- * Source code was made flake8-clean. Documentation ------------- * Documentation is published at http://sqlobject.readthedocs.org/ in Sphinx format. `Older news`__ .. __: News5.html .. 
image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/News3.rst0000644000175000017500000002550512752476767015373 0ustar phdphd00000000000000++++ News ++++ .. contents:: Contents: :backlinks: none .. _start: SQLObject 0.10.9 ================ * The cache culling algorithm was enhanced to eliminate memory leaks by removing references to dead objects; tested on a website that runs around 4 million requests a day. SQLObject 0.10.8 ================ Released 30 Sep 2009. * Fixed a bug in logging to console - convert unicode to str. * Fixed an obscure bug in ConnectionHub triggered by an SQLObject class whose instances can be coerced to boolean False. SQLObject 0.10.7 ================ Released 20 Sep 2009. * Fixed a bug: Sybase tables with identity column fire two identity_inserts. * Fixed a bug: q.startswith(), q.contains() and q.endswith() escape (with a backslash) all special characters (backslashes, underscores and percent signs). SQLObject 0.10.6 ================ Released 18 May 2009. * Better support for Python 2.6: do not import the deprecated sets module. * A number of changes ported from `SQLObject 0.9.11`_. SQLObject 0.10.5 ================ Released 6 May 2009. * A number of changes ported from `SQLObject 0.9.10`_. * sqlmeta.getColumns() becomes classmethod. SQLObject 0.10.4 ================ Released 8 Dec 2008. * Fixed createSQL constrains generation under MySQL when the table's name includes the database's name (contains a dot). SQLObject 0.10.3 ================ Released 1 Dec 2008. * A number of changes ported from `SQLObject 0.9.8`_. SQLObject 0.10.2 ================ Released 30 May 2008. * A number of changes ported from `SQLObject 0.9.7`_. SQLObject 0.10.1 ================ Released 4 May 2008. * Fixed a bug: limit doesn't work in sqlbuilder.Select. * A number of changes ported from `SQLObject 0.9.6`_. SQLObject 0.10.0 ================ Released 11 Mar 2008. Features & Interface -------------------- * Dropped support for Python 2.2. The minimal version of Python for SQLObject is 2.3 now. * Removed actively deprecated attributes; lowered deprecation level for other attributes to be removed after 0.10. * SQLBuilder Select supports the rest of SelectResults options (reversed, distinct, joins, etc.) * SQLObject.select() (i.e., SelectResults) and DBConnection.queryForSelect() use SQLBuilder Select queries; this make all SELECTs implemented internally via a single mechanism. * SQLBuilder Joins handle SQLExpression tables (not just str/SQLObject/Alias) and properly sqlrepr. * Added SQLBuilder ImportProxy. It allows one to ignore the circular import issues with referring to SQLObject classes in other files - it uses the classregistry as the string class names for FK/Joins do, but specifically intended for SQLBuilder expressions. See tests/test_sqlbuilder_importproxy.py. * Added SelectResults.throughTo. It allows one to traverse relationships (FK/Join) via SQL, avoiding the intermediate objects. Additionally, it's a simple mechanism for pre-caching/eager-loading of later FK relationships (i.e., going to loop over a select of somePeople and ask for aPerson.group, first call list(somePeople.throughTo.group) to preload those related groups and use 2 db queries instead of N+1). See tests/test_select_through.py. * Added ViewSQLObject. 
* Added sqlmeta.getColumns() to get all the columns for a class (including parent classes), excluding the column 'childName' and including the column 'id'. sqlmeta.asDict() now uses getColumns(), so there is no need to override it in the inheritable sqlmeta class; this makes asDict() to work properly on inheritable sqlobjects. * Allow MyTable.select(MyTable.q.foreignKey == object) where object is an instance of SQLObject. * Added rich comparison methods; SQLObjects of the same class are considered equal is they have the same id; other methods return NotImplemented. * RowDestroySignal is sent on destroying an SQLObject instance; postfunctions are run after the row has been destroyed. * Changed the implementation type in BoolCol under SQLite from TINYINT to BOOLEAN and made fromDatabase machinery to recognize it. * MySQLConnection (and DB URI) accept a number of SSL-related parameters: ssl_key, ssl_cert, ssl_ca, ssl_capath. * Use sets instead of dicts in tablesUsed. Dropped tablesUsedDict function; instead there is tablesUsedSet that returns a set of strings. * SQLBuilder tablesUsedSet handles sqlrepr'able objects. * Under MySQL, PickleCol no longer used TEXT column types; the smallest column is now BLOB - it is not possible to create TINYBLOB column. SQLObject 0.9.11 ================ Released 18 May 2009. * Two bugs in SQLiteConnection.columnsFromSchema() were fixed: use sqlmeta.idName instead of 'id'; convert default 'NULL' to None. * Use sqlmeta.idName instead of 'id' in all connection classes. * Fixed a bug that prevented to override per class _connection if there is sqlhub.processConnection. SQLObject 0.9.10 ================ Released 6 May 2009. * Another unicode-related patch for MySQL; required because different versions of MySQLdb require different handling:: - MySQLdb < 1.2.1: only ascii - MySQLdb = 1.2.1: only unicode - MySQLdb > 1.2.1: both ascii and unicode * Setup requires FormEncode version 1.1.1+. * A minor bug was fixed in creating a DecimalValidator - pass the column name to it. * A bug was fixed in InheritableIteration - pass connection to child klass.select(). * A bug was fixed in PostgresConnection.columnsFromSchema() - foreign keys are now recognized and created as proper ForeignKey with correct column name and table name. * Bugs in PostgresConnection and MSSQLConnection related to properties was fixed. A note for developers: from now on properties in DBConnection classes are forbidden as they don't work with Transaction - Transaction.__getattr__() cannot properly wrap 'self' so a property is called with wrong 'self'. * Transaction instances now explicitly raises TypeError on close() - without this calling Transaction.close() calls connection.close() which is wrong. * A bug in SQLiteConnection.columnsFromSchema() that led to an infinite loop was fixed. SQLObject 0.9.9 =============== * Backported from the trunk: under MySQL use the connection's dbEncoding instead of ascii, when converting a unicode value from python to database for a StringCol. SQLObject 0.9.8 =============== Released 1 Dec 2008. * Changed interpretation of strings in the DB URI for boolean parameters: '0', 'no', 'off' and 'false' are now interpreted as False. * Fixed a bug with incorrect handling of calls like connectionForURI(dburi, cache=False) when dburi already contains some parameters in the URI. * Convert decimal.to_eng_string() to str to work around a bug in Python 2.5.2; see https://mail.python.org/pipermail/python-dev/2008-March/078189.html * Added test_default_style.py. 
* Fixed a minor bug in SQLiteConnection that fails to parse Enum columns. SQLObject 0.9.7 =============== Released 30 May 2008. Small Features -------------- * Use VARCHAR(MAX) and VARBINARY(MAX) for MSSQL >= 9.0. * Run post_funcs after RowDestroySignal. Bug Fixes --------- * Fixed a minor bug in Set column. * A bug fixed for RowCreatedSignal together with InheritableSQLObject: run post_funcs after the entire hierarchy has been created. * Aggregate functions now honors 'distinct'. SQLObject 0.9.6 =============== Released 4 May 2008. * A bug in inheritable delColumn() that doesn't remove properties was fixed. * A minor bug was fixed in col.py - the registry must be passed to findClass(). * Reverted the patch declarative.threadSafeMethod() - it causes more harm then good. SQLObject 0.9.5 =============== Released 10 Mar 2008. * Fixed a minor bug in SQLiteConnection.columnsFromSchema() - set dbName. * A bug in delColumn() that removes all properties was fixed by recreating properties. SQLObject 0.9.4 =============== Released 3 Mar 2008. * Use list.reverse() in manager/command.py for Python 2.2 compatibility. * Prevent MultipleJoin from removing the intermediate table if it was not created by the Join. * Fixed a bug with no default when defaultSQL is defined for the column. * Recognize POINT data type as string in PostgresConnection.columnsFromSchema(). SQLObject 0.9.3 =============== Released 10 Jan 2008. * A number of changes ported from SQLObject 0.7.10. SQLObject 0.9.2 =============== Released 30 Oct 2007. * Fixed a bug in Versioning - do not copy "alternateID" and "unique" attributes from the versioned table. * Fixed a misspelled 'zerofill' option's name. * Fixed bugs in SQLiteConnection.guessColumn(). * A number of changes ported from SQLObject 0.7.9 and SQLObject 0.8.6. SQLObject 0.9.1 =============== Released 25 July 2007. Bug Fixes --------- * Fixed misspelled methods in col.py. * A number of bugfixes ported from SQLObject 0.7.8 and SQLObject 0.8.5. SQLObject 0.9.0 =============== Released 10 May 2007. Features & Interface -------------------- * Support for Python 2.2 has been declared obsolete. * Removed actively deprecated attributes; lowered deprecation level for other attributes to be removed after 0.9. * SQLite connection got columnsFromSchema(). Now all connections fully support fromDatabase. There are two version of columnsFromSchema() for SQLite - one parses the result of "SELECT sql FROM sqlite_master" and the other uses "PRAGMA table_info"; the user can choose one over the other by using "use_table_info" parameter in DB URI; default is False as the pragma is available only in the later versions of SQLite. * Changed connection.delColumn(): the first argument is sqlmeta, not tableName (required for SQLite). * SQLite connection got delColumn(). Now all connections fully support delColumn(). As SQLite backend doesn't implement "ALTER TABLE DROP COLUMN" delColumn() is implemented by creating a new table without the column, copying all data, dropping the original table and renaming the new table. * Versioning_. .. _Versioning: Versioning.html * MySQLConnection got new keyword "conv" - a list of custom converters. * Use logging if it's available and is configured via DB URI. * New columns: TimestampCol to support MySQL TIMESTAMP type; SetCol to support MySQL SET type; TinyIntCol for TINYINT; SmallIntCol for SMALLINT; MediumIntCol for MEDIUMINT; BigIntCol for BIGINT. Small Features -------------- * Support for MySQL INT type attributes: UNSIGNED, ZEROFILL. 
* Support for DEFAULT SQL attribute via defaultSQL keyword argument. * cls.tableExists() as a shortcut for conn.tableExists(cls.sqlmeta.table). * cls.deleteMany(), cls.deleteBy(). Bug Fixes --------- * idName can be inherited from the parent sqlmeta class. `Older news`__ .. __: News2.html .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/Versioning.rst0000644000175000017500000000517412753243417016500 0ustar phdphd00000000000000:Author: David Turner, The Open Planning Project .. contents:: Versioning ----------- Why ~~~ You have a table where rows can be altered, such as a table of wiki pages. You want to retain a history of changes for auditing, backup, or tracking purposes. You could write a decorator that stores old versions and manage all access to this object through it. Or you could take advantage of the event system in SQLObject 0.8+ and just catch row accesses to the object. SQLObject's versioning module does this for you. And it even works with inheritance! How ~~~ Here's how to set it up:: class MyClass(SQLObject): name = StringCol() versions = Versioning() To use it, just create an instance as usual:: mc = MyClass(name='fleem') Then make some changes and check out the results:: mc.set(name='morx') assert mc.versions[0].name == 'fleem' You can also restore to a previous version:: mc.versions[0].restore() assert mc.name == "fleem" Inheritance ~~~~~~~~~~~ There are three ways versioning can be used with inheritance_: .. _inheritance: Inheritance.html 1. Parent versioned, children unversioned:: class Base(InheritableSQLObject): name = StringCol() versions = Versioning() class Child(Base): toy = StringCol() In this case, when changes are made to an instance of Base, new versions are created. But when changes are made to an instance of Child, no new versions are created. 2. Children versioned, parents unversioned. In this case, when changes are made to an instance of Child, new versions are created. But when changes are made to an instance of Base, no new versions are created. The version data for Child contains all of the columns from child and from base, so that a full restore is possible. 3. Both children and parents versioned. In this case, changes to either Child or Base instances create new versions, but in different tables. Child versions still contain all Base data, and a change to a Child only creates a new Child version, not a new Base version. Version Tables ~~~~~~~~~~~~~~ Versions are stored in a special table which is created when the table for a versioned class is created. Version tables are not altered when the main table is altered, so if you add a column to your main class, you will need to manually add the column to your version table. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/genapidocs0000755000175000017500000000021312760105144015642 0ustar phdphd00000000000000#! /bin/sh cd "`dirname $0`"/.. 
&& exec sphinx-apidoc --separate --module-first --suffix=rst --force \ --output-dir=docs/api sqlobject SQLObject-3.4.0/docs/SQLObject.rst0000644000175000017500000024066713140202164016134 0ustar phdphd00000000000000````````` SQLObject ````````` .. contents:: Contents: Credits ======= SQLObject is by Ian Bicking (ianb@colorstudy.com) and `Contributors `_. The website is `sqlobject.org `_. License ======= The code is licensed under the `Lesser General Public License`_ (LGPL). .. _`Lesser General Public License`: https://www.gnu.org/copyleft/lesser.html This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. Introduction ============ SQLObject is an *object-relational mapper* for Python_ programming language. It allows you to translate RDBMS table rows into Python objects, and manipulate those objects to transparently manipulate the database. .. _Python: https://www.python.org/ In using SQLObject, you will create a class definition that will describe how the object translates to the database table. SQLObject will produce the code to access the database, and update the database with your changes. The generated interface looks similar to any other interface, and callers need not be aware of the database backend. SQLObject also includes a novel feature to avoid generating, textually, your SQL queries. This also allows non-SQL databases to be used with the same query syntax. Requirements ============ Currently SQLObject supports MySQL_ via MySQLdb_ aka MySQL-python (called mysqlclient_ for Python 3), `MySQL Connector`_, oursql_, PyMySQL_, PyODBC_ and PyPyODBC_. For PostgreSQL_ psycopg2_ or psycopg1 are recommended; PyGreSQL_, py-postgresql_, PyODBC_ and PyPyODBC_ are supported but have problems (not all tests passed). SQLite_ has a built-in driver or PySQLite_. Firebird_ is supported via fdb_ or kinterbasdb_; pyfirebirdsql_ is supported but has problems. `MAX DB`_ (also known as SAP DB) is supported via sapdb_. Sybase via Sybase_. `MSSQL Server`_ via pymssql_ (+ FreeTDS_) or adodbapi_ (Win32). .. _MySQL: https://www.mysql.com/ .. _MySQLdb: https://sourceforge.net/projects/mysql-python/ .. _mysqlclient: https://pypi.python.org/pypi/mysqlclient .. _`MySQL Connector`: https://pypi.python.org/pypi/mysql-connector .. _oursql: https://github.com/python-oursql/oursql .. _PyMySQL: https://github.com/PyMySQL/PyMySQL/ .. _PostgreSQL: https://postgresql.org .. _psycopg2: http://initd.org/psycopg/ .. _PyGreSQL: http://www.pygresql.org/ .. _py-postgresql: https://pypi.python.org/pypi/py-postgresql .. _PyODBC: https://pypi.python.org/pypi/pyodbc .. _PyPyODBC: https://pypi.python.org/pypi/pypyodbc .. _SQLite: https://sqlite.org/ .. _PySQLite: https://github.com/ghaering/pysqlite .. _Firebird: http://www.firebirdsql.org/en/python-driver/ .. _fdb: http://www.firebirdsql.org/en/devel-python-driver/ .. _kinterbasdb: http://kinterbasdb.sourceforge.net/ .. _pyfirebirdsql: https://pypi.python.org/pypi/firebirdsql .. _`MAX DB`: http://maxdb.sap.com/ .. _sapdb: http://maxdb.sap.com/doc/7_8/50/01923f25b842438a408805774f6989/frameset.htm .. _Sybase: http://www.object-craft.com.au/projects/sybase/ .. _`MSSQL Server`: http://www.microsoft.com/sql/ .. _pymssql: http://www.pymssql.org/en/latest/index.html .. _FreeTDS: http://www.freetds.org/ .. _adodbapi: http://adodbapi.sourceforge.net/ Python 2.7 or 3.4+ is required. 
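If you are not sure which of these driver packages are installed in your environment, a quick check with the standard library can tell you (this is only an optional, illustrative sketch -- it is not part of SQLObject, and the module names below are just a subset of the drivers listed above)::

    import importlib

    # Try importing a few of the DB-API driver modules mentioned above.
    for name in ('MySQLdb', 'pymysql', 'psycopg2', 'sqlite3', 'fdb', 'pymssql'):
        try:
            importlib.import_module(name)
            print('%s is available' % name)
        except ImportError:
            print('%s is not installed' % name)
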
Compared To Other Database Wrappers =================================== There are several object-relational mappers (ORM) for Python. We honestly can't comment deeply on the quality of those packages, but we'll try to place SQLObject in perspective. Objects have built-in magic -- setting attributes has side effects (it changes the database), and defining classes has side effects (through the use of metaclasses). Attributes are generally exposed, not marked private, knowing that they can be made dynamic or write-only later. SQLObject creates objects that feel similar to normal Python objects. An attribute attached to a column doesn't look different than an attribute that's attached to a file, or an attribute that is calculated. It is a specific goal that you be able to change the database without changing the interface, including changing the scope of the database, making it more or less prominent as a storage mechanism. This is in contrast to some ORMs that provide a dictionary-like interface to the database (for example, PyDO_). The dictionary interface distinguishes the row from a normal Python object. We also don't care for the use of strings where an attribute seems more natural -- columns are limited in number and predefined, just like attributes. (Note: newer version of PyDO apparently allow attribute access as well) .. _PyDO: http://skunkweb.sourceforge.net/pydo.html SQLObject is, to my knowledge, unique in using metaclasses to facilitate this seamless integration. Some other ORMs use code generation to create an interface, expressing the schema in a CSV or XML file (for example, MiddleKit_, part of Webware_). By using metaclasses you are able to comfortably define your schema in the Python source code. No code generation, no weird tools, no compilation step. .. _MiddleKit: http://webware.sourceforge.net/Webware/MiddleKit/Docs/ .. _Webware: http://webware.sourceforge.net/Webware/Docs/ SQLObject provides a strong database abstraction, allowing cross-database compatibility (so long as you don't sidestep SQLObject). SQLObject has joins, one-to-many, and many-to-many, something which many ORMs do not have. The join system is also intended to be extensible. You can map between database names and Python attribute and class names; often these two won't match, or the database style would be inappropriate for a Python attribute. This way your database schema does not have to be designed with SQLObject in mind, and the resulting classes do not have to inherit the database's naming schemes. Using SQLObject: An Introduction ================================ Let's start off quickly. We'll generally just import everything from the ``sqlobject`` class:: >>> from sqlobject import * Declaring a Connection ---------------------- The connection URI must follow the standard URI syntax:: scheme://[user[:password]@]host[:port]/database[?parameters] Scheme is one of ``sqlite``, ``mysql``, ``postgres``, ``firebird``, ``interbase``, ``maxdb``, ``sapdb``, ``mssql``, ``sybase``. Examples:: mysql://user:password@host/database mysql://host/database?debug=1 postgres://user@host/database?debug=&cache= postgres:///full/path/to/socket/database postgres://host:5432/database sqlite:///full/path/to/database sqlite:/C:/full/path/to/database sqlite:/:memory: Parameters are: ``debug`` (default: False), ``debugOutput`` (default: False), ``cache`` (default: True), ``autoCommit`` (default: True), ``debugThreading`` (default: False), ``logger`` (default: None), ``loglevel`` (default: None), ``schema`` (default: None). 
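For example, parameters are simply appended to the URI as a query string (the user, host and database names here are purely illustrative)::

    connection = connectionForURI('postgres://user:secret@localhost/addressbook?debug=1&cache=0')
    connection = connectionForURI('sqlite:/:memory:?debug=1&logger=myapp&loglevel=debug')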
If you want to pass True value in a connection URI - pass almost any non-empty string, especially ``yes``, ``true``, ``on`` or ``1``; an empty string or ``no``, ``false``, ``off`` or ``0`` for False. There are also connection-specific parameters, they are listed in the appropriate sections. Lets first set up a connection:: >>> import os >>> db_filename = os.path.abspath('data.db') >>> connection_string = 'sqlite:' + db_filename >>> connection = connectionForURI(connection_string) >>> sqlhub.processConnection = connection The ``sqlhub.processConnection`` assignment means that all classes will, by default, use this connection we've just set up. Declaring the Class ------------------- We'll develop a simple addressbook-like database. We could create the tables ourselves, and just have SQLObject access those tables, but let's have SQLObject do that work. First, the class: >>> class Person(SQLObject): ... ... firstName = StringCol() ... middleInitial = StringCol(length=1, default=None) ... lastName = StringCol() Many basic table schemas won't be any more complicated than that. `firstName`, `middleInitial`, and `lastName` are all columns in the database. The general schema implied by this class definition is:: CREATE TABLE person ( id INT PRIMARY KEY AUTO_INCREMENT, first_name TEXT, middle_initial CHAR(1), last_name TEXT ); This is for SQLite or MySQL. The schema for other databases looks slightly different (especially the ``id`` column). You'll notice the names were changed from mixedCase to underscore_separated -- this is done by the `style object`_. There are a variety of ways to handle names that don't fit conventions (see `Irregular Naming`_). .. _`style object`: `Changing the Naming Style`_ Now we'll create the table in the database:: >>> Person.createTable() [] We can change the type of the various columns by using something other than `StringCol`, or using different arguments. More about this in `Column Types`_. You'll note that the ``id`` column is not given in the class definition, it is implied. For MySQL databases it should be defined as ``INT PRIMARY KEY AUTO_INCREMENT``, in Postgres ``SERIAL PRIMARY KEY``, in SQLite as ``INTEGER PRIMARY KEY AUTOINCREMENT``, and for other backends accordingly. You can't use tables with SQLObject that don't have a single primary key, and you must treat that key as immutable (otherwise you'll confuse SQLObject terribly). You can `override the id name`_ in the database, but it is always called ``.id`` from Python. .. _`override the id name`: `Class sqlmeta`_ Using the Class --------------- Now that you have a class, how will you use it? We'll be considering the class defined above. To create a new object (and row), use class instantiation, like:: >>> Person(firstName="John", lastName="Doe") .. note:: In SQLObject NULL/None does *not* mean default. NULL is a funny thing; it mean very different things in different contexts and to different people. Sometimes it means "default", sometimes "not applicable", sometimes "unknown". If you want a default, NULL or otherwise, you always have to be explicit in your class definition. Also note that the SQLObject default isn't the same as the database's default (SQLObject never uses the database's default). If you had left out ``firstName`` or ``lastName`` you would have gotten an error, as no default was given for these columns (``middleInitial`` has a default, so it will be set to ``NULL``, the database equivalent of ``None``). 
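For instance (an illustrative sketch; the names are made up)::

    p = Person(firstName='Jane', lastName='Roe')   # middleInitial falls back to its default, None/NULL
    assert p.middleInitial is None
    # Person(firstName='Jane') would raise an error -- lastName has no default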
You can use the class method `.get()` to fetch instances that already exist:: >>> Person.get(1) When you create an object, it is immediately inserted into the database. SQLObject uses the database as immediate storage, unlike some other systems where you explicitly save objects into a database. Here's a longer example of using the class:: >>> p = Person.get(1) >>> p >>> p.firstName 'John' >>> p.middleInitial = 'Q' >>> p.middleInitial 'Q' >>> p2 = Person.get(1) >>> p2 >>> p is p2 True Columns are accessed like attributes. (This uses the ``property`` feature of Python, so that retrieving and setting these attributes executes code). Also note that objects are unique -- there is generally only one ``Person`` instance of a particular id in memory at any one time. If you ask for a person by a particular ID more than once, you'll get back the same instance. This way you can be sure of a certain amount of consistency if you have multiple threads accessing the same data (though of course across processes there can be no sharing of an instance). This isn't true if you're using transactions_, which are necessarily isolated. To get an idea of what's happening behind the surface, we'll give the same actions with the SQL that is sent, along with some commentary:: >>> # This will make SQLObject print out the SQL it executes: >>> Person._connection.debug = True >>> p = Person(firstName='Bob', lastName='Hope') 1/QueryIns: INSERT INTO person (first_name, middle_initial, last_name) VALUES ('Bob', NULL, 'Hope') 1/QueryR : INSERT INTO person (first_name, middle_initial, last_name) VALUES ('Bob', NULL, 'Hope') 1/COMMIT : auto 1/QueryOne: SELECT first_name, middle_initial, last_name FROM person WHERE ((person.id) = (2)) 1/QueryR : SELECT first_name, middle_initial, last_name FROM person WHERE ((person.id) = (2)) 1/COMMIT : auto >>> p >>> p.middleInitial = 'Q' 1/Query : UPDATE person SET middle_initial = ('Q') WHERE id = (2) 1/QueryR : UPDATE person SET middle_initial = ('Q') WHERE id = (2) 1/COMMIT : auto >>> p2 = Person.get(1) >>> # Note: no database access, since we're just grabbing the same >>> # instance we already had. Hopefully you see that the SQL that gets sent is pretty clear and predictable. To view the SQL being sent, add ``?debug=true`` to your connection URI, or set the ``debug`` attribute on the connection, and all SQL will be printed to the console. This can be reassuring, and we would encourage you to try it. As a small optimization, instead of assigning each attribute individually, you can assign a number of them using the ``set`` method, like:: >>> p.set(firstName='Robert', lastName='Hope Jr.') This will send only one ``UPDATE`` statement. You can also use `set` with non-database properties (there's no benefit, but it helps hide the difference between database and non-database attributes). Selecting Multiple Objects -------------------------- While the full power of all the kinds of joins you can do with a relational database are not revealed in SQLObject, a simple ``SELECT`` is available. 
``select`` is a class method, and you call it like (with the SQL that's generated):: >>> Person._connection.debug = True >>> peeps = Person.select(Person.q.firstName=="John") >>> list(peeps) 1/Select : SELECT person.id, person.first_name, person.middle_initial, person.last_name FROM person WHERE ((person.first_name) = ('John')) 1/QueryR : SELECT person.id, person.first_name, person.middle_initial, person.last_name FROM person WHERE ((person.first_name) = ('John')) 1/COMMIT : auto [] This example returns everyone with the first name John. Queries can be more complex:: >>> peeps = Person.select( ... OR(Person.q.firstName == "John", ... LIKE(Person.q.lastName, "%Hope%"))) >>> list(peeps) 1/Select : SELECT person.id, person.first_name, person.middle_initial, person.last_name FROM person WHERE (((person.first_name) = ('John')) OR (person.last_name LIKE ('%Hope%'))) 1/QueryR : SELECT person.id, person.first_name, person.middle_initial, person.last_name FROM person WHERE (((person.first_name) = ('John')) OR (person.last_name LIKE ('%Hope%'))) 1/COMMIT : auto [, ] You'll note that classes have an attribute ``q``, which gives access to special objects for constructing query clauses. All attributes under ``q`` refer to column names and if you construct logical statements with these it'll give you the SQL for that statement. You can also create your SQL more manually:: >>> Person._connection.debug = False # Need for doctests >>> peeps = Person.select("""person.first_name = 'John' AND ... person.last_name LIKE 'D%'""") You should use `MyClass.sqlrepr` to quote any values you use if you create SQL manually (quoting is automatic if you use ``q``). .. _orderBy: You can use the keyword arguments `orderBy` to create ``ORDER BY`` in the select statements: `orderBy` takes a string, which should be the *database* name of the column, or a column in the form ``Person.q.firstName``. You can use ``"-colname"`` or ``DESC(Person.q.firstName``) to specify descending order (this is translated to DESC, so it works on non-numeric types as well), or call ``MyClass.select().reversed()``. orderBy can also take a list of columns in the same format: ``["-weight", "name"]``. You can use the `sqlmeta`_ class variable `defaultOrder` to give a default ordering for all selects. To get an unordered result when `defaultOrder` is used, use ``orderBy=None``. .. _`sqlmeta`: `Class sqlmeta`_ Select results are generators, which are lazily evaluated. So the SQL is only executed when you iterate over the select results, or if you use ``list()`` to force the result to be executed. When you iterate over the select results, rows are fetched one at a time. This way you can iterate over large results without keeping the entire result set in memory. You can also do things like ``.reversed()`` without fetching and reversing the entire result -- instead, SQLObject can change the SQL that is sent so you get equivalent results. You can also slice select results. This modifies the SQL query, so ``peeps[:10]`` will result in ``LIMIT 10`` being added to the end of the SQL query. If the slice cannot be performed in the SQL (e.g., peeps[:-10]), then the select is executed, and the slice is performed on the list of results. This will generally only happen when you use negative indexes. In certain cases, you may get a select result with an object in it more than once, e.g., in some joins. If you don't want this, you can add the keyword argument ``MyClass.select(..., distinct=True)``, which results in a ``SELECT DISTINCT`` call. 
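For example, ordering, ``distinct`` and slicing can be combined (a sketch using the ``Person`` class from above)::

    peeps = Person.select(orderBy=Person.q.lastName, distinct=True)
    first_page = list(peeps[:10])   # adds LIMIT 10 to the generated SQL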
You can get the length of the result without fetching all the results by calling ``count`` on the result object, like ``MyClass.select().count()``. A ``COUNT(*)`` query is used -- the actual objects are not fetched from the database. Together with slicing, this makes batched queries easy to write: start = 20 size = 10 query = Table.select() results = query[start:start+size] total = query.count() print "Showing page %i of %i" % (start/size + 1, total/size + 1) .. note:: There are several factors when considering the efficiency of this kind of batching, and it depends very much how the batching is being used. Consider a web application where you are showing an average of 100 results, 10 at a time, and the results are ordered by the date they were added to the database. While slicing will keep the database from returning all the results (and so save some communication time), the database will still have to scan through the entire result set to sort the items (so it knows which the first ten are), and depending on your query may need to scan through the entire table (depending on your use of indexes). Indexes are probably the most important way to improve importance in a case like this, and you may find caching to be more effective than slicing. In this case, caching would mean retrieving the *complete* results. You can use ``list(MyClass.select(...))`` to do this. You can save these results for some limited period of time, as the user looks through the results page by page. This means the first page in a search result will be slightly more expensive, but all later pages will be very cheap. For more information on the where clause in the queries, see the `SQLBuilder documentation`_. q-magic ~~~~~~~ Please note the use of the `q` attribute in examples above. `q` is an object that returns special objects to construct SQL expressions. Operations on objects returned by `q-magic` are not evaluated immediately but stored in a manner similar to symbolic algebra; the entire expression is evaluated by constructing a string that is sent then to the backend. For example, for the code:: >>> peeps = Person.select(Person.q.firstName=="John") SQLObject doesn't evaluate firstName but stores the expression: Person.q.firstName=="John" Later SQLObject converts it to the string ``first_name = 'John'`` and passes the string to the backend. selectBy Method ~~~~~~~~~~~~~~~ An alternative to ``.select`` is ``.selectBy``. It works like: >>> peeps = Person.selectBy(firstName="John", lastName="Doe") Each keyword argument is a column, and all the keyword arguments are ANDed together. The return value is a `SelectResults`, so you can slice it, count it, order it, etc. Lazy Updates ------------ By default SQLObject sends an ``UPDATE`` to the database for every attribute you set, or every time you call ``.set()``. If you want to avoid this many updates, add ``lazyUpdate = True`` to your class `sqlmeta definition`_. .. _`sqlmeta definition`: `Class sqlmeta`_ Then updates will only be written to the database when you call ``inst.syncUpdate()`` or ``inst.sync()``: ``.sync()`` also refetches the data from the database, which ``.syncUpdate()`` does not do. When enabled instances will have a property ``.sqlmeta.dirty``, which indicates if there are pending updates. Inserts are still done immediately; there's no way to do lazy inserts at this time. One-to-Many Relationships ------------------------- An address book is nothing without addresses. First, let's define the new address table. 
People can have multiple addresses, of course:: >>> class Address(SQLObject): ... ... street = StringCol() ... city = StringCol() ... state = StringCol(length=2) ... zip = StringCol(length=9) ... person = ForeignKey('Person') >>> Address.createTable() [] Note the column ``person = ForeignKey("Person")``. This is a reference to a `Person` object. We refer to other classes by name (with a string). In the database there will be a ``person_id`` column, type ``INT``, which points to the ``person`` column. .. note:: The reason SQLObject uses strings to refer to other classes is because the other class often does not yet exist. Classes in Python are *created*, not *declared*; so when a module is imported the commands are executed. ``class`` is just another command; one that creates a class and assigns it to the name you give. If class ``A`` referred to class ``B``, but class ``B`` was defined below ``A`` in the module, then when the ``A`` class was created (including creating all its column attributes) the ``B`` class simply wouldn't exist. By referring to classes by name, we can wait until all the required classes exist before creating the links between classes. We want an attribute that gives the addresses for a person. In a class definition we'd do:: class Person(SQLObject): ... addresses = MultipleJoin('Address') But we already have the class. We can add this to the class in-place:: >>> Person.sqlmeta.addJoin(MultipleJoin('Address', ... joinMethodName='addresses')) .. note:: In almost all cases you can modify SQLObject classes after they've been created. Having attributes that contain ``*Col`` objects in the class definition is equivalent to calling certain class methods (like ``addColumn()``). Now we can get the backreference with ``aPerson.addresses``, which returns a list. An example:: >>> p.addresses [] >>> Address(street='123 W Main St', city='Smallsville', ... state='MN', zip='55407', person=p)
>>> p.addresses [
] .. note:: MultipleJoin, as well as RelatedJoin, returns a list of results. It is often preferable to get a `SelectResults`_ object instead, in which case you should use SQLMultipleJoin and SQLRelatedJoin. The declaration of these joins is unchanged from above, but the returned iterator has many additional useful methods. .. _`SelectResults` : SelectResults.html Many-to-Many Relationships -------------------------- For this example we will have user and role objects. The two have a many-to-many relationship, which is represented with the `RelatedJoin`. >>> class User(SQLObject): ... ... class sqlmeta: ... # user is a reserved word in some databases, so we won't ... # use that for the table name: ... table = "user_table" ... ... username = StringCol(alternateID=True, length=20) ... # We'd probably define more attributes, but we'll leave ... # that exercise to the reader... ... ... roles = RelatedJoin('Role') >>> class Role(SQLObject): ... ... name = StringCol(alternateID=True, length=20) ... ... users = RelatedJoin('User') >>> User.createTable() [] >>> Role.createTable() [] .. note:: The sqlmeta class is used to store different kinds of metadata (and override that metadata, like table). This is new in SQLObject 0.7. See the section `Class sqlmeta`_ for more information on how it works and what attributes have special meanings. And usage:: >>> bob = User(username='bob') >>> tim = User(username='tim') >>> jay = User(username='jay') >>> admin = Role(name='admin') >>> editor = Role(name='editor') >>> bob.addRole(admin) >>> bob.addRole(editor) >>> tim.addRole(editor) >>> bob.roles [, ] >>> tim.roles [] >>> jay.roles [] >>> admin.users [] >>> editor.users [, ] In the process an intermediate table is created, ``role_user``, which references both of the other classes. This table is never exposed as a class, and its rows do not have equivalent Python objects -- this hides some of the nuisance of a many-to-many relationship. By the way, if you want to create an intermediate table of your own, maybe with additional columns, be aware that the standard SQLObject methods add/removesomething may not work as expected. Assuming that you are providing the join with the correct joinColumn and otherColumn arguments, be aware it's not possible to insert extra data via such methods, nor will they set any default value. Let's have an example: in the previous User/Role system, you're creating a UserRole intermediate table, with the two columns containing the foreign keys for the MTM relationship, and an additional DateTimeCol defaulting to datetime.datetime.now : that column will stay empty when adding roles with the addRole method. If you want to get a list of rows from the intermediate table directly add a MultipleJoin to User or Role class. You may notice that the columns have the extra keyword argument `alternateID`. If you use ``alternateID=True``, this means that the column uniquely identifies rows -- like a username uniquely identifies a user. This identifier is in addition to the primary key (``id``), which is always present. .. note:: SQLObject has a strong requirement that the primary key be unique and *immutable*. You cannot change the primary key through SQLObject, and if you change it through another mechanism you can cause inconsistency in any running SQLObject program (and in your data). For this reason meaningless integer IDs are encouraged -- something like a username that could change in the future may uniquely identify a row, but it may be changed in the future. 
So long as it is not used to reference the row, it is also *safe* to change it in the future. A alternateID column creates a class method, like ``byUsername`` for a column named ``username`` (or you can use the `alternateMethodName` keyword argument to override this). Its use: >>> User.byUsername('bob') >>> Role.byName('admin') Selecting Objects Using Relationships ------------------------------------- An select expression can refer to multiple classes, like:: >>> Person._connection.debug = False # Needed for doctests >>> peeps = Person.select( ... AND(Address.q.personID == Person.q.id, ... Address.q.zip.startswith('504'))) >>> list(peeps) [] >>> peeps = Person.select( ... AND(Address.q.personID == Person.q.id, ... Address.q.zip.startswith('554'))) >>> list(peeps) [] It is also possible to use the ``q`` attribute when constructing complex queries, like:: >>> Person._connection.debug = False # Needed for doctests >>> peeps = Person.select("""address.person_id = person.id AND ... address.zip LIKE '504%'""", ... clauseTables=['address']) Note that you have to use ``clauseTables`` if you use tables besides the one you are selecting from. If you use the ``q`` attributes SQLObject will automatically figure out what extra classes you might have used. Class sqlmeta ------------- This new class is available starting with SQLObject 0.7 and allows specifying metadata in a clearer way, without polluting the class namespace with more attributes. There are some special attributes that can be used inside this class that will change the behavior of the class that contains it. Those values are: `table`: The name of the table in the database. This is derived from ``style`` and the class name if no explicit name is given. If you don't give a name and haven't defined an alternative ``style``, then the standard `MixedCase` to `mixed_case` translation is performed. `idName`: The name of the primary key column in the database. This is derived from ``style`` if no explicit name is given. The default name is ``id``. `idType`: A function that coerces/normalizes IDs when setting IDs. This is ``int`` by default (all IDs are normalized to integers). `style`: A style object -- this object allows you to use other algorithms for translating between Python attribute and class names, and the database's column and table names. See `Changing the Naming Style`_ for more. It is an instance of the `IStyle` interface. `lazyUpdate`: A boolean (default false). If true, then setting attributes on instances (or using ``inst.set(.)`` will not send ``UPDATE`` queries immediately (you must call ``inst.syncUpdates()`` or ``inst.sync()`` first). `defaultOrder`: When selecting objects and not giving an explicit order, this attribute indicates the default ordering. It is like this value is passed to ``.select()`` and related methods; see those method's documentation for details. `cacheValues`: A boolean (default true). If true, then the values in the row are cached as long as the instance is kept (and ``inst.expire()`` is not called). If set to `False` then values for attributes from the database won't be cached. So every time you access an attribute in the object the database will be queried for a value, i.e., a ``SELECT`` will be issued. If you want to handle concurrent access to the database from multiple processes then this is probably the way to do so. `registry`: Because SQLObject uses strings to relate classes, and these strings do not respect module names, name clashes will occur if you put different systems together. 
This string value serves as a namespace for classes. `fromDatabase`: A boolean (default false). If true, then on class creation the database will be queried for the table's columns, and any missing columns (possible all columns) will be added automatically. Please be warned that not all connections fully implement database introspection. `dbEncoding`: UnicodeCol_ looks up `sqlmeta.dbEncoding` if `column.dbEncoding` is ``None`` (if `sqlmeta.dbEncoding` is ``None`` UnicodeCol_ looks up `connection.dbEncoding` and if `dbEncoding` isn't defined anywhere it defaults to ``"utf-8"``). For Python 3 there must be one encoding for connection - do not define different columns with different encodings, it's not implemented. .. _UnicodeCol: `Column Types`_ The following attributes provide introspection but should not be set directly - see `Runtime Column and Join Changes`_ for dynamically modifying these class elements. `columns`: A dictionary of ``{columnName: anSOColInstance}``. You can get information on the columns via this read-only attribute. `columnList`: A list of the values in ``columns``. Sometimes a stable, ordered version of the columns is necessary; this is used for that. `columnDefinitions`: A dictionary like ``columns``, but contains the original column definitions (which are not class-specific, and have no logic). `joins`: A list of all the Join objects for this class. `indexes`: A list of all the indexes for this class. `createSQL`: SQL queries run after table creation. createSQL can be a string with a single SQL command, a list of SQL commands, or a dictionary with keys that are dbNames and values that are either single SQL command string or a list of SQL commands. This is usually for ALTER TABLE commands. There is also one instance attribute: `expired`: A boolean. If true, then the next time this object's column attributes are accessed a query will be run. While in previous versions of SQLObject those attributes were defined directly at the class that will map your database data to Python and all of them were prefixed with an underscore, now it is suggested that you change your code to this new style. The old way was removed in SQLObject 0.8. Please note: when using InheritedSQLObject, sqlmeta attributes don't get inherited, e.g. you can't access via the sqlmeta.columns dictionary the parent's class column objects. Using sqlmeta ~~~~~~~~~~~~~ To use sqlmeta you should write code like this example:: class MyClass(SQLObject): class sqlmeta: lazyUpdate = True cacheValues = False columnA = StringCol() columnB = IntCol() def _set_attr1(self, value): # do something with value def _get_attr1(self): # do something to retrieve value The above definition is creating a table ``my_class`` (the name may be different if you change the ``style`` used) with two columns called columnA and columnB. There's also a third field that can be accessed using ``MyClass.attr1``. The sqlmeta class is changing the behavior of ``MyClass`` so that it will perform lazy updates (you'll have to call the ``.sync()`` method to write the updates to the database) and it is also telling that ``MyClass`` won't have any cache, so that every time you ask for some information it will be retrieved from the database. j-magic ~~~~~~~ There is a magic attribute `j` similar to q_ with attributes for ForeignKey and SQLMultipleJoin/SQLRelatedJoin, providing a shorthand for the SQLBuilder join expressions to traverse the given relationship. 
For example, for a ForeignKey AClass.j.someB is equivalent to (AClass.q.someBID==BClass.q.id), as is BClass.j.someAs for the matching SQLMultipleJoin. .. _q: q-magic_ SQLObject Class --------------- There is one special attribute - `_connection`. It is the connection defined for the table. `_connection`: The connection object to use, from `DBConnection`. You can also set the variable `__connection__` in the enclosing module and it will be picked up (be sure to define `__connection__` before your class). You can also pass a connection object in at instance creation time, as described in transactions_. If you have defined `sqlhub.processConnection` then this attribute can be omitted from your class and the sqlhub will be used instead. If you have several classes using the same connection that might be an advantage, besides saving a lot of typing. Customizing the Objects ----------------------- While we haven't done so in the examples, you can include your own methods in the class definition. Writing your own methods should be obvious enough (just do so like in any other class), but there are some other details to be aware of. Initializing the Objects ~~~~~~~~~~~~~~~~~~~~~~~~ There are two ways SQLObject instances can come into existence: they can be fetched from the database, or they can be inserted into the database. In both cases a new Python object is created. This makes the role of `__init__` a little confusing. In general, you should not touch `__init__`. Instead use the `_init` method, which is called after an object is fetched or inserted. This method has the signature ``_init(self, id, connection=None, selectResults=None)``, though you may just want to use ``_init(self, *args, **kw)``. **Note:** don't forget to call ``SQLObject._init(self, *args, **kw)`` if you override the method! Adding Magic Attributes (properties) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ You can use all the normal techniques for defining methods in this class, including `classmethod`, `staticmethod`, and `property`, but you can also use a shortcut. If you have a method that's name starts with ``_set_``, ``_get_``, ``_del_``, or ``_doc_``, it will be used to create a property. So, for instance, say you have images stored under the ID of the person in the ``/var/people/images`` directory:: class Person(SQLObject): # ... def imageFilename(self): return 'images/person-%s.jpg' % self.id def _get_image(self): if not os.path.exists(self.imageFilename()): return None f = open(self.imageFilename()) v = f.read() f.close() return v def _set_image(self, value): # assume we get a string for the image f = open(self.imageFilename(), 'w') f.write(value) f.close() def _del_image(self, value): # We usually wouldn't include a method like this, but for # instructional purposes... os.unlink(self.imageFilename()) Later, you can use the ``.image`` property just like an attribute, and the changes will be reflected in the filesystem by calling these methods. This is a good technique for information that is better to keep in files as opposed to the database (such as large, opaque data like images). You can also pass an ``image`` keyword argument to the constructor or the `set` method, like ``Person(..., image=imageText)``. All of the methods (``_get_``, ``_set_``, etc) are optional -- you can use any one of them without using the others. So you could define just a ``_get_attr`` method so that ``attr`` was read-only. 
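Using such a property then looks like plain attribute access (a sketch, assuming ``image_text`` already holds the image data as a string and the ``images/`` directory exists)::

    p = Person.get(1)
    p.image = image_text      # calls _set_image(), which writes images/person-1.jpg
    data = p.image            # calls _get_image(), which reads the file back
    p2 = Person(firstName='Ann', lastName='Lee', image=image_text)  # works in the constructor too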
Overriding Column Attributes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ It's a little more complicated if you want to override the behavior of an database column attribute. For instance, imagine there's special code you want to run whenever someone's name changes. In many systems you'd do some custom code, then call the superclass's code. But the superclass (``SQLObject``) doesn't know anything about the column in your subclass. It's even worse with properties. SQLObject creates methods like ``_set_lastName`` for each of your columns, but again you can't use this, since there's no superclass to reference (and you can't write ``SQLObject._set_lastName(...)``, because the SQLObject class doesn't know about your class's columns). You want to override that ``_set_lastName`` method yourself. To deal with this, SQLObject creates two methods for each getter and setter, for example: ``_set_lastName`` and ``_SO_set_lastName``. So to intercept all changes to ``lastName``:: class Person(SQLObject): lastName = StringCol() firstName = StringCol() def _set_lastName(self, value): self.notifyLastNameChange(value) self._SO_set_lastName(value) Or perhaps you want to constrain a phone numbers to be actual digits, and of proper length, and make the formatting nice:: import re class PhoneNumber(SQLObject): phoneNumber = StringCol(length=30) _garbageCharactersRE = re.compile(r'[\-\.\(\) ]') _phoneNumberRE = re.compile(r'^[0-9]+$') def _set_phoneNumber(self, value): value = self._garbageCharactersRE.sub('', value) if not len(value) >= 10: raise ValueError( 'Phone numbers must be at least 10 digits long') if not self._phoneNumberRE.match(value): raise ValueError, 'Phone numbers can contain only digits' self._SO_set_phoneNumber(value) def _get_phoneNumber(self): value = self._SO_get_phoneNumber() number = '(%s) %s-%s' % (value[0:3], value[3:6], value[6:10]) if len(value) > 10: number += ' ext.%s' % value[10:] return number .. note:: You should be a little cautious when modifying data that gets set in an attribute. Generally someone using your class will expect that the value they set the attribute to will be the same value they get back. In this example we removed some of the characters before putting it in the database, and reformatted it on the way out. One advantage of methods (as opposed to attribute access) is that the programmer is more likely to expect this disconnect. Also note while these conversions will take place when getting and setting the column, in queries the conversions will not take place. So if you convert the value from a "Pythonic" representation to a "SQLish" representation, your queries (when using ``.select()`` and ``.selectBy()``) will have to be in terms of the SQL/Database representation (as those commands generate SQL that is run on the database). Undefined attributes ~~~~~~~~~~~~~~~~~~~~ There's one more thing worth telling, because you may something get strange results when making a typo. SQLObject won't ever complain or raise any error when setting a previously undefined attribute; it will simply set it, without making any change to the database, i.e: it will work as any other attribute you set on any Python class, it will 'forget' it is a SQLObject class. This may sometimes be a problem: if you have got a 'name' attribute and you you write ``a.namme="Victor"`` once, when setting it, you'll get no error, no warning, nothing at all, and you may get crazy at understanding why you don't get that value set in your DB. 
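For example (a minimal sketch with the ``Person`` class from above)::

    p = Person.get(1)
    p.firstNamme = 'Victor'   # typo! -- silently sets an ordinary Python attribute
    # no UPDATE is sent; p.firstName and the first_name column are left unchanged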
Reference ========= The instructions above should tell you enough to get you started, and be useful for many situations. Now we'll show how to specify the class more completely. Col Class: Specifying Columns ----------------------------- The list of columns is a list of `Col` objects. These objects don't have functionality in themselves, but give you a way to specify the column. `dbName`: This is the name of the column in the database. If you don't give a name, your Pythonic name will be converted from mixed-case to underscore-separated. `default`: The default value for this column. Used when creating a new row. If you give a callable object or function, the function will be called, and the return value will be used. So you can give ``DateTimeCol.now`` to make the default value be the current time. Or you can use ``sqlbuilder.func.NOW()`` to have the database use the ``NOW()`` function internally. If you don't give a default there will be an exception if this column isn't specified in the call to `new`. `defaultSQL`: ``DEFAULT`` SQL attribute. `alternateID`: This boolean (default False) indicates if the column can be used as an ID for the field (for instance, a username), though it is not a primary key. If so a class method will be added, like ``byUsername`` which will return that object. Use `alternateMethodName` if you don't like the ``by*`` name (e.g. ``alternateMethodName="username"``). The column should be declared ``UNIQUE`` in your table schema. `unique`: If true, when SQLObject creates a table it will declare this column to be ``UNIQUE``. `notNone`: If true, None/``NULL`` is not allowed for this column. Useful if you are using SQLObject to create your tables. `sqlType`: The SQL type for this column (like ``INT``, ``BOOLEAN``, etc). You can use classes (defined below) for this, but if those don't work it's sometimes easiest just to use `sqlType`. Only necessary if SQLObject is creating your tables. `validator`: formencode_-like validator_. Making long story short, this is an object that provides ``to_python()`` and ``from_python()`` to validate *and* convert (adapt or cast) the values when they are read/written from/to the database. You should see formencode_ validator_ documentation for more details. This validator is appended to the end of the list of the list of column validators. If the column has a list of validators their ``from_python()`` methods are ran from the beginnig of the list to the end; ``to_python()`` in the reverse order. That said, ``from_python()`` method of this validator is called last, after all validators in the list; ``to_python()`` is called first. `validator2`: Another validator. It is inserted in the beginning of the list of the list of validators, i.e. its ``from_python()`` method is called first; ``to_python()`` last. .. _formencode: http://formencode.org/ .. _validator: http://www.formencode.org/en/latest/Validator.html Column Types ~~~~~~~~~~~~ The `ForeignKey` class should be used instead of `Col` when the column is a reference to another table/class. It is generally used like ``ForeignKey('Role')``, in this instance to create a reference to a table `Role`. This is largely equivalent to ``Col(foreignKey='Role', sqlType='INT')``. Two attributes will generally be created, ``role``, which returns a `Role` instance, and ``roleID``, which returns an integer ID for the related role. There are some other subclasses of `Col`. These are used to indicate different types of columns, when SQLObject creates your tables. `BLOBCol`: A column for binary data. 
Presently works only with MySQL, PostgreSQL and SQLite backends. `BoolCol`: Will create a ``BOOLEAN`` column in Postgres, or ``INT`` in other databases. It will also convert values to ``"t"/"f"`` or ``0/1`` according to the database backend. `CurrencyCol`: Equivalent to ``DecimalCol(size=10, precision=2)``. WARNING: as DecimalCol MAY NOT return precise numbers, this column may share the same behavior. Please read the DecimalCol warning. `DateTimeCol`: A date and time (usually returned as an datetime or mxDateTime object). `DateCol`: A date (usually returned as an datetime or mxDateTime object). `TimeCol`: A time (usually returned as an datetime or mxDateTime object). `TimestampCol`: Supports MySQL TIMESTAMP type. `DecimalCol`: Base-10, precise number. Uses the keyword arguments `size` for number of digits stored, and `precision` for the number of digits after the decimal point. WARNING: it may happen that DecimalCol values, although correctly stored in the DB, are returned as floats instead of decimals. For example, due to the `type affinity`_ SQLite stores decimals as integers or floats (NUMERIC storage class). You should test with your database adapter, and you should try importing the Decimal type and your DB adapter before importing SQLObject. .. _`type affinity`: http://sqlite.org/datatype3.html#affinity `DecimalStringCol`: Similar to `DecimalCol` but stores data as strings to work around problems in some drivers and type affinity problem in SQLite. As it stores data as strings the column cannot be used in SQL expressions (column1 + column2) and probably will has problems with ORDER BY. `EnumCol`: One of several string values -- give the possible strings as a list, with the `enumValues` keyword argument. MySQL has a native ``ENUM`` type, but will work with other databases too (storage just won't be as efficient). For PostgreSQL, EnumCol's are implemented using check constraints. Due to the way PostgreSQL handles check constraints involving NULL, specifying None as a member of an EnumCol will effectively mean that, at the SQL level, the check constraint will be ignored (see http://archives.postgresql.org/pgsql-sql/2004-12/msg00065.php for more details). `SetCol`: Supports MySQL SET type. `FloatCol`: Floats. `ForeignKey`: A key to another table/class. Use like ``user = ForeignKey('User')``. It can check for referential integrity using the keyword argument `cascade`, please see ForeignKey_ for details. `IntCol`: Integers. `JsonbCol`: A column for jsonb objects. Only supported on Postgres. Any Python object that can be serialized with json.dumps can be stored. `JSONCol`: A universal json column that converts simple Python objects (None, bool, int, float, long, dict, list, str/unicode to/from JSON using json.dumps/loads. A subclass of StringCol. `PickleCol`: An extension of BLOBCol; this column can store/retrieve any Python object; it actually (un)pickles the object from/to string and stores/retrieves the string. One can get and set the value of the column but cannot search (use it in WHERE). `StringCol`: A string (character) column. Extra keywords: `length`: If given, the type will be something like ``VARCHAR(length)``. If not given, then ``TEXT`` is assumed (i.e., lengthless). `varchar`: A boolean; if you have a length, differentiates between ``CHAR`` and ``VARCHAR``, default True, i.e., use ``VARCHAR``. `UnicodeCol`: A subclass of `StringCol`. 
Also accepts a `dbEncoding` keyword argument, it defaults to ``None`` which means to lookup `dbEncoding` in sqlmeta_ and connection, and if `dbEncoding` isn't defined anywhere it defaults to ``"utf-8"``. Values coming in and out from the database will be encoded and decoded. **Note**: there are some limitations on using UnicodeCol in queries: - only simple q-magic fields are supported; no expressions; - only == and != operators are supported; The following code works:: MyTable.select(u'value' == MyTable.q.name) MyTable.select(MyTable.q.name != u'value') MyTable.select(OR(MyTable.q.col1 == u'value1', MyTable.q.col2 != u'value2')) MyTable.selectBy(name = u'value') MyTable.selectBy(col1=u'value1', col2=u'value2') MyTable.byCol1(u'value1') # if col1 is an alternateID The following does not work:: MyTable.select((MyTable.q.name + MyTable.q.surname) == u'value') In that case you must apply the encoding yourself:: MyTable.select((MyTable.q.name + MyTable.q.surname) == u'value'.encode(dbEncoding)) `UuidCol`: A column for UUID. On Postgres uses 'UUID' data type, on all other backends uses VARCHAR(36). Relationships Between Classes/Tables ------------------------------------ ForeignKey ~~~~~~~~~~ You can use the `ForeignKey` to handle foreign references in a table, but for back references and many-to-many relationships you'll use joins. `ForeignKey` allows you to specify referential integrity using the keyword `cascade`, which can have these values: `None`: No action is taken on related deleted columns (this is the default). Following the Person/Address example, if you delete the object `Person` with id 1 (John Doe), the `Address` with id 1 (123 W Main St) will be kept untouched (with ``personID=1``). `False`: Deletion of an object that has other objects related to it using a `ForeignKey` will fail (sets ``ON DELETE RESTRICT``). Following the Person/Address example, if you delete the object `Person` with id 1 (John Doe) a `SQLObjectIntegrityError` exception will be raised, because the `Address` with id 1 (123 W Main St) has a reference (``personID=1``) to it. `True`: Deletion of an object that has other objects related to it using a `ForeignKey` will delete all the related objects too (sets ``ON DELETE CASCADE``). Following the Person/Address example, if you delete the object `Person` with id 1 (John Doe), the `Address` with id 1 (123 W Main St) will be deleted too. `'null'`: Deletion of an object that has other objects related to it using a `ForeignKey` will set the `ForeignKey` column to `NULL`/`None` (sets ``ON DELETE SET NULL``). Following the Person/Address example, if you delete the object `Person` with id 1 (John Doe), the `Address` with id 1 (123 W Main St) will be kept but the reference to person will be set to `NULL`/`None` (``personID=None``). MultipleJoin and SQLMultipleJoin: One-to-Many ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ See `One-to-Many Relationships`_ for an example of one-to-many relationships. MultipleJoin returns a list of results, while SQLMultipleJoin returns a SelectResults object. Several keyword arguments are allowed to the `MultipleJoin` constructor: .. _`Multiple Join Keywords`: `joinColumn`: The column name of the key that points to this table. So, if you have a table ``Product``, and another table has a column ``ProductNo`` that points to this table, then you'd use ``joinColumn="ProductNo"``. WARNING: the argument you pass must conform to the column name in the database, not to the column in the class. 
So, if you have a SQLObject containing the ``ProductNo`` column, this will probably be translated into ``product_no_id`` in the DB (``product_no`` is the normal uppercase- to-lowercase + underscores SQLO Translation, the added _id is just because the column referring to the table is probably a ForeignKey, and SQLO translates foreign keys that way). You should pass that parameter. `orderBy`: Like the `orderBy`_ argument to `select()`, you can specify the order that the joined objects should be returned in. `defaultOrder` will be used if not specified; ``None`` forces unordered results. `joinMethodName`: When adding joins dynamically (using the class method `addJoin`_), you can give the name of the accessor for the join. It can also be created automatically, and is normally implied (i.e., ``addresses = MultipleJoin(...)`` implies ``joinMethodName="addresses"``). RelatedJoin and SQLRelatedJoin: Many-to-Many ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ See `Many-to-Many Relationships`_ for examples of using many-to-many joins. RelatedJoin returns a list of results, while SQLRelatedJoin returns a SelectResults object. `RelatedJoin` has all the keyword arguments of `MultipleJoin`__, plus: __ `Multiple Join Keywords`_ `otherColumn`: Similar to `joinColumn`, but referring to the joined class. Same warning about column name. `intermediateTable`: The name of the intermediate table which references both classes. WARNING: you should pass the database table name, not the SQLO class representing. `addRemoveName`: In the `user/role example`__, the methods `addRole(role)` and `removeRole(role)` are created. The ``Role`` portion of these method names can be changed by giving a string value here. `createRelatedTable`: default: ``True``. If ``False``, then the related table won't be automatically created; instead you must manually create it (e.g., with explicit SQLObject classes for the joins). New in 0.7.1. .. note:: Let's suppose you have SQLObject-inherited classes Alpha and Beta, and an AlphasAndBetas used for the many-to-many relationship. AlphasAndBetas contains the alphaIndex Foreign Key column referring to Alpha, and the betaIndex FK column referring to Beta. if you want a 'betas' RelatedJoin in Alpha, you should add it to Alpha passing 'Beta' (class name!) as the first parameter, then passing 'alpha_index_id' as joinColumn, 'beta_index_id' as otherColumn, and 'alphas_and_betas' as intermediateTable. __ `Many-to-Many Relationships`_ An example schema that requires the use of `joinColumn`, `otherColumn`, and `intermediateTable`:: CREATE TABLE person ( id SERIAL, username VARCHAR(100) NOT NULL UNIQUE ); CREATE TABLE role ( id SERIAL, name VARCHAR(50) NOT NULL UNIQUE ); CREATE TABLE assigned_roles ( person INT NOT NULL, role INT NOT NULL ); Then the usage in a class:: class Person(SQLObject): username = StringCol(length=100, alternateID=True) roles = RelatedJoin('Role', joinColumn='person', otherColumn='role', intermediateTable='assigned_roles') class Role(SQLObject): name = StringCol(length=50, alternateID=True) roles = RelatedJoin('Person', joinColumn='role', otherColumn='person', intermediateTable='assigned_roles') SingleJoin: One-to-One ~~~~~~~~~~~~~~~~~~~~~~~~~ Similar to `MultipleJoin`, but returns just one object, not a list. Connection pooling ------------------ Connection object acquires a new low-level DB API connection from the pool and stores it; the low-level connection is removed from the pool; "releasing" means "return it to the pool". 
For single-threaded programs there is one connection in the pool. If the pool is empty a new low-level connection opened; if one has disabled pooling (by setting conn._pool = None) the connection will be closed instead of returning to the pool. Transactions ------------ Transaction support in SQLObject is left to the database. Transactions can be used like:: conn = DBConnection.PostgresConnection('yada') trans = conn.transaction() p = Person.get(1, trans) p.firstName = 'Bob' trans.commit() p.firstName = 'Billy' trans.rollback() The ``trans`` object here is essentially a wrapper around a single database connection, and `commit` and `rollback` just pass that message to the low-level connection. One can call as much ``.commit()``'s, but after a ``.rollback()`` one has to call ``.begin()``. The last ``.commit()`` should be called as ``.commit(close=True)`` to release low-level connection back to the connection pool. You can use SELECT FOR UPDATE in those databases that support it:: Person.select(Person.q.name=="value", forUpdate=True, connection=trans) Method ``sqlhub.doInTransaction`` can be used to run a piece of code in a transaction. The method accepts a callable and positional and keywords arguments. It begins a transaction using its ``processConnection`` or ``threadConnection``, calls the callable, commits the transaction and closes the underlying connection; it returns whatever the callable returned. If an error occurs during call to the callable it rolls the transaction back and reraise the exception. Automatic Schema Generation --------------------------- All the connections support creating and dropping tables based on the class definition. First you have to prepare your class definition, which means including type information in your columns. Indexes ~~~~~~~ You can also define indexes for your tables, which is only meaningful when creating your tables through SQLObject (SQLObject relies on the database to implement the indexes). You do this again with attribute assignment, like:: firstLastIndex = DatabaseIndex('firstName', 'lastName') This creates an index on two columns, useful if you are selecting a particular name. Of course, you can give a single column, and you can give the column object (``firstName``) instead of the string name. Note that if you use ``unique`` or ``alternateID`` (which implies ``unique``) the database may make an index for you, and primary keys are always indexed. If you give the keyword argument ``unique`` to `DatabaseIndex` you'll create a unique index -- the combination of columns must be unique. You can also use dictionaries in place of the column names, to add extra options. E.g.:: lastNameIndex = DatabaseIndex({'expression': 'lower(last_name)'}) In that case, the index will be on the lower-case version of the column. It seems that only PostgreSQL supports this. You can also do:: lastNameIndex = DatabaseIndex({'column': lastName, 'length': 10}) Which asks the database to only pay attention to the first ten characters. Only MySQL supports this, but it is ignored in other databases. Creating and Dropping Tables ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ To create a table call `createTable`. It takes two arguments: `ifNotExists`: If the table already exists, then don't try to create it. Default False. `createJoinTables`: If you used `Many-to-Many relationships`_, then the intermediate tables will be created (but only for one of the two involved classes). Default True. `dropTable` takes arguments `ifExists` and `dropJoinTables`, self-explanatory. 
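For example, with the classes defined earlier (a sketch)::

    Person.createTable(ifNotExists=True)    # does nothing if the table already exists
    Address.createTable(ifNotExists=True)
    Address.dropTable(ifExists=True)        # later, remove the table if it is there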
Dynamic Classes =============== SQLObject classes can be manipulated dynamically. This leaves open the possibility of constructing SQLObject classes from an XML file, from database introspection, or from a graphical interface. Automatic Class Generation --------------------------- SQLObject can read the table description from the database, and fill in the class columns (as would normally be described in the `_columns` attribute). Do this like:: class Person(SQLObject): class sqlmeta: fromDatabase = True You can still specify columns (in `_columns`), and only missing columns will be added. Runtime Column and Join Changes ------------------------------- You can add and remove columns to your class at runtime. Such changes will effect all instances, since changes are made in place to the class. There are two methods of the `class sqlmeta object`_, `addColumn` and `delColumn`, both of which take a `Col` object (or subclass) as an argument. There's also an option argument `changeSchema` which, if True, will add or drop the column from the database (typically with an ``ALTER`` command). When adding columns, you must pass the name as part of the column constructor, like ``StringCol("username", length=20)``. When removing columns, you can either use the Col object (as found in `sqlmeta.columns`, or which you used in `addColumn`), or you can use the column name (like ``MyClass.delColumn("username")``). .. _`class sqlmeta object`: `Class sqlmeta`_ .. _addJoin: You can also add Joins_, like ``MyClass.addJoin(MultipleJoin("MyOtherClass"))``, and remove joins with `delJoin`. `delJoin` does not take strings, you have to get the join object out of the `sqlmeta.joins` attribute. .. _Joins : `Relationships between Classes/Tables`_ Legacy Database Schemas ======================= Often you will have a database that already exists, and does not use the naming conventions that SQLObject expects, or does not use any naming convention at all. SQLObject requirements ---------------------- While SQLObject tries not to make too many requirements on your schema, some assumptions are made. Some of these may be relaxed in the future. All tables that you want to turn into a class need to have an integer primary key. That key should be defined like: MySQL: ``INT PRIMARY KEY AUTO_INCREMENT`` Postgres: ``SERIAL PRIMARY KEY`` SQLite: ``INTEGER PRIMARY KEY AUTOINCREMENT`` SQLObject does not support primary keys made up of multiple columns (that probably won't change). It does not generally support tables with primary keys with business meaning -- i.e., primary keys are assumed to be immutable (that won't change). At the moment foreign key column names must end in ``"ID"`` (case-insensitive). This restriction will probably be removed in the next release. Workaround for primary keys made up of multiple columns ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ If the database table/view has ONE NUMERIC Primary Key then sqlmeta - idName should be used to map the table column name to SQLObject id column. If the Primary Key consists only of number columns it is possible to create a virtual column ``id`` this way: Example for Postgresql: select '1'||lpad(PK1,max_length_of_PK1,'0')||lpad(PK2,max_length_of_PK2,'0')||...||lpad(PKn,max_length_of_PKn,'0') as "id", column_PK1, column_PK2, .., column_PKn, column... from table; Note: * The arbitrary '1' at the beginning of the string to allow for leading zeros of the first PK. * The application designer has to determine the maximum length of each Primary Key. 
This statement can be saved as a view, or the column can be added to the database table, where it can be kept up to date with a database trigger. Obviously the "view" method generally does not allow inserts, updates or deletes. For PostgreSQL you may want to consult the chapter "RULES" for manipulating underlying tables.

For an alphanumeric primary key column a similar method is possible: every character of the lpad'ed PK has to be transferred using ascii(character), which returns a three-digit number that can be concatenated as shown above.

Caveats:

* This way the ``id`` may become a very large integer number, which may cause trouble elsewhere.

* No performance loss takes place if the where clause specifies the PK columns.

Example: CD-Album

* Album: PK=ean
* Tracks: PK=ean,disc_nr,track_nr

The database view to show the tracks starts::

    SELECT ean||lpad("disc_nr",2,'0')||lpad("track_nr",2,'0') as id, ...

Note: no leading '1' and no padding is necessary for ean numbers. Tracks can then be selected with ``Tracks.select(Tracks.q.ean==id)``, where ``id`` is the ean of the Album.

Changing the Naming Style
-------------------------

By default names in SQLObject are expected to be mixed case in Python (like ``mixedCase``), and underscore-separated in SQL (like ``mixed_case``). This applies to table and column names. The primary key is assumed to be simply ``id``.

Other styles exist. A typical one is mixed case column names, and a primary key that includes the table name, like ``ProductID``. You can use a different `Style` object to indicate a different naming convention. For instance::

    class Person(SQLObject):
        class sqlmeta:
            style = MixedCaseStyle(longID=True)
        firstName = StringCol()
        lastName = StringCol()

If you use ``Person.createTable()``, you'll get::

    CREATE TABLE Person (
        PersonID INT PRIMARY KEY,
        FirstName Text,
        LastName Text
    )

The `MixedCaseStyle` object handles the initial capitalization of words, but otherwise leaves them be. By using ``longID=True``, we indicate that the primary key should look like a normal reference (``PersonID`` for `MixedCaseStyle`, or ``person_id`` for the default style).

If you wish to change the style globally, assign the style to the connection, like::

    __connection__.style = MixedCaseStyle(longID=True)

Irregular Naming
----------------

This is now covered in the `Class sqlmeta`_ section.

Non-Integer Keys
----------------

While not strictly a legacy database issue, this fits into the category of "irregularities". If you use non-integer keys, all primary key management is up to you. You must create the table yourself (SQLObject can create tables with int or str IDs), and when you create instances you must pass an ``id`` keyword argument into the constructor (like ``Person(id='555-55-5555', ...)``).

DBConnection: Database Connections
==================================

The `DBConnection` module currently has six external classes: `MySQLConnection`, `PostgresConnection`, `SQLiteConnection`, `SybaseConnection`, `MaxdbConnection` and `MSSQLConnection`.

You can pass the keyword argument `debug` to any connector. If set to true, then any SQL sent to the database will also be printed to the console. You can additionally pass a `logger` keyword argument, which should be the name of the logger to use. If specified and `debug` is ``True``, SQLObject will write debug print statements via that logger instead of printing directly to the console. The argument `loglevel` allows choosing the logging level -- it can be ``debug``, ``info``, ``warning``, ``error``, ``critical`` or ``exception``.
In case `logger` is absent or empty, SQLObject uses plain ``print`` instead of logging; `loglevel` can be ``stdout`` or ``stderr`` in this case; the default is ``stdout``.

To configure logging one can do something like this::

    import logging
    logging.basicConfig(
        filename='test.log',
        format='[%(asctime)s] %(name)s %(levelname)s: %(message)s',
        level=logging.DEBUG,
    )
    log = logging.getLogger("TEST")
    log.info("Log started")

    __connection__ = "sqlite:/:memory:?debug=1&logger=TEST&loglevel=debug"

This code redirects SQLObject debug messages to the `test.log` file.

MySQL
-----

`MySQLConnection` takes the keyword arguments `host`, `port`, `db`, `user`, and `password`, just like `MySQLdb.connect` does.

MySQLConnection supports all the features, though MySQL only supports transactions_ when using the InnoDB backend; SQLObject can explicitly define the backend using ``sqlmeta.createSQL``.

Supported drivers are ``mysqldb``, ``connector``, ``oursql``, ``pymysql``, ``pyodbc``, ``pypyodbc`` and ``odbc`` (tries ``pyodbc`` and ``pypyodbc``); the default is ``mysqldb``.

Keyword argument ``conv`` allows passing custom converters. Example::

    import time
    import sqlobject
    import MySQLdb.converters

    def _mysql_timestamp_converter(raw):
        """Convert a MySQL TIMESTAMP to a floating point number representing
        the seconds since the Un*x Epoch. It uses custom code if the input
        seems to be the new (MySQL 4.1+) timestamp format; otherwise code
        from the MySQLdb module is used."""
        if raw[4] == '-':
            return time.mktime(time.strptime(raw, '%Y-%m-%d %H:%M:%S'))
        else:
            return MySQLdb.converters.mysql_timestamp_converter(raw)

    conversions = MySQLdb.converters.conversions.copy()
    conversions[MySQLdb.constants.FIELD_TYPE.TIMESTAMP] = _mysql_timestamp_converter
    MySQLConnection = sqlobject.mysql.builder()
    connection = MySQLConnection(user='foo', db='somedb', conv=conversions)

Connection-specific parameters are: ``unix_socket``, ``init_command``, ``read_default_file``, ``read_default_group``, ``conv``, ``connect_timeout``, ``compress``, ``named_pipe``, ``use_unicode``, ``client_flag``, ``local_infile``, ``ssl_key``, ``ssl_cert``, ``ssl_ca``, ``ssl_capath``, ``charset``.

Postgres
--------

`PostgresConnection` takes a single connection string, like ``"dbname=something user=some_user"``, just like `psycopg.connect`. You can also use the same keyword arguments as for `MySQLConnection`, and a dsn string will be constructed. PostgresConnection supports transactions and all other features.

The user can choose a DB API driver for PostgreSQL by using a ``driver`` parameter in the DB URI or `PostgresConnection` that can be a comma-separated list of driver names. Possible drivers are: ``psycopg2``, ``psycopg1``, ``psycopg`` (tries ``psycopg2`` and ``psycopg1``), ``pygresql``, ``pypostgresql``, ``pyodbc``, ``pypyodbc`` or ``odbc`` (tries ``pyodbc`` and ``pypyodbc``). The default is ``psycopg``.

Connection-specific parameters are: ``sslmode``, ``unicodeCols``, ``schema``, ``charset``.

SQLite
------

`SQLiteConnection` takes a single string, which is the path to the database file.

SQLite puts all data into one file, with a journal file that is opened in the same directory during operation (the file is deleted when the program quits). SQLite does not restrict the types you can put in a column -- strings can go in integer columns, dates in integer columns, etc.

SQLite may have concurrency issues, depending on your usage in a multi-threaded environment.
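For reference, a couple of hedged examples of opening SQLite connections through ``connectionForURI`` (the file path below is made up for the illustration)::

    from sqlobject import connectionForURI

    # In-memory database, with SQL debugging switched on
    conn = connectionForURI('sqlite:/:memory:?debug=1')

    # File-based database; note the extra slash before the absolute path
    conn = connectionForURI('sqlite:///home/user/data/app.db')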
The user can choose a DB API driver for SQLite by using a ``driver`` parameter in the DB URI or `SQLiteConnection` that can be a comma-separated list of driver names. Possible drivers are: ``pysqlite2`` (alias ``sqlite2``), ``sqlite3``, ``sqlite`` (alias ``sqlite1``). The default is to test ``pysqlite2``, ``sqlite3`` and ``sqlite`` in that order.

Connection-specific parameters are: ``encoding``, ``mode``, ``timeout``, ``check_same_thread``, ``use_table_info``.

Firebird
--------

`FirebirdConnection` takes the arguments `host`, `db`, `user` (default ``"sysdba"``), `password` (default ``"masterkey"``). Firebird supports all the features. Support is still young, so there may be some issues, especially with concurrent access, and especially using lazy selects. Try ``list(MyClass.select())`` to avoid concurrent cursors if you have problems (using ``list()`` will pre-fetch all the results of a select).

Firebird supports the ``fdb``, ``kinterbasdb`` or ``firebirdsql`` drivers. The default is to try ``fdb`` and ``kinterbasdb``.

There could be a problem if one tries to connect to a server running on w32 from a program running on Unix; the problem is how to specify the database so that SQLObject correctly parses it. The vertical bar is replaced by a semicolon only on w32; on Unix a vertical bar is a pretty normal character and must not be processed. The most correct way to fix the problem is to connect to the DB using a database name, not a file name. In Firebird a DBA can set an alias instead of a database name in the aliases.conf file. Example from the `Firebird 2.0 Administrators Manual`_::

    # fbdb1 is on a Windows server:
    fbdb1 = c:\Firebird\sample\Employee.fdb

.. _`Firebird 2.0 Administrators Manual`: http://www.firebirdmanual.com/firebird/en/firebird-manual/2

Now a program can connect to ``firebird://host:port/fbdb1``. One can edit aliases.conf whilst the server is running. There is no need to stop and restart the server in order for new aliases.conf entries to be recognised.

If you are using indexes and get an error like *key size exceeds implementation restriction for index*, see `this page`_ to understand the restrictions on your indexing.

.. _this page: http://mujweb.cz/iprenosil/interbase/ip_ib_indexcalculator.htm

Connection-specific parameters are: ``dialect``, ``role``, ``charset``.

Sybase
------

`SybaseConnection` takes the arguments `host`, `db`, `user`, and `password`. It also takes the extra boolean argument `locking` (default True), which is passed through when performing a connection. You may use a False value for `locking` if you are not using multiple threads, for a slight performance boost. It uses the Sybase_ module.

Connection-specific parameters are: ``locking``, ``autoCommit``.

MAX DB
------

MAX DB, also known as SAP DB, is available from a partnership of SAP and MySQL. It takes the typical arguments: `host`, `database`, `user`, `password`. It also takes the arguments `sqlmode` (default ``"internal"``), `isolation`, and `timeout`, which are passed through when creating the connection to the database. It uses the sapdb_ module.

Connection-specific parameters are: ``autoCommit``, ``sqlmode``, ``isolation``, ``timeout``.

MS SQL Server
-------------

The `MSSQLConnection` object wants to use new-style connection strings in the format ``mssql://user:pass@host:port/db``. This will then be mapped to the correct driver format. If running SQL Server on a "named" port, make sure to specify the port number in the URI. The two drivers currently supported are adodbapi_ and pymssql_.
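For example, a connection might be opened from such a URI like this (server name, credentials and database name are, of course, placeholders)::

    from sqlobject import connectionForURI

    conn = connectionForURI('mssql://user:password@dbhost:1433/mydatabase')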
The user can choose a DB API driver for MSSQL by using a ``driver`` parameter in the DB URI or `MSSQLConnection` that can be a comma-separated list of driver names. Possible drivers are: ``adodb`` (alias ``adodbapi``) and ``pymssql``. The default is to test ``adodbapi`` and ``pymssql`` in that order.

Connection-specific parameters are: ``autoCommit``, ``timeout``.

Events (signals)
================

Signals are a mechanism to be notified when data or schema changes happen through SQLObject. This may be useful for doing custom data validation, logging changes, setting default attributes, etc. Some of what signals can do is also possible by overriding methods, but signals may provide a cleaner way, especially across classes not related by inheritance.

Example::

    from sqlobject.events import listen, RowUpdateSignal, RowCreatedSignal
    from model import Users

    def update_listener(instance, kwargs):
        """Keep the "last_updated" field current."""
        import datetime
        # BAD method 1, causes infinite recursion?
        # instance should be read-only
        instance.last_updated = datetime.datetime.now()
        # GOOD method 2
        kwargs['last_updated'] = datetime.datetime.now()

    def created_listener(instance, kwargs, post_funcs):
        """Email me when new users are added."""
        # email() implementation left as an exercise for the reader
        msg = "%s was just added to the database!" % kwargs['name']
        email(msg)

    listen(update_listener, Users, RowUpdateSignal)
    listen(created_listener, Users, RowCreatedSignal)

Exported Symbols
================

You can use ``from sqlobject import *``, though you don't have to. It exports a minimal number of symbols. The symbols exported:

From `sqlobject.main`:

* `NoDefault`
* `SQLObject`
* `getID`
* `getObject`

From `sqlobject.col`:

* `Col`
* `StringCol`
* `IntCol`
* `FloatCol`
* `KeyCol`
* `ForeignKey`
* `EnumCol`
* `SetCol`
* `DateTimeCol`
* `DateCol`
* `TimeCol`
* `TimestampCol`
* `DecimalCol`
* `CurrencyCol`

From `sqlobject.joins`:

* `MultipleJoin`
* `RelatedJoin`

From `sqlobject.styles`:

* `Style`
* `MixedCaseUnderscoreStyle`
* `DefaultStyle`
* `MixedCaseStyle`

From `sqlobject.sqlbuilder`:

* `AND`
* `OR`
* `NOT`
* `IN`
* `LIKE`
* `DESC`
* `CONTAINSSTRING`
* `const`
* `func`

LEFT JOIN and other JOINs
-------------------------

First look in the FAQ_, question "How can I do a LEFT JOIN?"

Still here? Well. To perform a JOIN use one of the JOIN helpers from SQLBuilder_. Pass an instance of the helper to the .select() method. For example::

    from sqlobject.sqlbuilder import LEFTJOINOn
    MyTable.select(
        join=LEFTJOINOn(Table1, Table2, Table1.q.name == Table2.q.value))

will generate the query::

    SELECT my_table.* FROM my_table, table1
    LEFT JOIN table2 ON table1.name = table2.value;

.. _FAQ: FAQ.html#how-can-i-do-a-left-join

If you want to join with the primary table, leave the first table None::

    MyTable.select(
        join=LEFTJOINOn(None, Table1, MyTable.q.name == Table1.q.value))

will generate the query::

    SELECT my_table.* FROM my_table
    LEFT JOIN table1 ON my_table.name = table1.value;

The join argument for .select() can be a JOIN() or a sequence (list/tuple) of JOIN()s.
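For instance, several joins can be combined in one call; the tables and column names below are invented for the illustration::

    from sqlobject.sqlbuilder import INNERJOINOn, LEFTJOINOn

    MyTable.select(
        join=[INNERJOINOn(None, Table1, MyTable.q.t1ID == Table1.q.id),
              LEFTJOINOn(Table1, Table2, Table1.q.name == Table2.q.value)])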
Available joins are JOIN, INNERJOIN, CROSSJOIN, STRAIGHTJOIN, LEFTJOIN, LEFTOUTERJOIN, NATURALJOIN, NATURALLEFTJOIN, NATURALLEFTOUTERJOIN, RIGHTJOIN, RIGHTOUTERJOIN, NATURALRIGHTJOIN, NATURALRIGHTOUTERJOIN, FULLJOIN, FULLOUTERJOIN, NATURALFULLJOIN, NATURALFULLOUTERJOIN, INNERJOINOn, LEFTJOINOn, LEFTOUTERJOINOn, RIGHTJOINOn, RIGHTOUTERJOINOn, FULLJOINOn, FULLOUTERJOINOn, INNERJOINUsing, LEFTJOINUsing, LEFTOUTERJOINUsing, RIGHTJOINUsing, RIGHTOUTERJOINUsing, FULLJOINUsing, FULLOUTERJOINUsing.

How can I join a table with itself?
-----------------------------------

Use Alias from SQLBuilder_. Example::

    from sqlobject.sqlbuilder import Alias
    alias = Alias(MyTable, "my_table_alias")
    MyTable.select(MyTable.q.name == alias.q.value)

will generate the query::

    SELECT my_table.* FROM my_table, my_table AS my_table_alias
    WHERE my_table.name = my_table_alias.value;

Can I use a JOIN() with aliases?
--------------------------------

Sure! That's a situation the JOINs and aliases were primarily developed for. Code::

    from sqlobject.sqlbuilder import LEFTJOINOn, Alias
    alias = Alias(OtherTable, "other_table_alias")
    MyTable.select(MyTable.q.name == OtherTable.q.value,
                   join=LEFTJOINOn(MyTable, alias, MyTable.q.col1 == alias.q.col2))

will result in the query::

    SELECT my_table.* FROM other_table, my_table
    LEFT JOIN other_table AS other_table_alias
    ON my_table.col1 = other_table_alias.col2
    WHERE my_table.name = other_table.value;

Subqueries (subselects)
-----------------------

You can run queries with subqueries (subselects) on those DBMS that can do subqueries (MySQL supports subqueries from version 4.1). Use the corresponding classes and functions from SQLBuilder_::

    from sqlobject.sqlbuilder import EXISTS, Outer, Select
    select = Test1.select(EXISTS(Select(Test2.q.col2,
                          where=(Outer(Test1).q.col1 == Test2.q.col2))))

generates the query::

    SELECT test1.id, test1.col1 FROM test1 WHERE EXISTS
        (SELECT test2.col2 FROM test2 WHERE (test1.col1 = test2.col2))

Note the usage of ``Outer`` -- it is a helper that allows referring to a table in the outer query. ``Select()`` is used instead of ``.select()`` because you need to control what columns the inner query returns.

Available queries are ``IN()``, ``NOTIN()``, ``EXISTS()``, ``NOTEXISTS()``, ``SOME()``, ``ANY()`` and ``ALL()``. The last three are used with comparison operators, like this: ``somevalue = ANY(Select(...))``.

Utilities
---------

Some useful utility functions are included with SQLObject. For more information see their module docstrings.

* ``sqlobject.util.csvexport``

SQLBuilder
----------

For more information on SQLBuilder, read the `SQLBuilder Documentation`_.

.. _SQLBuilder: SQLBuilder.html

.. _`SQLBuilder Documentation`: SQLBuilder.html

.. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10
   :target: https://sourceforge.net/projects/sqlobject
   :class: noborder
   :align: center
   :height: 15
   :width: 80
   :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads

SQLObject-3.4.0/docs/Views.rst0000644000175000017500000000453012752476767015454 0ustar phdphd00000000000000Views and SQLObjects
====================

In general, if your database backend supports defining views you may define them outside of SQLObject and treat them as a regular table when defining your SQLObject class.

ViewSQLObject
-------------

The rest of this document is experimental.

``from sqlobject.views import *``

``ViewSQLObject`` is an attempt to allow defining views that let you define a SQL query that acts like a SQLObject class.
You define columns based on other SQLObject classes .q SQLBuilder columns, have columns that are aggregates of other columns, and join multiple SQLObject classes into one and add restrictions using SQLBuilder expressions. The resulting classes are currently read only, if you find use for this idea please bring discussion to the mailing list. A short example from the tests will suffice for now. Base classes:: class PhoneNumber(SQLObject): number = StringCol() calls = SQLMultipleJoin('PhoneCall') incoming = SQLMultipleJoin('PhoneCall', joinColumn='toID') class PhoneCall(SQLObject): phoneNumber = ForeignKey('PhoneNumber') to = ForeignKey('PhoneNumber') minutes = IntCol() View classes:: class ViewPhoneCall(ViewSQLObject): class sqlmeta: idName = PhoneCall.q.id clause = PhoneCall.q.phoneNumberID==PhoneNumber.q.id minutes = IntCol(dbName=PhoneCall.q.minutes) number = StringCol(dbName=PhoneNumber.q.number) phoneNumber = ForeignKey('PhoneNumber', dbName=PhoneNumber.q.id) call = ForeignKey('PhoneCall', dbName=PhoneCall.q.id) class ViewPhone(ViewSQLObject): class sqlmeta: idName = PhoneNumber.q.id clause = PhoneCall.q.phoneNumberID==PhoneNumber.q.id minutes = IntCol(dbName=func.SUM(PhoneCall.q.minutes)) numberOfCalls = IntCol(dbName=func.COUNT(PhoneCall.q.phoneNumberID)) number = StringCol(dbName=PhoneNumber.q.number) phoneNumber = ForeignKey('PhoneNumber', dbName=PhoneNumber.q.id) calls = SQLMultipleJoin('PhoneCall', joinColumn='phoneNumberID') vCalls = SQLMultipleJoin('ViewPhoneCall', joinColumn='phoneNumberID') .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/api/0000755000175000017500000000000013141371614014356 5ustar phdphd00000000000000SQLObject-3.4.0/docs/api/sqlobject.versioning.test.test_version.rst0000644000175000017500000000031512760105144024757 0ustar phdphd00000000000000sqlobject.versioning.test.test_version module ============================================= .. automodule:: sqlobject.versioning.test.test_version :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.include.tests.test_hashcol.rst0000644000175000017500000000030712760105144024337 0ustar phdphd00000000000000sqlobject.include.tests.test_hashcol module =========================================== .. automodule:: sqlobject.include.tests.test_hashcol :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.dbtest.rst0000644000175000017500000000023512760105144021522 0ustar phdphd00000000000000sqlobject.tests.dbtest module ============================= .. automodule:: sqlobject.tests.dbtest :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_slice.rst0000644000175000017500000000025112760105144022371 0ustar phdphd00000000000000sqlobject.tests.test_slice module ================================= .. automodule:: sqlobject.tests.test_slice :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.postgres.pgconnection.rst0000644000175000017500000000027012760105144023426 0ustar phdphd00000000000000sqlobject.postgres.pgconnection module ====================================== .. 
automodule:: sqlobject.postgres.pgconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.versioning.rst0000644000175000017500000000034012760105144021254 0ustar phdphd00000000000000sqlobject.versioning package ============================ .. automodule:: sqlobject.versioning :members: :undoc-members: :show-inheritance: Subpackages ----------- .. toctree:: sqlobject.versioning.test SQLObject-3.4.0/docs/api/sqlobject.tests.test_aliases.rst0000644000175000017500000000025712760105144022721 0ustar phdphd00000000000000sqlobject.tests.test_aliases module =================================== .. automodule:: sqlobject.tests.test_aliases :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.mysql.rst0000644000175000017500000000032412760105144020240 0ustar phdphd00000000000000sqlobject.mysql package ======================= .. automodule:: sqlobject.mysql :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.mysql.mysqlconnection SQLObject-3.4.0/docs/api/sqlobject.tests.test_unicode.rst0000644000175000017500000000025712760105144022726 0ustar phdphd00000000000000sqlobject.tests.test_unicode module =================================== .. automodule:: sqlobject.tests.test_unicode :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.util.csvimport.rst0000644000175000017500000000024312760105144022075 0ustar phdphd00000000000000sqlobject.util.csvimport module =============================== .. automodule:: sqlobject.util.csvimport :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.constraints.rst0000644000175000017500000000023212760105144021440 0ustar phdphd00000000000000sqlobject.constraints module ============================ .. automodule:: sqlobject.constraints :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_inheritance.rst0000644000175000017500000000027312760105144023567 0ustar phdphd00000000000000sqlobject.tests.test_inheritance module ======================================= .. automodule:: sqlobject.tests.test_inheritance :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_NoneValuedResultItem.rst0000644000175000017500000000032612760105144025353 0ustar phdphd00000000000000sqlobject.tests.test_NoneValuedResultItem module ================================================ .. automodule:: sqlobject.tests.test_NoneValuedResultItem :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlbuilder_importproxy.rst0000644000175000017500000000033412760105144026136 0ustar phdphd00000000000000sqlobject.tests.test_sqlbuilder_importproxy module ================================================== .. automodule:: sqlobject.tests.test_sqlbuilder_importproxy :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.versioning.test.rst0000644000175000017500000000037112760105144022236 0ustar phdphd00000000000000sqlobject.versioning.test package ================================= .. automodule:: sqlobject.versioning.test :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.versioning.test.test_version SQLObject-3.4.0/docs/api/sqlobject.maxdb.rst0000644000175000017500000000032412760105144020166 0ustar phdphd00000000000000sqlobject.maxdb package ======================= .. automodule:: sqlobject.maxdb :members: :undoc-members: :show-inheritance: Submodules ---------- .. 
toctree:: sqlobject.maxdb.maxdbconnection SQLObject-3.4.0/docs/api/sqlobject.tests.test_views.rst0000644000175000017500000000025112760105144022427 0ustar phdphd00000000000000sqlobject.tests.test_views module ================================= .. automodule:: sqlobject.tests.test_views :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.rdbhost.rdbhostconnection.rst0000644000175000017500000000030412760105144024262 0ustar phdphd00000000000000sqlobject.rdbhost.rdbhostconnection module ========================================== .. automodule:: sqlobject.rdbhost.rdbhostconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_boundattributes.rst0000644000175000017500000000030712760105144024512 0ustar phdphd00000000000000sqlobject.tests.test_boundattributes module =========================================== .. automodule:: sqlobject.tests.test_boundattributes :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_ForeignKey.rst0000644000175000017500000000027012760105144023335 0ustar phdphd00000000000000sqlobject.tests.test_ForeignKey module ====================================== .. automodule:: sqlobject.tests.test_ForeignKey :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.manager.rst0000644000175000017500000000032412760105144020505 0ustar phdphd00000000000000sqlobject.manager package ========================= .. automodule:: sqlobject.manager :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.manager.command SQLObject-3.4.0/docs/api/sqlobject.tests.test_delete.rst0000644000175000017500000000025412760105144022537 0ustar phdphd00000000000000sqlobject.tests.test_delete module ================================== .. automodule:: sqlobject.tests.test_delete :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_paste.rst0000644000175000017500000000025112760105144022406 0ustar phdphd00000000000000sqlobject.tests.test_paste module ================================= .. automodule:: sqlobject.tests.test_paste :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_perConnection.rst0000644000175000017500000000030112760105144024074 0ustar phdphd00000000000000sqlobject.tests.test_perConnection module ========================================= .. automodule:: sqlobject.tests.test_perConnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_schema.rst0000644000175000017500000000025412760105144022535 0ustar phdphd00000000000000sqlobject.tests.test_schema module ================================== .. automodule:: sqlobject.tests.test_schema :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sorting.rst0000644000175000017500000000025712760105144022765 0ustar phdphd00000000000000sqlobject.tests.test_sorting module =================================== .. automodule:: sqlobject.tests.test_sorting :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.include.tests.rst0000644000175000017500000000036112760105144021660 0ustar phdphd00000000000000sqlobject.include.tests package =============================== .. automodule:: sqlobject.include.tests :members: :undoc-members: :show-inheritance: Submodules ---------- .. 
toctree:: sqlobject.include.tests.test_hashcol SQLObject-3.4.0/docs/api/sqlobject.firebird.rst0000644000175000017500000000034312760105144020662 0ustar phdphd00000000000000sqlobject.firebird package ========================== .. automodule:: sqlobject.firebird :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.firebird.firebirdconnection SQLObject-3.4.0/docs/api/sqlobject.sybase.sybaseconnection.rst0000644000175000017500000000027612760105144023734 0ustar phdphd00000000000000sqlobject.sybase.sybaseconnection module ======================================== .. automodule:: sqlobject.sybase.sybaseconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_inheritance_tree.rst0000644000175000017500000000035612760105144027100 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_inheritance_tree module ======================================================== .. automodule:: sqlobject.inheritance.tests.test_inheritance_tree :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_cyclic_reference.rst0000644000175000017500000000031212760105144024554 0ustar phdphd00000000000000sqlobject.tests.test_cyclic_reference module ============================================ .. automodule:: sqlobject.tests.test_cyclic_reference :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_lazy.rst0000644000175000017500000000024612760105144022255 0ustar phdphd00000000000000sqlobject.tests.test_lazy module ================================ .. automodule:: sqlobject.tests.test_lazy :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.classregistry.rst0000644000175000017500000000024012760105144021766 0ustar phdphd00000000000000sqlobject.classregistry module ============================== .. automodule:: sqlobject.classregistry :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_SQLMultipleJoin.rst0000644000175000017500000000030712760105144024267 0ustar phdphd00000000000000sqlobject.tests.test_SQLMultipleJoin module =========================================== .. automodule:: sqlobject.tests.test_SQLMultipleJoin :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_picklecol.rst0000644000175000017500000000026512760105144023244 0ustar phdphd00000000000000sqlobject.tests.test_picklecol module ===================================== .. automodule:: sqlobject.tests.test_picklecol :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_foreignKey.rst0000644000175000017500000000033412760105144025666 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_foreignKey module ================================================== .. automodule:: sqlobject.inheritance.tests.test_foreignKey :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.manager.command.rst0000644000175000017500000000024612760105144022125 0ustar phdphd00000000000000sqlobject.manager.command module ================================ .. automodule:: sqlobject.manager.command :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.cache.rst0000644000175000017500000000021012760105144020130 0ustar phdphd00000000000000sqlobject.cache module ====================== .. 
automodule:: sqlobject.cache :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_blob.rst0000644000175000017500000000024612760105144022214 0ustar phdphd00000000000000sqlobject.tests.test_blob module ================================ .. automodule:: sqlobject.tests.test_blob :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.sqlbuilder.rst0000644000175000017500000000022712760105144021243 0ustar phdphd00000000000000sqlobject.sqlbuilder module =========================== .. automodule:: sqlobject.sqlbuilder :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.index.rst0000644000175000017500000000021012760105144020174 0ustar phdphd00000000000000sqlobject.index module ====================== .. automodule:: sqlobject.index :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_converters.rst0000644000175000017500000000027012760105144023465 0ustar phdphd00000000000000sqlobject.tests.test_converters module ====================================== .. automodule:: sqlobject.tests.test_converters :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlmeta_idName.rst0000644000175000017500000000030412760105144024214 0ustar phdphd00000000000000sqlobject.tests.test_sqlmeta_idName module ========================================== .. automodule:: sqlobject.tests.test_sqlmeta_idName :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.compat.rst0000644000175000017500000000021312760105144020353 0ustar phdphd00000000000000sqlobject.compat module ======================= .. automodule:: sqlobject.compat :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.dbconnection.rst0000644000175000017500000000023512760105144021541 0ustar phdphd00000000000000sqlobject.dbconnection module ============================= .. automodule:: sqlobject.dbconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_new_joins.rst0000644000175000017500000000026512760105144023272 0ustar phdphd00000000000000sqlobject.tests.test_new_joins module ===================================== .. automodule:: sqlobject.tests.test_new_joins :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_destroy_cascade.rst0000644000175000017500000000035312775621605026734 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_destroy_cascade module ======================================================= .. automodule:: sqlobject.inheritance.tests.test_destroy_cascade :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_complex_sorting.rst0000644000175000017500000000030712760105144024510 0ustar phdphd00000000000000sqlobject.tests.test_complex_sorting module =========================================== .. automodule:: sqlobject.tests.test_complex_sorting :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.rdbhost.rst0000644000175000017500000000033612760105144020543 0ustar phdphd00000000000000sqlobject.rdbhost package ========================= .. automodule:: sqlobject.rdbhost :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.rdbhost.rdbhostconnection SQLObject-3.4.0/docs/api/sqlobject.sybase.rst0000644000175000017500000000033112760105144020357 0ustar phdphd00000000000000sqlobject.sybase package ======================== .. 
automodule:: sqlobject.sybase :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.sybase.sybaseconnection SQLObject-3.4.0/docs/api/sqlobject.util.moduleloader.rst0000644000175000017500000000025412760105144022525 0ustar phdphd00000000000000sqlobject.util.moduleloader module ================================== .. automodule:: sqlobject.util.moduleloader :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/modules.rst0000644000175000017500000000010012760105144016546 0ustar phdphd00000000000000sqlobject ========= .. toctree:: :maxdepth: 4 sqlobject SQLObject-3.4.0/docs/api/sqlobject.tests.test_groupBy.rst0000644000175000017500000000025712760105144022727 0ustar phdphd00000000000000sqlobject.tests.test_groupBy module =================================== .. automodule:: sqlobject.tests.test_groupBy :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.maxdb.maxdbconnection.rst0000644000175000017500000000027012760105144023340 0ustar phdphd00000000000000sqlobject.maxdb.maxdbconnection module ====================================== .. automodule:: sqlobject.maxdb.maxdbconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.util.csvexport.rst0000644000175000017500000000024312760105144022104 0ustar phdphd00000000000000sqlobject.util.csvexport module =============================== .. automodule:: sqlobject.util.csvexport :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_asdict.rst0000644000175000017500000000032012760105144025026 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_asdict module ============================================== .. automodule:: sqlobject.inheritance.tests.test_asdict :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_aggregates.rst0000644000175000017500000000033412760105144025675 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_aggregates module ================================================== .. automodule:: sqlobject.inheritance.tests.test_aggregates :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.sqlite.sqliteconnection.rst0000644000175000017500000000027612760105144023762 0ustar phdphd00000000000000sqlobject.sqlite.sqliteconnection module ======================================== .. automodule:: sqlobject.sqlite.sqliteconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_inheritance.rst0000644000175000017500000000033712760105144026060 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_inheritance module =================================================== .. automodule:: sqlobject.inheritance.tests.test_inheritance :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_indexes.rst0000644000175000017500000000032312760105144025221 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_indexes module =============================================== .. automodule:: sqlobject.inheritance.tests.test_indexes :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_setters.rst0000644000175000017500000000025712760105144022771 0ustar phdphd00000000000000sqlobject.tests.test_setters module =================================== .. 
automodule:: sqlobject.tests.test_setters :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_default_style.rst0000644000175000017500000000030112760105144024132 0ustar phdphd00000000000000sqlobject.tests.test_default_style module ========================================= .. automodule:: sqlobject.tests.test_default_style :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_identity.rst0000644000175000017500000000026212760105144023125 0ustar phdphd00000000000000sqlobject.tests.test_identity module ==================================== .. automodule:: sqlobject.tests.test_identity :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.sqlite.rst0000644000175000017500000000033112760105144020372 0ustar phdphd00000000000000sqlobject.sqlite package ======================== .. automodule:: sqlobject.sqlite :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.sqlite.sqliteconnection SQLObject-3.4.0/docs/api/sqlobject.include.rst0000644000175000017500000000043012760105144020514 0ustar phdphd00000000000000sqlobject.include package ========================= .. automodule:: sqlobject.include :members: :undoc-members: :show-inheritance: Subpackages ----------- .. toctree:: sqlobject.include.tests Submodules ---------- .. toctree:: sqlobject.include.hashcol SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlbuilder.rst0000644000175000017500000000027012760105144023441 0ustar phdphd00000000000000sqlobject.tests.test_sqlbuilder module ====================================== .. automodule:: sqlobject.tests.test_sqlbuilder :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlobject_admin.rst0000644000175000017500000000030712760105144024432 0ustar phdphd00000000000000sqlobject.tests.test_sqlobject_admin module =========================================== .. automodule:: sqlobject.tests.test_sqlobject_admin :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_postgres.rst0000644000175000017500000000026212760105144023142 0ustar phdphd00000000000000sqlobject.tests.test_postgres module ==================================== .. automodule:: sqlobject.tests.test_postgres :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_constraints.rst0000644000175000017500000000027312760105144023645 0ustar phdphd00000000000000sqlobject.tests.test_constraints module ======================================= .. automodule:: sqlobject.tests.test_constraints :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_mysql.rst0000644000175000017500000000025112760105144022437 0ustar phdphd00000000000000sqlobject.tests.test_mysql module ================================= .. automodule:: sqlobject.tests.test_mysql :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_jsoncol.rst0000644000175000017500000000025712760105144022747 0ustar phdphd00000000000000sqlobject.tests.test_jsoncol module =================================== .. automodule:: sqlobject.tests.test_jsoncol :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_auto.rst0000644000175000017500000000024612760105144022246 0ustar phdphd00000000000000sqlobject.tests.test_auto module ================================ .. 
automodule:: sqlobject.tests.test_auto :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlbuilder_dbspecific.rst0000644000175000017500000000033112760105144025612 0ustar phdphd00000000000000sqlobject.tests.test_sqlbuilder_dbspecific module ================================================= .. automodule:: sqlobject.tests.test_sqlbuilder_dbspecific :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_md5.rst0000644000175000017500000000024312760105144021760 0ustar phdphd00000000000000sqlobject.tests.test_md5 module =============================== .. automodule:: sqlobject.tests.test_md5 :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.sresults.rst0000644000175000017500000000022112760105144020753 0ustar phdphd00000000000000sqlobject.sresults module ========================= .. automodule:: sqlobject.sresults :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.util.rst0000644000175000017500000000044612760105144020055 0ustar phdphd00000000000000sqlobject.util package ====================== .. automodule:: sqlobject.util :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.util.csvexport sqlobject.util.csvimport sqlobject.util.moduleloader sqlobject.util.threadinglocal SQLObject-3.4.0/docs/api/sqlobject.mssql.mssqlconnection.rst0000644000175000017500000000027012760105144023450 0ustar phdphd00000000000000sqlobject.mssql.mssqlconnection module ====================================== .. automodule:: sqlobject.mssql.mssqlconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.conftest.rst0000644000175000017500000000022112760105144020714 0ustar phdphd00000000000000sqlobject.conftest module ========================= .. automodule:: sqlobject.conftest :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlite.rst0000644000175000017500000000025412760105144022576 0ustar phdphd00000000000000sqlobject.tests.test_sqlite module ================================== .. automodule:: sqlobject.tests.test_sqlite :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_decimal.rst0000644000175000017500000000025712760105144022676 0ustar phdphd00000000000000sqlobject.tests.test_decimal module =================================== .. automodule:: sqlobject.tests.test_decimal :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_create_drop.rst0000644000175000017500000000027312760105144023565 0ustar phdphd00000000000000sqlobject.tests.test_create_drop module ======================================= .. automodule:: sqlobject.tests.test_create_drop :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.col.rst0000644000175000017500000000020212760105144017643 0ustar phdphd00000000000000sqlobject.col module ==================== .. automodule:: sqlobject.col :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.include.hashcol.rst0000644000175000017500000000024612760105144022141 0ustar phdphd00000000000000sqlobject.include.hashcol module ================================ .. 
automodule:: sqlobject.include.hashcol :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_reparent_sqlmeta.rst0000644000175000017500000000031212760105144024636 0ustar phdphd00000000000000sqlobject.tests.test_reparent_sqlmeta module ============================================ .. automodule:: sqlobject.tests.test_reparent_sqlmeta :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.util.threadinglocal.rst0000644000175000017500000000026212760105144023030 0ustar phdphd00000000000000sqlobject.util.threadinglocal module ==================================== .. automodule:: sqlobject.util.threadinglocal :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.rst0000644000175000017500000000166512760105144017105 0ustar phdphd00000000000000sqlobject package ================= .. automodule:: sqlobject :members: :undoc-members: :show-inheritance: Subpackages ----------- .. toctree:: sqlobject.firebird sqlobject.include sqlobject.inheritance sqlobject.manager sqlobject.maxdb sqlobject.mssql sqlobject.mysql sqlobject.postgres sqlobject.rdbhost sqlobject.sqlite sqlobject.sybase sqlobject.tests sqlobject.util sqlobject.versioning Submodules ---------- .. toctree:: sqlobject.boundattributes sqlobject.cache sqlobject.classregistry sqlobject.col sqlobject.compat sqlobject.conftest sqlobject.constraints sqlobject.converters sqlobject.dbconnection sqlobject.dberrors sqlobject.declarative sqlobject.events sqlobject.index sqlobject.joins sqlobject.main sqlobject.sqlbuilder sqlobject.sresults sqlobject.styles sqlobject.views sqlobject.wsgi_middleware SQLObject-3.4.0/docs/api/sqlobject.inheritance.rst0000644000175000017500000000045612760105144021372 0ustar phdphd00000000000000sqlobject.inheritance package ============================= .. automodule:: sqlobject.inheritance :members: :undoc-members: :show-inheritance: Subpackages ----------- .. toctree:: sqlobject.inheritance.tests Submodules ---------- .. toctree:: sqlobject.inheritance.iteration SQLObject-3.4.0/docs/api/sqlobject.tests.test_enum.rst0000644000175000017500000000024612760105144022242 0ustar phdphd00000000000000sqlobject.tests.test_enum module ================================ .. automodule:: sqlobject.tests.test_enum :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.boundattributes.rst0000644000175000017500000000024612760105144022314 0ustar phdphd00000000000000sqlobject.boundattributes module ================================ .. automodule:: sqlobject.boundattributes :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.rst0000644000175000017500000000113012775621605022534 0ustar phdphd00000000000000sqlobject.inheritance.tests package =================================== .. automodule:: sqlobject.inheritance.tests :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.inheritance.tests.test_aggregates sqlobject.inheritance.tests.test_asdict sqlobject.inheritance.tests.test_deep_inheritance sqlobject.inheritance.tests.test_destroy_cascade sqlobject.inheritance.tests.test_foreignKey sqlobject.inheritance.tests.test_indexes sqlobject.inheritance.tests.test_inheritance sqlobject.inheritance.tests.test_inheritance_tree SQLObject-3.4.0/docs/api/sqlobject.inheritance.iteration.rst0000644000175000017500000000027012760105144023361 0ustar phdphd00000000000000sqlobject.inheritance.iteration module ====================================== .. 
automodule:: sqlobject.inheritance.iteration :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_class_hash.rst0000644000175000017500000000027012760105144023403 0ustar phdphd00000000000000sqlobject.tests.test_class_hash module ====================================== .. automodule:: sqlobject.tests.test_class_hash :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_combining_joins.rst0000644000175000017500000000030712760105144024443 0ustar phdphd00000000000000sqlobject.tests.test_combining_joins module =========================================== .. automodule:: sqlobject.tests.test_combining_joins :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_asdict.rst0000644000175000017500000000025412760105144022544 0ustar phdphd00000000000000sqlobject.tests.test_asdict module ================================== .. automodule:: sqlobject.tests.test_asdict :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.rst0000644000175000017500000000531312760105144020240 0ustar phdphd00000000000000sqlobject.tests package ======================= .. automodule:: sqlobject.tests :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.tests.dbtest sqlobject.tests.test_ForeignKey sqlobject.tests.test_NoneValuedResultItem sqlobject.tests.test_SQLMultipleJoin sqlobject.tests.test_SQLRelatedJoin sqlobject.tests.test_SingleJoin sqlobject.tests.test_aggregates sqlobject.tests.test_aliases sqlobject.tests.test_asdict sqlobject.tests.test_auto sqlobject.tests.test_basic sqlobject.tests.test_blob sqlobject.tests.test_boundattributes sqlobject.tests.test_cache sqlobject.tests.test_class_hash sqlobject.tests.test_columns_order sqlobject.tests.test_combining_joins sqlobject.tests.test_comparison sqlobject.tests.test_complex_sorting sqlobject.tests.test_constraints sqlobject.tests.test_converters sqlobject.tests.test_create_drop sqlobject.tests.test_csvexport sqlobject.tests.test_cyclic_reference sqlobject.tests.test_datetime sqlobject.tests.test_decimal sqlobject.tests.test_declarative sqlobject.tests.test_default_style sqlobject.tests.test_delete sqlobject.tests.test_distinct sqlobject.tests.test_empty sqlobject.tests.test_enum sqlobject.tests.test_events sqlobject.tests.test_exceptions sqlobject.tests.test_expire sqlobject.tests.test_groupBy sqlobject.tests.test_identity sqlobject.tests.test_indexes sqlobject.tests.test_inheritance sqlobject.tests.test_joins sqlobject.tests.test_joins_conditional sqlobject.tests.test_jsonbcol sqlobject.tests.test_jsoncol sqlobject.tests.test_lazy sqlobject.tests.test_md5 sqlobject.tests.test_mysql sqlobject.tests.test_new_joins sqlobject.tests.test_parse_uri sqlobject.tests.test_paste sqlobject.tests.test_perConnection sqlobject.tests.test_pickle sqlobject.tests.test_picklecol sqlobject.tests.test_postgres sqlobject.tests.test_reparent_sqlmeta sqlobject.tests.test_schema sqlobject.tests.test_select sqlobject.tests.test_select_through sqlobject.tests.test_setters sqlobject.tests.test_slice sqlobject.tests.test_sorting sqlobject.tests.test_sqlbuilder sqlobject.tests.test_sqlbuilder_dbspecific sqlobject.tests.test_sqlbuilder_importproxy sqlobject.tests.test_sqlbuilder_joins_instances sqlobject.tests.test_sqlite sqlobject.tests.test_sqlmeta_idName sqlobject.tests.test_sqlobject_admin sqlobject.tests.test_string_id sqlobject.tests.test_style sqlobject.tests.test_subqueries sqlobject.tests.test_transactions 
sqlobject.tests.test_unicode sqlobject.tests.test_uuidcol sqlobject.tests.test_validation sqlobject.tests.test_views SQLObject-3.4.0/docs/api/sqlobject.tests.test_subqueries.rst0000644000175000017500000000027012760105144023462 0ustar phdphd00000000000000sqlobject.tests.test_subqueries module ====================================== .. automodule:: sqlobject.tests.test_subqueries :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_string_id.rst0000644000175000017500000000026512760105144023261 0ustar phdphd00000000000000sqlobject.tests.test_string_id module ===================================== .. automodule:: sqlobject.tests.test_string_id :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_declarative.rst0000644000175000017500000000027312760105144023561 0ustar phdphd00000000000000sqlobject.tests.test_declarative module ======================================= .. automodule:: sqlobject.tests.test_declarative :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_columns_order.rst0000644000175000017500000000030112760105144024141 0ustar phdphd00000000000000sqlobject.tests.test_columns_order module ========================================= .. automodule:: sqlobject.tests.test_columns_order :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_SingleJoin.rst0000644000175000017500000000027012760105144023334 0ustar phdphd00000000000000sqlobject.tests.test_SingleJoin module ====================================== .. automodule:: sqlobject.tests.test_SingleJoin :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.wsgi_middleware.rst0000644000175000017500000000024612760105144022244 0ustar phdphd00000000000000sqlobject.wsgi_middleware module ================================ .. automodule:: sqlobject.wsgi_middleware :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.declarative.rst0000644000175000017500000000023212760105144021354 0ustar phdphd00000000000000sqlobject.declarative module ============================ .. automodule:: sqlobject.declarative :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_basic.rst0000644000175000017500000000025112760105144022353 0ustar phdphd00000000000000sqlobject.tests.test_basic module ================================= .. automodule:: sqlobject.tests.test_basic :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_SQLRelatedJoin.rst0000644000175000017500000000030412760105144024051 0ustar phdphd00000000000000sqlobject.tests.test_SQLRelatedJoin module ========================================== .. automodule:: sqlobject.tests.test_SQLRelatedJoin :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_transactions.rst0000644000175000017500000000027612760105144024011 0ustar phdphd00000000000000sqlobject.tests.test_transactions module ======================================== .. automodule:: sqlobject.tests.test_transactions :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_datetime.rst0000644000175000017500000000026212760105144023070 0ustar phdphd00000000000000sqlobject.tests.test_datetime module ==================================== .. 
automodule:: sqlobject.tests.test_datetime :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.firebird.firebirdconnection.rst0000644000175000017500000000031212760105144024523 0ustar phdphd00000000000000sqlobject.firebird.firebirdconnection module ============================================ .. automodule:: sqlobject.firebird.firebirdconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.events.rst0000644000175000017500000000021312760105144020374 0ustar phdphd00000000000000sqlobject.events module ======================= .. automodule:: sqlobject.events :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_aggregates.rst0000644000175000017500000000027012760105144023404 0ustar phdphd00000000000000sqlobject.tests.test_aggregates module ====================================== .. automodule:: sqlobject.tests.test_aggregates :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_style.rst0000644000175000017500000000025112760105144022432 0ustar phdphd00000000000000sqlobject.tests.test_style module ================================= .. automodule:: sqlobject.tests.test_style :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_exceptions.rst0000644000175000017500000000027012760105144023454 0ustar phdphd00000000000000sqlobject.tests.test_exceptions module ====================================== .. automodule:: sqlobject.tests.test_exceptions :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.main.rst0000644000175000017500000000020512760105144020015 0ustar phdphd00000000000000sqlobject.main module ===================== .. automodule:: sqlobject.main :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_events.rst0000644000175000017500000000025412760105144022601 0ustar phdphd00000000000000sqlobject.tests.test_events module ================================== .. automodule:: sqlobject.tests.test_events :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_indexes.rst0000644000175000017500000000025712760105144022737 0ustar phdphd00000000000000sqlobject.tests.test_indexes module =================================== .. automodule:: sqlobject.tests.test_indexes :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_parse_uri.rst0000644000175000017500000000026512760105144023270 0ustar phdphd00000000000000sqlobject.tests.test_parse_uri module ===================================== .. automodule:: sqlobject.tests.test_parse_uri :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_empty.rst0000644000175000017500000000025112760105144022430 0ustar phdphd00000000000000sqlobject.tests.test_empty module ================================= .. automodule:: sqlobject.tests.test_empty :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.styles.rst0000644000175000017500000000021312760105144020413 0ustar phdphd00000000000000sqlobject.styles module ======================= .. automodule:: sqlobject.styles :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_jsonbcol.rst0000644000175000017500000000026212760105144023105 0ustar phdphd00000000000000sqlobject.tests.test_jsonbcol module ==================================== .. 
automodule:: sqlobject.tests.test_jsonbcol :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_distinct.rst0000644000175000017500000000026212760105144023115 0ustar phdphd00000000000000sqlobject.tests.test_distinct module ==================================== .. automodule:: sqlobject.tests.test_distinct :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_validation.rst0000644000175000017500000000027012760105144023425 0ustar phdphd00000000000000sqlobject.tests.test_validation module ====================================== .. automodule:: sqlobject.tests.test_validation :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_select_through.rst0000644000175000017500000000030412760105144024310 0ustar phdphd00000000000000sqlobject.tests.test_select_through module ========================================== .. automodule:: sqlobject.tests.test_select_through :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.inheritance.tests.test_deep_inheritance.rst0000644000175000017500000000035612760105144027056 0ustar phdphd00000000000000sqlobject.inheritance.tests.test_deep_inheritance module ======================================================== .. automodule:: sqlobject.inheritance.tests.test_deep_inheritance :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.views.rst0000644000175000017500000000021012760105144020222 0ustar phdphd00000000000000sqlobject.views module ====================== .. automodule:: sqlobject.views :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_pickle.rst0000644000175000017500000000025412760105144022544 0ustar phdphd00000000000000sqlobject.tests.test_pickle module ================================== .. automodule:: sqlobject.tests.test_pickle :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_uuidcol.rst0000644000175000017500000000025712760105144022744 0ustar phdphd00000000000000sqlobject.tests.test_uuidcol module =================================== .. automodule:: sqlobject.tests.test_uuidcol :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.dberrors.rst0000644000175000017500000000022112760105144020711 0ustar phdphd00000000000000sqlobject.dberrors module ========================= .. automodule:: sqlobject.dberrors :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_joins.rst0000644000175000017500000000025112760105144022414 0ustar phdphd00000000000000sqlobject.tests.test_joins module ================================= .. automodule:: sqlobject.tests.test_joins :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_select.rst0000644000175000017500000000025412760105144022554 0ustar phdphd00000000000000sqlobject.tests.test_select module ================================== .. automodule:: sqlobject.tests.test_select :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.converters.rst0000644000175000017500000000022712760105144021267 0ustar phdphd00000000000000sqlobject.converters module =========================== .. automodule:: sqlobject.converters :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_cache.rst0000644000175000017500000000025112760105144022335 0ustar phdphd00000000000000sqlobject.tests.test_cache module ================================= .. 
automodule:: sqlobject.tests.test_cache :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_sqlbuilder_joins_instances.rst0000644000175000017500000000035012760105144026711 0ustar phdphd00000000000000sqlobject.tests.test_sqlbuilder_joins_instances module ====================================================== .. automodule:: sqlobject.tests.test_sqlbuilder_joins_instances :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.mysql.mysqlconnection.rst0000644000175000017500000000027012760105144023464 0ustar phdphd00000000000000sqlobject.mysql.mysqlconnection module ====================================== .. automodule:: sqlobject.mysql.mysqlconnection :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_csvexport.rst0000644000175000017500000000026512760105144023334 0ustar phdphd00000000000000sqlobject.tests.test_csvexport module ===================================== .. automodule:: sqlobject.tests.test_csvexport :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.postgres.rst0000644000175000017500000000033512760105144020743 0ustar phdphd00000000000000sqlobject.postgres package ========================== .. automodule:: sqlobject.postgres :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.postgres.pgconnection SQLObject-3.4.0/docs/api/sqlobject.mssql.rst0000644000175000017500000000032412760105144020232 0ustar phdphd00000000000000sqlobject.mssql package ======================= .. automodule:: sqlobject.mssql :members: :undoc-members: :show-inheritance: Submodules ---------- .. toctree:: sqlobject.mssql.mssqlconnection SQLObject-3.4.0/docs/api/sqlobject.tests.test_comparison.rst0000644000175000017500000000027012760105144023445 0ustar phdphd00000000000000sqlobject.tests.test_comparison module ====================================== .. automodule:: sqlobject.tests.test_comparison :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.joins.rst0000644000175000017500000000021012760105144020207 0ustar phdphd00000000000000sqlobject.joins module ====================== .. automodule:: sqlobject.joins :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_joins_conditional.rst0000644000175000017500000000031512760105144025000 0ustar phdphd00000000000000sqlobject.tests.test_joins_conditional module ============================================= .. automodule:: sqlobject.tests.test_joins_conditional :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/api/sqlobject.tests.test_expire.rst0000644000175000017500000000025412760105144022571 0ustar phdphd00000000000000sqlobject.tests.test_expire module ================================== .. 
automodule:: sqlobject.tests.test_expire :members: :undoc-members: :show-inheritance: SQLObject-3.4.0/docs/europython/0000755000175000017500000000000013141371614016021 5ustar phdphd00000000000000SQLObject-3.4.0/docs/europython/main.css0000644000175000017500000000103710707101542017454 0ustar phdphd00000000000000BODY { margin-left: 1em; margin-right: 1em; border: 0px; padding: 0px; font-family: helvetica, ariel; color: #445566; } H1, H2 { font-family: helvetica; } PRE { padding: 5px; font-family: Courier Bold, Courier; border: thin black solid; background-color: #f5f5f5; font-size: medium; } HR { display: inline; } UL { padding: 0px; margin: 0px; margin-left: 1em; padding-left: 1em; border-left: 1em; } LI { padding: 2px; } .footer { position: absolute; bottom: 0.5em; text-align: right; } SQLObject-3.4.0/docs/europython/europython_sqlobj.py0000644000175000017500000002604510707101542022164 0ustar phdphd00000000000000#!/usr/bin/python2.2 from slides import Lecture, NumSlide, Slide, Bullet, SubBullet, PRE, URL lecture = Lecture( "High-Level Database Interaction with SQLObject", Slide("What is SQLObject?", Bullet("YAORM -- Yet Another Object-Relational Mapper"), Bullet("Allows you to map your Python classes to your database schema"), Bullet("Takes the 'SQL' out of 'Database Programming'"), Bullet("No special XML files to create, just normal Python classes")), Slide("Why Do I Care?", Bullet("Storing things in databases is fairly common in day-to-day programming"), Bullet("SQL is the standard language used to manipulate data in a database"), Bullet("Writing SQL is boring, repetitive and depressing"), Bullet("SQLObject relieves you of the burden of writing SQL"), Bullet("...but still lets you write SQL when you need to")), Slide("How does SQLObject differ from other ORM's?", Bullet("Simple is better than complex; SQLObject is very simple to use"), Bullet("Mappings are defined using normal Python classes"), Bullet("Uses properties instead of wordy get/set methods for column attributes"), Bullet("No awkward auto-generation of Python classes/files from an external format"), Bullet("Can generate the database schema directly from your Python classes, and vice versa"), Bullet("Overloading magic provides convenient high-level SQL conditionals (dot q magic, stay tuned)")), Slide("A Simple Example", Bullet("To create a table that looks like this in SQL:", PRE("""\ CREATE TABLE person ( id SERIAL, first_name VARCHAR(100) NOT NULL, middle_initial CHAR(1), last_name VARCHAR(150) NOT NULL );""")), Bullet("You would write this Python code:", PRE("""\ class Person(SQLObject): firstName = StringCol(length = 100) middleInitial = StringCol(length = 1) lastName = StringCol(length = 150)""")), Bullet("No, you're not dreaming :P")), Slide("Declaring the Class", PRE("""\ from SQLObject import * conn = PostgresConnection(db = 'testdb', user = 'testuser', password = 'testpass') class Person(SQLObject): _connection = conn firstName = StringCol(length = 100) middleInitial = StringCol(length = 1) lastName = StringCol(length = 150)"""), Bullet("Use one of MySQLConnection, PostgresConnection, SQLiteConnection or DBMConnection as your _connection"), Bullet("Use StudlyCaps for your classes and mixedCase for your columns"), Bullet("SQLObject will map TableName to table_name and columnName to column_name"), Bullet("In the above example, class Person becomes table person with columns first_name, middle_initial and last_name")), Slide("Creating and Dropping Tables", Bullet("Use the createTable class method to create the table, and 
two optional keyword arguments",SubBullet( Bullet("ifNotExists: only try to create the table if it doesn't exist"), Bullet("createJoinTables: will create the intermediate tables for many-to-many relationships"))), Bullet("Conversely, use dropTable, passing in optional ifExists and dropJoinTables keyword arguments")), Slide("More on Column Syntax", Bullet("Columns are implemented as properties"), Bullet("Columns supported: StringCol, IntCol, FloatCol, EnumCol, DateTimeCol, ForeignKey, DecimalCol, CurrencyCol"), Bullet("The first argument is the column name"), Bullet("Keyword arguments specify additional information (e.g. notNone, default, length, etc)"), Bullet("SQLObject lets the database do type checking and coercion"), Bullet("An id column is implicitly created")), Slide("Keyword Arguments for Col Classes", Bullet("dbName: the column name in the database"), Bullet("default: the default value (can be a callable that returns a value)"), Bullet("alternateID: set this to True if you want a byColumnName method to lookup rows based on this column"), Bullet("unique: declare this column as UNIQUE in the database"), Bullet("notNone: when True, column cannot be None/NULL"), Bullet("sqlType: to specify the column type manually")), Slide("Special Class Attributes", Bullet("_connection: the database connection for this class"), Bullet("_table: the database name of the table behind this class"), Bullet("_joins: a list of join relationships to other classes"), Bullet("_cacheValues: you'll want to set this false if you're using SQLObject classes from multiple processes"), Bullet("_idName: the name of the PK (defaults to id)"), Bullet("_style: a style object that provides a custom Python to DB name mapping algorithm")), Slide("Using SQLObject classes", Bullet("Create a new row with the .new() class method:", PRE("""Person.new(firstName = "Brad", middleInitial = "E", lastName = "Bollenbach")""")), Bullet("The row is inserted into the database as soon as you call .new()"), Bullet("Access an existing row by passing an ID to the class constructor", PRE("""\ >>> me = Person(1) >>> me.firstName 'Brad' >>> me.lastName 'Bollenbach'""")), Bullet("Modify column values by modifying property values"), Bullet("Changes to your object's properties are updated immediately in the database"), Bullet("...but transactions are there if you need them (as long as the database supports them)")), Slide("Relating Your Classes with Joins", Bullet("Use a ForeignKey to point a column's value at an instance of another class"), Bullet("To relate the PK class back to the FK class, use MultipleJoin"), Bullet("Let's give a Person some PhoneNumbers:", PRE("""\ from SQLObject import * conn = PostgresConnection(db = 'testdb', user = 'testuser', password = 'testpass') class Person(SQLObject): _connection = conn firstName = StringCol(length = 100) middleInitial = StringCol(length = 1) lastName = StringCol(length = 150) phoneNumbers = MultipleJoin("PhoneNumber") class PhoneNumber(SQLObject): _connection = conn person = ForeignKey('Person') phoneNumber = StringCol(length = 10)"""))), Slide("Many-to-Many Relationships", Bullet("A Person might have many Roles"), Bullet("A Role might be associated to more than one Person"), Bullet("Use a RelatedJoin to specify this many-to-many relation:", PRE("""\ class Role(SQLObject): _connection = conn name = StringCol(length = 20) people = RelatedJoin('Person') class Person(SQLObject): _connection = conn firstName = StringCol(length = 100) middleInitial = StringCol(length = 1) lastName = StringCol(length = 
150) phoneNumbers = MultipleJoin("PhoneNumber") roles = RelatedJoin('Role') me = Person.new(firstName = "Brad", middleInitial = "E", lastName = "Bollenbach") pg = Role.new(name = "Python Geek") me.addRole(pg) """)), Bullet("SQLObject added .addRole() and .removeRole() methods to Person, and .addPerson() and .removePerson() methods to Role"), Bullet("...and created the person_role table by combining the table names of the classes alphabetically")), Slide("Selecting Multiple Objects", Bullet("The class's .select() method can be used to return multiple objects of a given class"), Bullet("Here comes that dot q magic!"), Bullet("Every SQLObject derived class has a special .q attribute, which uses some funky operator overloading to construct queries using Python:", PRE("""\ >>> from person import Person >>> brads = Person.select(Person.q.firstName == 'Brad') >>> list(brads) [] >>> brads[0].lastName 'Bollenbach' """)), Bullet("select accepts orderBy and groupBy arguments, which are strings (or lists of strings) referring to the database column name")), Slide("Selects using AND", Bullet("Use the AND function for more specific queries:", PRE("""\ >>> from SQLObject import AND >>> from person import Person >>> bradbs = Person.select(AND( ... Person.q.firstName == "Brad", ... Person.q.lastName.startswith("B") ... )) """)), Bullet("Notice that columns-as-properties behave much like their regular Python brethren (e.g. the startswith method used above)")), Slide("Customizing Column Behaviour", Bullet("SQLObject uses metaclasses to generate the classes that map to your database tables"), Bullet("Because of this, you need to take care when overriding property get/set methods"), Bullet("You cannot simply override the get or set method and then call the parent's get/set"), Bullet("Your class was created on-the-fly by SQLObject, so there is no parent! :)"), Bullet("SQLObject gives you special _SO_set_foo and _SO_get_foo methods to call after you've done something special in the normal setter/getter"), Bullet("An (admittedly contrived) example:", PRE("""\ class Person(SQLObject): ... 
def _set_lastName(self, value): print "you changed the lastName" self._SO_set_lastName(value) """))), Slide("Better Living Through Transactions", Bullet("Sometimes you need indivisibility"), Bullet("SQLObject lets you use transactions when the database supports it", PRE("""\ trans = Person._connection.transaction() me = Person(1, trans) me.firstName = "Bradley" trans.commit() me.firstName = "Yada" trans.rollback()"""))), Slide("Automatic Class Generation", Bullet("Laziness is a virtue"), Bullet("If you've already got a database schema, why not make SQLObject create the classes for you?", PRE("""\ class Person(SQLObject): _fromDatabase = True""")), Bullet("You can still specify additional columns, as normal"), Bullet("Currently this only works with MySQL")), Slide("SQLObject in the Wild -- Who's Using It?", Bullet("BBnet.ca used SQLObject to develop a time tracking and invoicing system for a consultancy firm"), Bullet("XSOLI (my current employer) is using SQLObject to drive an online merchant proxy system"), Bullet("Future projects will probably include porting a sales application for a fixed-rate long distance provider onto the SQLObject framework"), Bullet("There's an active community of SQLObject developers on the mailing list")), Slide("Future Plans for SQLObject", Bullet("Add support for more databases"), Bullet("Improved transaction handling, so that even non-transaction aware backends can support them"), Bullet("Hooks into a validation and form generation package"), Bullet("More powerful joins")), Slide("Summing Up", Bullet("SQLObject is a framework that can make your (programming) life easier"), Bullet("It provides a high-level Python interface to things you'd normally suffer through in SQL"), Bullet("It offers advanced features like automatic class generation, dot q magic, and various ways of relating your classes"), Bullet("There's an active developer base, so you're sure to find help when you need it")), Slide("Resources", Bullet("SQLObject homepage -- ", URL("http://www.sqlobject.org")), Bullet("Join the mailing list -- ", URL("http://lists.sourceforge.net/lists/listinfo/sqlobject-discuss")), Bullet("Contact me -- brad@bbnet.ca")), Slide("Questions?", Bullet("Questions?"), Bullet("Comments?"), Bullet("Free beer?"))) lecture.renderHTML(".", "slide-%02d.html", css="main.css") SQLObject-3.4.0/docs/europython/person.py0000644000175000017500000000121110707101542017670 0ustar phdphd00000000000000from SQLObject import * conn = PostgresConnection(db = 'merchant_test', user = 'merchant_test', password = 'mtest') class Role(SQLObject): _connection = conn name = StringCol(length = 20) people = RelatedJoin('Person') class Person(SQLObject): _cacheValues = False _connection = conn firstName = StringCol(length = 100) middleInitial = StringCol(length = 1) lastName = StringCol(length = 150) phoneNumbers = MultipleJoin("PhoneNumber") roles = RelatedJoin('Role') class PhoneNumber(SQLObject): _connection = conn person = ForeignKey('Person') phoneNumber = StringCol(length = 10) SQLObject-3.4.0/docs/Inheritance.rst0000644000175000017500000002602512753243417016604 0ustar phdphd00000000000000:Author: Daniel Savard, XSOLI Inc. .. contents:: Inheritance ----------- Why ~~~ Imagine you have a list of persons, and every person plays a certain role. Some persons are students, some are professors, some are employees. Every role has different attributes. Students are known by their department and year. 
Professors have a department (some attributes are common for all or some roles), timetable and other attributes. How does one implement this in SQLObject? Well, the obvious approach is to create a table Person with a column that describes or name the role, and a table for an every role. Then one must write code that interprets and dereferences the role column. Well, the inheritance machinery described below does exactly this! Only it does it automagically and mostly transparent to the user. First, you create a table Person. Nothing magical here:: class Person(SQLObject): name = StringCol() age = FloatCol() Now you need a hierarchy of roles:: class Role(InheritableSQLObject): department = StringCol() The magic starts here! You inherit the class from the special root class ``InheritableSQLObject`` and provide a set of attributes common for all roles. Other roles must be inherited from Role:: class Student(Role): year = IntCol() class Professor(Role): timetable = StringCol() Now you want a column in Person that can be interpreted as the role. Easy:: class Person(SQLObject): name = StringCol() age = FloatCol() role = ForeignKey("Role") That's all, really! When asked for its role, Person returns the value of its .role attribute dereferenced and interpreted. Instead of returning an instance of class Role it returns an instance of the corresponding subclass - a Student or a Professor. This is a brief explanation based on a task people meet most often, but of course it can be used far beyond the person/role task. I also omitted all details in the explanation. Now look at the real working program:: from sqlobject import * from sqlobject.inheritance import InheritableSQLObject __connection__ = "sqlite:/:memory:" class Role(InheritableSQLObject): department = StringCol() class Student(Role): year = IntCol() class Professor(Role): timetable = StringCol(default=None) class Person(SQLObject): name = StringCol() age = FloatCol() role = ForeignKey("Role", default=None) Role.createTable() Student.createTable() Professor.createTable() Person.createTable() first_year = Student(department="CS", year=1) lecturer = Professor(department="Mathematics") student = Person(name="A student", age=21, role=first_year) professor = Person(name="A professor", age=42, role=lecturer) print student.role print professor.role It prints:: You can get the list of all available roles:: print list(Role.select()) It prints:: [, ] Look - you have gotten a list of Role's subclasses! 
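Selection works at any level of the hierarchy: selecting through a subclass returns only the rows of that subclass, while fetching a row by id through the parent class still returns an instance of the most derived subclass.  A minimal sketch, assuming the classes and the rows created in the working program above::

    print list(Student.select())       # only the Student row comes back
    print list(Professor.select())     # only the Professor row comes back

    # Fetching through the parent class yields the child instance
    same_row = Role.get(first_year.id)
    print same_row.__class__.__name__  # expected to print 'Student'

Whichever class you use to look a row up, you end up with the most specific object available.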
If you add a MultipleJoin column to Role, you can list all persons for a given role:: class Role(InheritableSQLObject): department = StringCol() persons = MultipleJoin("Person") for role in Role.select(): print role.persons It prints:: [] [] If you you want your persons to have many roles, use RelatedJoin:: class Role(InheritableSQLObject): department = StringCol() persons = RelatedJoin("Person") class Student(Role): year = IntCol() class Professor(Role): timetable = StringCol(default=None) class Person(SQLObject): name = StringCol() age = FloatCol() roles = RelatedJoin("Role") Role.createTable() Student.createTable() Professor.createTable() Person.createTable() first_year = Student(department="CS", year=1) lecturer = Professor(department="Mathematics") student = Person(name="A student", age=21) student.addRole(first_year) professor = Person(name="A professor", age=42) professor.addRole(lecturer) print student.roles print professor.roles for role in Role.select(): print role.persons It prints:: [] [] [] [] Who, What and How ~~~~~~~~~~~~~~~~~ Daniel Savard has implemented inheritance for SQLObject. In `terms of ORM`_ this is a kind of vertical inheritance. The only difference is that objects reference their leaves, not parents. Links to parents are reconstructed at run-time using the hierarchy of Python classes. .. _`terms of ORM`: https://cayenne.apache.org/docs/3.0/modeling-inheritance.html * As suggested by Ian Bicking, each child class now has the same ID as the parent class. No more need for childID column and parent foreignKey (and a small speed boost). * No more need to call getSubClass, as the 'latest' child will always be returned when an instance of a class is created. * This version now seems to work correctly with addColumn, delColumn, addJoin and delJoin. The following code:: from sqlobject.inheritance import InheritableSQLObject class Person(InheritableSQLObject): firstName = StringCol() lastName = StringCol() class Employee(Person): _inheritable = False position = StringCol() will generate the following tables:: CREATE TABLE person ( id INT PRIMARY KEY, child_name TEXT, first_name TEXT, last_name TEXT ); CREATE TABLE employee ( id INT PRIMARY KEY, position TEXT ) A new class attribute ``_inheritable`` is added. When this new attribute is set to 1, the class is marked 'inheritable' and a new columns will automatically be added: childName (TEXT). Each class that inherits from a parent class will get the same ID as the parent class. So, there is no need to keep track of parent ID and child ID, as they are the same. The column childName will contain the name of the child class (for example 'Employee'). This will permit a class to always return its child class if available (a person that is also an employee will always return an instance of the employee class). For example, the following code:: p = Person(firstName='John', lastName='Doe') e = Employee(firstName='Jane', lastName='Doe', position='Chief') p2 = Person.get(1) Will create the following data in the database:: *Person* id child_name first_name last_name 0 Null John Doe 1 Employee Jane Doe *Employee* id position 1 Chief You will still be able to ask for the attribute normally: e.firstName will return Jane and setting it will write the new value in the person table. If you use p2, as p2 is a person object, you will get an employee object. person(0) will return a Person instance and will have the following attributes: firstName and lastName. 
person(1) or employee(1) will both return the same Employee instance and will have the following attributes: firstName, lastName and position. Also, deleting a person or an employee that is linked will destroy both entries as one would expect. The SQLObject q magic also works. Using these selects is valid:: Employee.select(AND(Employee.q.firstName == 'Jane', Employee.q.position == 'Chief')) will return Jane Doe Employee.select(AND(Person.q.firstName == 'Jane', Employee.q.position == 'Chief')) will return Jane Doe Employee.select(Employee.q.lastName == 'Doe') will only return Jane Doe (as Joe isn't an employee) Person.select(Person.q.lastName == 'Doe') will return both entries. The SQL 'where' clause will contain additional clauses when used with 'inherited' classes. These clauses are the link between the id and the parent id. This will look like the following request:: SELECT employee.id, person.first_name, person.last_name FROM person, employee WHERE person.first_name = 'Jane' AND employee.position = 'Chief' AND person.id = employee.id Limitations and notes ~~~~~~~~~~~~~~~~~~~~~ * Only single inheritance will work. It is not possible to inherit from multiple SQLObject classes. * It is possible to inherit from an inherited class and this will work well. In the above example, you can have a Chief class that inherits from Employee and all parents attributes will be available through the Chief class. * You may not redefine columns in an inherited class (this will raise an exception). * If you don't want 'childName' columns in your last class (one that will never be inherited), you must set '_inheritable' to False in this class. * The inheritance implementation is incompatible with lazy updates. Do not set lazyUpdate to True. If you need this, you have to patch SQLObject and override many methods - _SO_setValue(), sync(), syncUpdate() at least. Patches will be gladly accepted. * You'd better restrain yourself to simple use cases. The inheritance implementation is easily choked on more complex cases. * A join between tables inherited from the same parent produces incorrect result due to joins to the same parent table (they must use different aliases). * Inheritance works in two stages - first it draws the IDs from the parent table and then it draws the rows from the children tables. The first stage could fail if you try to do complex things. For example, Children.select(orderBy=Children.q.column, distinct=True) could fail because at the first stage inheritance generates a SELECT query for the parent table with ORDER BY the column from the children table. * I made it because I needed to be able to have automatic inheritance with linked tables. * This version works for me; it may not work for you. I tried to do my best but it is possible that I broke some things... So, there is no warranty that this version will work. * Thanks to Ian Bicking for SQLObject; this is a wonderful python module. * Although all the attributes are inherited, the same does not apply to sqlmeta data. Don't try to get a parent column via the sqlmeta.columns dictionary of an inherited InheritableSQLObject: it will raise a KeyError. The same applies to joins: the sqlmeta.joins list will be empty in an inherited InheritableSQLObject if a join has been defined in the parent class, even though the join method will work correctly. * If you have suggestion, bugs, or patch to this patch, you can contact the SQLObject team: .. 
image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/index.rst0000644000175000017500000000367113111120107015437 0ustar phdphd00000000000000+++++++++ SQLObject +++++++++ SQLObject is a popular *Object Relational Manager* for providing an object interface to your database, with tables as classes, rows as instances, and columns as attributes. SQLObject includes a Python-object-based query language that makes SQL more abstract, and provides substantial database independence for applications. Documentation ============= .. toctree:: :maxdepth: 1 download community links News Python3 SQLObject FAQ SQLBuilder SelectResults sqlobject-admin Inheritance Versioning Views DeveloperGuide Authors TODO Example ======= This is just a snippet that creates a simple class that wraps a table:: >>> from sqlobject import * >>> >>> sqlhub.processConnection = connectionForURI('sqlite:/:memory:') >>> >>> class Person(SQLObject): ... fname = StringCol() ... mi = StringCol(length=1, default=None) ... lname = StringCol() ... >>> Person.createTable() SQLObject supports most database schemas that you already have, and can also issue the ``CREATE`` statement for you (seen in ``Person.createTable()``). Here's how you'd use the object:: >>> p = Person(fname="John", lname="Doe") >>> p >>> p.fname 'John' >>> p.mi = 'Q' >>> p2 = Person.get(1) >>> p2 >>> p is p2 True Queries:: >>> p3 = Person.selectBy(lname="Doe")[0] >>> p3 >>> pc = Person.select(Person.q.lname=="Doe").count() >>> pc 1 Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/sqlobject-admin.rst0000644000175000017500000001642712752476767017453 0ustar phdphd00000000000000++++++++++++++++++++++++ The sqlobject-admin Tool ++++++++++++++++++++++++ :author: Ian Bicking :revision: $Rev$ :date: $LastChangedDate$ .. contents:: .. warning:: This document isn't entirely accurate; some of what it describes are the intended features of the tool, not the actual features. Particularly inaccurate is how modules and classes are found. Introduction ------------ The ``sqlobject-admin`` tool included with SQLObject allows you to manage your database as defined with SQLObject classes. Some of the features include creating tables, checking the status of the database, recording a version of a schema, and updating the database to match the version of the schema in your code. To see a list of commands run ``sqlobject-admin help``. Each sub-command has ``-h`` option which explains the details of that command. Common Options ============== Many of the commands share some common options, mostly for finding the database and classes. ``-c CONNECTION`` or ``--connection=CONNECTION``: This takes an argument, the connection string for the database. This overrides any connection the classes have (if they are hardwired to a connection). ``-f FILENAME`` or ``--config-file=FILENAME``: This is a configuration file from which to get the connection. 
This configuration file should be a Python-syntax file that defines a global variable ``database``, which is the connection string for the database. ``-m MODULE`` or ``--module=MODULE``: A module to look in for classes. ``MODULE`` is something like ``myapp.amodule``. Remember to set your ``$PYTHONPATH`` if the module can't be imported. You can provide this argument multiple times. ``-p PACKAGE`` or ``--package=PACKAGE``: A package to look in. This looks in all the modules in this class and subclasses for SQLObject classes. ``--class=CLASSMATCH``: This *restricts* the classes found to the matching classes. You may use wildcards. You can provide multiple ``--class`` arguments, and if any pattern matches the class will be included. ``--egg=EGG_SPEC``: This is an `Egg `_ description that should be loaded. So if you give ``--egg=ProjectName`` it'll load that egg, and look in ``ProjectName.egg-info/sqlobject.txt`` for some settings (like ``db_module`` and ``history_dir``). When finding SQLObject classes, we look in the modules for classes that belong to the module -- so if you import a class from another module it won't be "matched". You have to indicate its original module. If classes have to be handled in a specific order, create a ``soClasses`` global variable that holds a list of the classes. This overrides the module restrictions. This is important in databases with referential integrity, where dependent tables can't be created before the tables they depend on. Simple Commands =============== The ``create`` Command ---------------------- This finds the tables and creates them. Any tables that exist are simply skipped. It also collects data from sqlmeta.createSQL (added in svn trunk) and runs the queries after table creation. createSQL can be a string with a single SQL command, a list of SQL commands, or a dictionary with keys that are dbNames and values that are either single SQL command string or a list of SQL commands. An example follows:: class MyTable(SQLObject): class sqlmeta: createSQL = {'postgres': [ "ALTER TABLE my_table ADD CHECK(my_field != '');", ]} myField = StringCol() The ``sql`` Command ------------------- This shows the SQL to create all the tables. The ``drop`` Command -------------------- Drops tables! Missing tables are skipped. The ``execute`` Command ----------------------- This executes an arbitrary SQL expression. This is mostly useful if you want to run a query against a database described by a SQLObject connection string. Use ``--stdin`` if you want to pipe commands in; otherwise you give the commands as arguments. The ``list`` Command -------------------- Lists out all the classes found. This can help you figure out what classes you are dealing with, and if there's any missing that you expected. The ``status`` Command ---------------------- This shows if tables are present in the database. If possible (it depends on the database) it will also show if the tables are missing any columns, or have any extra columns, when compared to the table the SQLObject class describes. It doesn't check column types, indexes, or constraints. This feature may be added in the future. Versioning & Upgrading ====================== There's two commands related to storing the schema and upgrading the database: ``record`` and ``upgrade``. The idea is that you record each iteration of your schema, and this gets a version number. Something like ``2003-05-04a``. If you are using source control you'll check all versions into your repository; you don't overwrite one with the next. 
In addition to the on-disk record of the different schemas you have gone through, the database itself contains a record of what version it is at. By having all the versions available at once, we can upgrade from any version. But more on that `later `_ Basic Usage ----------- Here's a quick summary of how you use these commands: 1. In project where you've never used ``sqlobject-admin`` before, you run ``sqlobject-admin record --output-dir=sqlobject-history``. If your active database is up-to-date with the code, then the tool will add a ``sqlobject_db_version`` table to the database with the current version. 2. Now, make some updates to your code. Don't update the database! (You could, but for now it's more fun if you don't.) 3. Run ``sqlobject-admin record --edit``. A new version will be created, and an editor will be opened up. The ``record`` Command ---------------------- Record will take the SQL ``CREATE`` statements for your tables, and output them in new version. It creates the version by using the ISO-formatted date (YYYY-MM-DD) and a suffix to make it unique. It puts each table in its own file. This normally doesn't touch the database at all -- it only records the schema as defined in your code, regardless of the database. In fact, I recommend calling ``record`` *before* you update your database. The ``upgrade`` Command ----------------------- Future ====== * Get ``record`` to do ``svn cp`` when creating a new version, then write over those files; this way the version control system will have nice diffs. * An option to ``record`` the SQL for multiple database backends at once (now only the active backend is recorded). * An option to upgrade databases with Python scripts instead of SQL commands. Or a little of both. * Review all the verbosity, maybe add logging, review simulation. * Generate simple ``ALTER`` statements for upgrade scripts, to give people something to work with. Maybe. * A command to trim versions, by merging upgrade scripts. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/docs/download.rst0000644000175000017500000000343113044134753016152 0ustar phdphd00000000000000Download SQLObject ++++++++++++++++++ The latest releases are always available on the `Python Package Index `_, and is installable with `pip `_ or `easy_install `_. You can install the latest release with:: pip install -U SQLObject or:: easy_install -U SQLObject You can install the latest version of SQLObject with:: easy_install SQLObject==dev You can install the latest bug fixing branch with:: easy_install SQLObject==bugfix If you want to require a specific revision (because, for instance, you need a bugfix that hasn't appeared in a release), you can put this in your `setuptools `_ using ``setup.py`` file:: setup(... install_requires=["SQLObject==bugfix,>=0.7.1dev-r1485"], ) This says that you *need* revision 1485 or higher. But it also says that you can aquire the "bugfix" version to try to get that. In fact, when you install ``SQLObject==bugfix`` you will be installing a specific version, and "bugfix" is just a kind of label for a way of acquiring the version (it points to a branch in the repository). 
Repositories ------------ The SQLObject `git `_ repositories are located at https://github.com/sqlobject and https://sourceforge.net/p/sqlobject/_list/git Before switching to git development was performed at the Subversion repository that is no longer available. .. image:: https://sourceforge.net/sflogo.php?group_id=74338&type=10 :target: https://sourceforge.net/projects/sqlobject :class: noborder :align: center :height: 15 :width: 80 :alt: Get SQLObject at SourceForge.net. Fast, secure and Free Open Source software downloads SQLObject-3.4.0/LICENSE0000644000175000017500000006364212775263455013714 0ustar phdphd00000000000000 GNU LESSER GENERAL PUBLIC LICENSE Version 2.1, February 1999 Copyright (C) 1991, 1999 Free Software Foundation, Inc. 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. [This is the first released version of the Lesser GPL. It also counts as the successor of the GNU Library Public License, version 2, hence the version number 2.1.] Preamble The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public Licenses are intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This license, the Lesser General Public License, applies to some specially designated software packages--typically libraries--of the Free Software Foundation and other authors who decide to use it. You can use it too, but we suggest you first think carefully about whether this license or the ordinary General Public License is the better strategy to use in any particular case, based on the explanations below. When we speak of free software, we are referring to freedom of use, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish); that you receive source code or can get it if you want it; that you can change the software and use pieces of it in new free programs; and that you are informed that you can do these things. To protect your rights, we need to make restrictions that forbid distributors to deny you these rights or to ask you to surrender these rights. These restrictions translate to certain responsibilities for you if you distribute copies of the library or if you modify it. For example, if you distribute copies of the library, whether gratis or for a fee, you must give the recipients all the rights that we gave you. You must make sure that they, too, receive or can get the source code. If you link other code with the library, you must provide complete object files to the recipients, so that they can relink them with the library after making changes to the library and recompiling it. And you must show them these terms so they know their rights. We protect your rights with a two-step method: (1) we copyright the library, and (2) we offer you this license, which gives you legal permission to copy, distribute and/or modify the library. To protect each distributor, we want to make it very clear that there is no warranty for the free library. Also, if the library is modified by someone else and passed on, the recipients should know that what they have is not the original version, so that the original author's reputation will not be affected by problems that might be introduced by others. 
Finally, software patents pose a constant threat to the existence of any free program. We wish to make sure that a company cannot effectively restrict the users of a free program by obtaining a restrictive license from a patent holder. Therefore, we insist that any patent license obtained for a version of the library must be consistent with the full freedom of use specified in this license. Most GNU software, including some libraries, is covered by the ordinary GNU General Public License. This license, the GNU Lesser General Public License, applies to certain designated libraries, and is quite different from the ordinary General Public License. We use this license for certain libraries in order to permit linking those libraries into non-free programs. When a program is linked with a library, whether statically or using a shared library, the combination of the two is legally speaking a combined work, a derivative of the original library. The ordinary General Public License therefore permits such linking only if the entire combination fits its criteria of freedom. The Lesser General Public License permits more lax criteria for linking other code with the library. We call this license the "Lesser" General Public License because it does Less to protect the user's freedom than the ordinary General Public License. It also provides other free software developers Less of an advantage over competing non-free programs. These disadvantages are the reason we use the ordinary General Public License for many libraries. However, the Lesser license provides advantages in certain special circumstances. For example, on rare occasions, there may be a special need to encourage the widest possible use of a certain library, so that it becomes a de-facto standard. To achieve this, non-free programs must be allowed to use the library. A more frequent case is that a free library does the same job as widely used non-free libraries. In this case, there is little to gain by limiting the free library to free software only, so we use the Lesser General Public License. In other cases, permission to use a particular library in non-free programs enables a greater number of people to use a large body of free software. For example, permission to use the GNU C Library in non-free programs enables many more people to use the whole GNU operating system, as well as its variant, the GNU/Linux operating system. Although the Lesser General Public License is Less protective of the users' freedom, it does ensure that the user of a program that is linked with the Library has the freedom and the wherewithal to run that program using a modified version of the Library. The precise terms and conditions for copying, distribution and modification follow. Pay close attention to the difference between a "work based on the library" and a "work that uses the library". The former contains code derived from the library, whereas the latter must be combined with the library in order to run. GNU LESSER GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License Agreement applies to any software library or other program which contains a notice placed by the copyright holder or other authorized party saying it may be distributed under the terms of this Lesser General Public License (also called "this License"). Each licensee is addressed as "you". 
A "library" means a collection of software functions and/or data prepared so as to be conveniently linked with application programs (which use some of those functions and data) to form executables. The "Library", below, refers to any such software library or work which has been distributed under these terms. A "work based on the Library" means either the Library or any derivative work under copyright law: that is to say, a work containing the Library or a portion of it, either verbatim or with modifications and/or translated straightforwardly into another language. (Hereinafter, translation is included without limitation in the term "modification".) "Source code" for a work means the preferred form of the work for making modifications to it. For a library, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the library. Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running a program using the Library is not restricted, and output from such a program is covered only if its contents constitute a work based on the Library (independent of the use of the Library in a tool for writing it). Whether that is true depends on what the Library does and what the program that uses the Library does. 1. You may copy and distribute verbatim copies of the Library's complete source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and distribute a copy of this License along with the Library. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Library or any portion of it, thus forming a work based on the Library, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) The modified work must itself be a software library. b) You must cause the files modified to carry prominent notices stating that you changed the files and the date of any change. c) You must cause the whole of the work to be licensed at no charge to all third parties under the terms of this License. d) If a facility in the modified Library refers to a function or a table of data to be supplied by an application program that uses the facility, other than as an argument passed when the facility is invoked, then you must make a good faith effort to ensure that, in the event an application does not supply such function or table, the facility still operates, and performs whatever part of its purpose remains meaningful. (For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.) These requirements apply to the modified work as a whole. 
If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library. In addition, mere aggregation of another work not based on the Library with the Library (or with a work based on the Library) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may opt to apply the terms of the ordinary GNU General Public License instead of this License to a given copy of the Library. To do this, you must alter all the notices that refer to this License, so that they refer to the ordinary GNU General Public License, version 2, instead of to this License. (If a newer version than version 2 of the ordinary GNU General Public License has appeared, then you can specify that version instead if you wish.) Do not make any other change in these notices. Once this change is made in a given copy, it is irreversible for that copy, so the ordinary GNU General Public License applies to all subsequent copies and derivative works made from that copy. This option is useful when you wish to copy part of the code of the Library into a program that is not a library. 4. You may copy and distribute the Library (or a portion or derivative of it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. If distribution of object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place satisfies the requirement to distribute the source code, even though third parties are not compelled to copy the source along with the object code. 5. A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License. However, linking a "work that uses the Library" with the Library creates an executable that is a derivative of the Library (because it contains portions of the Library), rather than a "work that uses the library". The executable is therefore covered by this License. Section 6 states terms for distribution of such executables. When a "work that uses the Library" uses material from a header file that is part of the Library, the object code for the work may be a derivative work of the Library even though the source code is not. Whether this is true is especially significant if the work can be linked without the Library, or if the work is itself a library. 
The threshold for this to be true is not precisely defined by law. If such an object file uses only numerical parameters, data structure layouts and accessors, and small macros and small inline functions (ten lines or less in length), then the use of the object file is unrestricted, regardless of whether it is legally a derivative work. (Executables containing this object code plus portions of the Library will still fall under Section 6.) Otherwise, if the work is a derivative of the Library, you may distribute the object code for the work under the terms of Section 6. Any executables containing that work also fall under Section 6, whether or not they are linked directly with the Library itself. 6. As an exception to the Sections above, you may also combine or link a "work that uses the Library" with the Library to produce a work containing portions of the Library, and distribute that work under terms of your choice, provided that the terms permit modification of the work for the customer's own use and reverse engineering for debugging such modifications. You must give prominent notice with each copy of the work that the Library is used in it and that the Library and its use are covered by this License. You must supply a copy of this License. If the work during execution displays copyright notices, you must include the copyright notice for the Library among them, as well as a reference directing the user to the copy of this License. Also, you must do one of these things: a) Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. (It is understood that the user who changes the contents of definitions files in the Library will not necessarily be able to recompile the application to use the modified definitions.) b) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (1) uses at run time a copy of the library already present on the user's computer system, rather than copying library functions into the executable, and (2) will operate properly with a modified version of the library, if the user installs one, as long as the modified version is interface-compatible with the version that the work was made with. c) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution. d) If distribution of the work is made by offering access to copy from a designated place, offer equivalent access to copy the above specified materials from the same place. e) Verify that the user has already received a copy of these materials or that you have already sent this user a copy. For an executable, the required form of the "work that uses the Library" must include any data and utility programs needed for reproducing the executable from it. 
However, as a special exception, the materials to be distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. It may happen that this requirement contradicts the license restrictions of other proprietary libraries that do not normally accompany the operating system. Such a contradiction means you cannot use both them and the Library together in an executable that you distribute. 7. You may place library facilities that are a work based on the Library side-by-side in a single library together with other library facilities not covered by this License, and distribute such a combined library, provided that the separate distribution of the work based on the Library and of the other library facilities is otherwise permitted, and provided that you do these two things: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities. This must be distributed under the terms of the Sections above. b) Give prominent notice with the combined library of the fact that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 8. You may not copy, modify, sublicense, link with, or distribute the Library except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense, link with, or distribute the Library is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 9. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Library or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Library (or any work based on the Library), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Library or works based on it. 10. Each time you redistribute the Library (or any work based on the Library), the recipient automatically receives a license from the original licensor to copy, distribute, link with or modify the Library subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties with this License. 11. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Library at all. 
For example, if a patent license would not permit royalty-free redistribution of the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library.

If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances.

It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice.

This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License.

12. If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Library under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License.

13. The Free Software Foundation may publish revised and/or new versions of the Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.

Each version is given a distinguishing version number. If the Library specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Library does not specify a license version number, you may choose any version ever published by the Free Software Foundation.

14. If you wish to incorporate parts of the Library into other free programs whose distribution conditions are incompatible with these, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally.

NO WARRANTY

15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

16.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Libraries

If you develop a new library, and you want it to be of the greatest possible use to the public, we recommend making it free software that everyone can redistribute and change. You can do so by permitting redistribution under these terms (or, alternatively, under the terms of the ordinary General Public License).

To apply these terms, attach the following notices to the library. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found.

    <one line to give the library's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

    This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version.

    This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

    You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

Also add information on how to contact you by electronic and paper mail.

You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the library, if necessary. Here is a sample; alter the names:

    Yoyodyne, Inc., hereby disclaims all copyright interest in the library `Frob' (a library for tweaking knobs) written by James Random Hacker.

    <signature of Ty Coon>, 1 April 1990
    Ty Coon, President of Vice

That's all there is to it!
SQLObject-3.4.0/debian/0000755000175000017500000000000013141371614014077 5ustar phdphd00000000000000
SQLObject-3.4.0/debian/control0000644000175000017500000000124010066344323015477 0ustar phdphd00000000000000
Source: sqlobject
Section: python
Priority: optional
Maintainer: Philippe Normand (phil)
Build-Depends: debhelper (>= 4.0.0), python-dev (>= 2.3)
Standards-Version: 3.6.1

Package: python-sqlobject
#Section: python
Architecture: any
Depends: ${python:Depends}
Recommends: python-mysqldb (>= 0.9.2-0.4), python-sqlite (>= 0.4.3-2), python-psycopg (>= 1.1.13-1), python-kinterbasdb (>= 3.0.1-3.1), python-maxdb (>= 7.5.00.12-1)
Description: An object-relational mapper for Python
 It allows you to translate RDBMS table rows into Python objects, and manipulate those objects to transparently manipulate the database.
 .
 Homepage: http://sqlobject.org
SQLObject-3.4.0/debian/docs0000644000175000017500000000002112752476767014771 0ustar phdphd00000000000000
README.rst
docs/
SQLObject-3.4.0/debian/changelog0000644000175000017500000000112610066344323015751 0ustar phdphd00000000000000
sqlobject (0.6-1) unstable; urgency=low

  * Debian package will be part of the Colorstudy SVN repository
  * Added Build-dependencies : python-{mysqldb,sqlite,psycopg,kinterbasdb,maxdb}
  * Fixed copyright to suit new SQLObject license

 -- Philippe Normand (phil)  Tue, 8 Jun 2004 20:39:16 +0200

sqlobject (0.5.2-1) unstable; urgency=low

  * Update to 0.5.2.

 -- Keisuke URAGO  Sat, 5 Jun 2004 02:32:43 +0900

sqlobject (0.3-1) unstable; urgency=low

  * New upstream version.

 -- Keisuke URAGO  Fri, 09 May 2003 17:06:23 +0900
SQLObject-3.4.0/debian/copyright0000644000175000017500000000054710066344323016040 0ustar phdphd00000000000000
This package was debianized by Keisuke URAGO on Fri, 09 May 2003 17:06:23 +0900

It was downloaded from http://sourceforge.net/projects/sqlobject

Upstream Author: Ian Bicking

Copyright: PSF
The Python Software Foundation License can be downloaded from
http://www.opensource.org/licenses/PythonSoftFoundation.php
SQLObject-3.4.0/debian/rules0000755000175000017500000000231510066344323015160 0ustar phdphd00000000000000
#!/usr/bin/make -f
# Sample debian/rules that uses debhelper.
# GNU copyright 1997 to 1999 by Joey Hess.

# Uncomment this to turn on verbose mode.
#export DH_VERBOSE=1

PYTHON=python

configure: configure-stamp
configure-stamp:
	dh_testdir
	#$(PYTHON) config_unix.py --prefix /usr
	#$(PYTHON) setup.py config
	touch configure-stamp

build: build-stamp
build-stamp: configure-stamp
	dh_testdir
	$(PYTHON) setup.py build
	touch build-stamp

clean:
	dh_testdir
	dh_testroot
	rm -f build-stamp configure-stamp
	-$(PYTHON) setup.py clean --all
	dh_clean

install: build
	dh_testdir
	dh_testroot
	dh_clean -k
	dh_installdirs
	$(PYTHON) setup.py install --prefix $(CURDIR)/debian/python-sqlobject/usr

# Build architecture-independent files here.
binary-indep: build install
# We have nothing to do by default.

# Build architecture-dependent files here.
binary-arch: build install
	dh_testdir
	dh_testroot
	dh_installdocs -X.svn
	dh_installexamples -X.svn
	dh_installchangelogs
	dh_link
	dh_strip
	dh_compress
	dh_fixperms
	dh_makeshlibs -V
	dh_python
	dh_installdeb
	dh_gencontrol
	dh_md5sums
	dh_builddeb -X.svn

binary: binary-indep binary-arch
.PHONY: build clean binary-indep binary-arch binary install configure
SQLObject-3.4.0/debian/examples0000644000175000017500000000000712224003267015632 0ustar phdphd00000000000000
tests/
SQLObject-3.4.0/.travis.yml0000644000175000017500000000610713140202164014762 0ustar phdphd00000000000000
# Only test master and pull requests; skip tags.
# Other branches can allow themselves.
branches:
  only:
    - master

# Prefer docker container with setuid/sudo
sudo: required

language: python

python:
  - "3.6"

cache: pip

addons:
  apt:
    packages:
    - python-egenix-mxdatetime
    - python-mysqldb
    - python-psycopg2
    - python3-psycopg2
    - firebird2.5-super
  postgresql: "9.4"

env:
  - TOXENV=py27-mysqldb
  - TOXENV=py34-mysqlclient
  - TOXENV=py35-mysqlclient
  - TOXENV=py36-mysqlclient
  - TOXENV=py27-mysql-connector
  - TOXENV=py34-mysql-connector
  - TOXENV=py35-mysql-connector
  - TOXENV=py36-mysql-connector
  - TOXENV=py27-mysql-oursql
  - TOXENV=py27-pymysql
  - TOXENV=py34-pymysql
  - TOXENV=py35-pymysql
  - TOXENV=py36-pymysql
  - TOXENV=py27-postgres-psycopg
  - TOXENV=py34-postgres-psycopg
  - TOXENV=py35-postgres-psycopg
  - TOXENV=py36-postgres-psycopg
  - TOXENV=py27-postgres-pygresql
  - TOXENV=py34-postgres-pygresql
  - TOXENV=py35-postgres-pygresql
  - TOXENV=py36-postgres-pygresql
  - TOXENV=py34-pypostgresql
  - TOXENV=py35-pypostgresql
  - TOXENV=py36-pypostgresql
  - TOXENV=py27-sqlite
  - TOXENV=py34-sqlite
  - TOXENV=py35-sqlite
  - TOXENV=py36-sqlite
  - TOXENV=py27-sqlite-memory
  - TOXENV=py34-sqlite-memory
  - TOXENV=py35-sqlite-memory
  - TOXENV=py36-sqlite-memory
  - TOXENV=py27-flake8
  - TOXENV=py34-flake8
  - TOXENV=py27-firebird-fdb
  - TOXENV=py34-firebird-fdb
  - TOXENV=py35-firebird-fdb
  - TOXENV=py36-firebird-fdb
  - TOXENV=py27-firebirdsql
  - TOXENV=py34-firebirdsql
  - TOXENV=py35-firebirdsql
  - TOXENV=py36-firebirdsql

matrix:
  allow_failures:
    - env: TOXENV=py27-firebird-fdb
    - env: TOXENV=py34-firebird-fdb
    - env: TOXENV=py35-firebird-fdb
    - env: TOXENV=py36-firebird-fdb
    - env: TOXENV=py27-firebirdsql
    - env: TOXENV=py34-firebirdsql
    - env: TOXENV=py35-firebirdsql
    - env: TOXENV=py36-firebirdsql
  fast_finish: true

before_install:
  # Start the firebird database server.
  # We use firebird-super, so there's none of the inetd configuration
  # required by firebird-classic.
  # We also create a test user for the firebird test and
  # create a script that can be fed into isql-fb
  # to create the test database.
  # Copied password initialization from
  # https://github.com/xdenser/node-firebird-libfbclient/blob/master/.travis.yml
  - if [[ $TOXENV = *firebird* ]]; then sudo sed -i /etc/default/firebird2.5 -e 's/=no/=yes/' && sudo /etc/init.d/firebird2.5-super start && sleep 5 && sudo /bin/bash -c '(export FB_VER="2.5"; export FB_FLAVOUR="super";source /usr/share/firebird2.5-common/functions.sh; writeNewPassword masterkey)' && sudo gsec -user sysdba -pass masterkey -add test -pw test && sudo /bin/bash -c "echo \"CREATE DATABASE 'localhost:/tmp/test.fdb';\" > /var/lib/firebird/create_test_db" && sudo chmod 644 /var/lib/firebird/create_test_db; fi

install: travis_retry pip install tox coveralls codecov ppu

script: tox -e ${TOXENV}

after_success:
  - cd sqlobject
  - coveralls
  - codecov

before_cache:
  - remove-old-files.py -o 180 ~/.cache/pip
SQLObject-3.4.0/PKG-INFO0000644000175000017500000000350613141371614013756 0ustar phdphd00000000000000
Metadata-Version: 1.1
Name: SQLObject
Version: 3.4.0
Summary: Object-Relational Manager, aka database wrapper
Home-page: http://sqlobject.org/
Author: Oleg Broytman
Author-email: phd@phdru.name
License: LGPL
Download-URL: https://pypi.python.org/pypi/SQLObject/3.4.0
Description: SQLObject is a popular *Object Relational Manager* for providing an
        object interface to your database, with tables as classes, rows as
        instances, and columns as attributes.

        SQLObject includes a Python-object-based query language that makes
        SQL more abstract, and provides substantial database independence
        for applications.
        Supports MySQL, PostgreSQL, SQLite, Firebird, Sybase, MSSQL and
        MaxDB (SAPDB). Python 2.7 or 3.4+ is required.

        For development see the projects at `SourceForge `_
        and `GitHub `_.

        .. image:: https://travis-ci.org/sqlobject/sqlobject.svg?branch=master
           :target: https://travis-ci.org/sqlobject/sqlobject
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Topic :: Database
Classifier: Topic :: Database :: Front-Ends
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires: FormEncode
Requires: PyDispatcher
SQLObject-3.4.0/MANIFEST.in0000644000175000017500000000045113072240647014417 0ustar phdphd00000000000000
global-include *.py *.rst *.txt
recursive-include docs *.css *.html *.js *.gif *.png
include LICENSE MANIFEST.in .travis.yml circle.yml tox.ini
include debian/* sqlobject/.coveragerc
include docs/Makefile docs/genapidocs docs/rebuild
recursive-exclude devscripts *
recursive-exclude docs/_build *
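The PKG-INFO Description above mentions SQLObject's Python-object-based query language. As a minimal sketch of what that claim refers to (the ``Book`` table, its columns and the in-memory SQLite URI are illustrative assumptions made for this example, not part of the distribution)::

    # A minimal sketch of the query language; Book and its columns are
    # made-up names used only for illustration.
    from sqlobject import SQLObject, StringCol, IntCol, connectionForURI, sqlhub
    from sqlobject.sqlbuilder import AND

    sqlhub.processConnection = connectionForURI('sqlite:/:memory:')

    class Book(SQLObject):
        title = StringCol()
        year = IntCol()

    Book.createTable()
    Book(title='Example Book', year=2017)

    # Column attributes on Book.q turn Python comparisons into SQL WHERE clauses.
    recent = Book.select(AND(Book.q.year >= 2000,
                             Book.q.title.startswith('Example')))
    print(list(recent))

    # selectBy is a keyword-argument shortcut for simple equality filters.
    print(list(Book.selectBy(year=2017)))

Under these assumptions both queries return the same single row; ``select`` composes arbitrary SQL expressions via the ``q`` namespace, while ``selectBy`` only covers equality tests.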