databases-0.5.5/0000755000175100001710000000000014172351472014255 5ustar runnerdocker00000000000000databases-0.5.5/PKG-INFO0000644000175100001710000001337114172351472015357 0ustar runnerdocker00000000000000Metadata-Version: 2.1 Name: databases Version: 0.5.5 Summary: Async database support for Python. Home-page: https://github.com/encode/databases Author: Tom Christie Author-email: tom@tomchristie.com License: BSD Description: # Databases

Test Suite Package version

Databases gives you simple asyncio support for a range of databases. It allows you to make queries using the powerful [SQLAlchemy Core][sqlalchemy-core] expression language, and provides support for PostgreSQL, MySQL, and SQLite. Databases is suitable for integrating against any async Web framework, such as [Starlette][starlette], [Sanic][sanic], [Responder][responder], [Quart][quart], [aiohttp][aiohttp], [Tornado][tornado], or [FastAPI][fastapi]. **Documentation**: [https://www.encode.io/databases/](https://www.encode.io/databases/) **Requirements**: Python 3.6+ --- ## Installation ```shell $ pip install databases ``` You can install the required database drivers with: ```shell $ pip install databases[postgresql] $ pip install databases[mysql] $ pip install databases[sqlite] ``` Default driver support is provided using one of [asyncpg][asyncpg], [aiomysql][aiomysql], or [aiosqlite][aiosqlite]. You can also use other database drivers supported by `databases`: ```shel $ pip install databases[postgresql+aiopg] $ pip install databases[mysql+asyncmy] ``` Note that if you are using any synchronous SQLAlchemy functions such as `engine.create_all()` or [alembic][alembic] migrations then you still have to install a synchronous DB driver: [psycopg2][psycopg2] for PostgreSQL and [pymysql][pymysql] for MySQL. --- ## Quickstart For this example we'll create a very simple SQLite database to run some queries against. ```shell $ pip install databases[sqlite] $ pip install ipython ``` We can now run a simple example from the console. Note that we want to use `ipython` here, because it supports using `await` expressions directly from the console. ```python # Create a database instance, and connect to it. from databases import Database database = Database('sqlite:///example.db') await database.connect() # Create a table. query = """CREATE TABLE HighScores (id INTEGER PRIMARY KEY, name VARCHAR(100), score INTEGER)""" await database.execute(query=query) # Insert some data. query = "INSERT INTO HighScores(name, score) VALUES (:name, :score)" values = [ {"name": "Daisy", "score": 92}, {"name": "Neil", "score": 87}, {"name": "Carol", "score": 43}, ] await database.execute_many(query=query, values=values) # Run a database query. query = "SELECT * FROM HighScores" rows = await database.fetch_all(query=query) print('High Scores:', rows) ``` Check out the documentation on [making database queries](https://www.encode.io/databases/database_queries/) for examples of how to start using databases together with SQLAlchemy core expressions.

— ⭐️ —

Databases is BSD licensed code. Designed & built in Brighton, England.

[sqlalchemy-core]: https://docs.sqlalchemy.org/en/latest/core/ [sqlalchemy-core-tutorial]: https://docs.sqlalchemy.org/en/latest/core/tutorial.html [alembic]: https://alembic.sqlalchemy.org/en/latest/ [psycopg2]: https://www.psycopg.org/ [pymysql]: https://github.com/PyMySQL/PyMySQL [asyncpg]: https://github.com/MagicStack/asyncpg [aiomysql]: https://github.com/aio-libs/aiomysql [aiosqlite]: https://github.com/jreese/aiosqlite [starlette]: https://github.com/encode/starlette [sanic]: https://github.com/huge-success/sanic [responder]: https://github.com/kennethreitz/responder [quart]: https://gitlab.com/pgjones/quart [aiohttp]: https://github.com/aio-libs/aiohttp [tornado]: https://github.com/tornadoweb/tornado [fastapi]: https://github.com/tiangolo/fastapi Platform: UNKNOWN Classifier: Development Status :: 3 - Alpha Classifier: Environment :: Web Environment Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Topic :: Internet :: WWW/HTTP Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3 :: Only Requires-Python: >=3.6 Description-Content-Type: text/markdown Provides-Extra: postgresql Provides-Extra: mysql Provides-Extra: mysql+asyncmy Provides-Extra: sqlite Provides-Extra: postgresql+aiopg databases-0.5.5/README.md0000644000175100001710000000746514172351434015546 0ustar runnerdocker00000000000000# Databases

Test Suite Package version

Databases gives you simple asyncio support for a range of databases. It allows you to make queries using the powerful [SQLAlchemy Core][sqlalchemy-core] expression language, and provides support for PostgreSQL, MySQL, and SQLite. Databases is suitable for integrating against any async Web framework, such as [Starlette][starlette], [Sanic][sanic], [Responder][responder], [Quart][quart], [aiohttp][aiohttp], [Tornado][tornado], or [FastAPI][fastapi]. **Documentation**: [https://www.encode.io/databases/](https://www.encode.io/databases/) **Requirements**: Python 3.6+ --- ## Installation ```shell $ pip install databases ``` You can install the required database drivers with: ```shell $ pip install databases[postgresql] $ pip install databases[mysql] $ pip install databases[sqlite] ``` Default driver support is provided using one of [asyncpg][asyncpg], [aiomysql][aiomysql], or [aiosqlite][aiosqlite]. You can also use other database drivers supported by `databases`: ```shel $ pip install databases[postgresql+aiopg] $ pip install databases[mysql+asyncmy] ``` Note that if you are using any synchronous SQLAlchemy functions such as `engine.create_all()` or [alembic][alembic] migrations then you still have to install a synchronous DB driver: [psycopg2][psycopg2] for PostgreSQL and [pymysql][pymysql] for MySQL. --- ## Quickstart For this example we'll create a very simple SQLite database to run some queries against. ```shell $ pip install databases[sqlite] $ pip install ipython ``` We can now run a simple example from the console. Note that we want to use `ipython` here, because it supports using `await` expressions directly from the console. ```python # Create a database instance, and connect to it. from databases import Database database = Database('sqlite:///example.db') await database.connect() # Create a table. query = """CREATE TABLE HighScores (id INTEGER PRIMARY KEY, name VARCHAR(100), score INTEGER)""" await database.execute(query=query) # Insert some data. query = "INSERT INTO HighScores(name, score) VALUES (:name, :score)" values = [ {"name": "Daisy", "score": 92}, {"name": "Neil", "score": 87}, {"name": "Carol", "score": 43}, ] await database.execute_many(query=query, values=values) # Run a database query. query = "SELECT * FROM HighScores" rows = await database.fetch_all(query=query) print('High Scores:', rows) ``` Check out the documentation on [making database queries](https://www.encode.io/databases/database_queries/) for examples of how to start using databases together with SQLAlchemy core expressions.

— ⭐️ —

Databases is BSD licensed code. Designed & built in Brighton, England.

[sqlalchemy-core]: https://docs.sqlalchemy.org/en/latest/core/ [sqlalchemy-core-tutorial]: https://docs.sqlalchemy.org/en/latest/core/tutorial.html [alembic]: https://alembic.sqlalchemy.org/en/latest/ [psycopg2]: https://www.psycopg.org/ [pymysql]: https://github.com/PyMySQL/PyMySQL [asyncpg]: https://github.com/MagicStack/asyncpg [aiomysql]: https://github.com/aio-libs/aiomysql [aiosqlite]: https://github.com/jreese/aiosqlite [starlette]: https://github.com/encode/starlette [sanic]: https://github.com/huge-success/sanic [responder]: https://github.com/kennethreitz/responder [quart]: https://gitlab.com/pgjones/quart [aiohttp]: https://github.com/aio-libs/aiohttp [tornado]: https://github.com/tornadoweb/tornado [fastapi]: https://github.com/tiangolo/fastapi databases-0.5.5/databases/0000755000175100001710000000000014172351472016204 5ustar runnerdocker00000000000000databases-0.5.5/databases/__init__.py0000644000175100001710000000015614172351434020315 0ustar runnerdocker00000000000000from databases.core import Database, DatabaseURL __version__ = "0.5.5" __all__ = ["Database", "DatabaseURL"] databases-0.5.5/databases/backends/0000755000175100001710000000000014172351472017756 5ustar runnerdocker00000000000000databases-0.5.5/databases/backends/__init__.py0000644000175100001710000000000014172351434022053 0ustar runnerdocker00000000000000databases-0.5.5/databases/backends/aiopg.py0000644000175100001710000002353414172351434021434 0ustar runnerdocker00000000000000import getpass import json import logging import typing import uuid import aiopg from aiopg.sa.engine import APGCompiler_psycopg2 from sqlalchemy.dialects.postgresql.psycopg2 import PGDialect_psycopg2 from sqlalchemy.engine.cursor import CursorResultMetaData from sqlalchemy.engine.interfaces import Dialect, ExecutionContext from sqlalchemy.engine.row import Row from sqlalchemy.sql import ClauseElement from sqlalchemy.sql.ddl import DDLElement from databases.core import DatabaseURL from databases.interfaces import ConnectionBackend, DatabaseBackend, TransactionBackend logger = logging.getLogger("databases") class AiopgBackend(DatabaseBackend): def __init__( self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any ) -> None: self._database_url = DatabaseURL(database_url) self._options = options self._dialect = self._get_dialect() self._pool = None def _get_dialect(self) -> Dialect: dialect = PGDialect_psycopg2( json_serializer=json.dumps, json_deserializer=lambda x: x ) dialect.statement_compiler = APGCompiler_psycopg2 dialect.implicit_returning = True dialect.supports_native_enum = True dialect.supports_smallserial = True # 9.2+ dialect._backslash_escapes = False dialect.supports_sane_multi_rowcount = True # psycopg 2.0.9+ dialect._has_native_hstore = True dialect.supports_native_decimal = True return dialect def _get_connection_kwargs(self) -> dict: url_options = self._database_url.options kwargs = {} min_size = url_options.get("min_size") max_size = url_options.get("max_size") ssl = url_options.get("ssl") if min_size is not None: kwargs["minsize"] = int(min_size) if max_size is not None: kwargs["maxsize"] = int(max_size) if ssl is not None: kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()] for key, value in self._options.items(): # Coerce 'min_size' and 'max_size' for consistency. if key == "min_size": key = "minsize" elif key == "max_size": key = "maxsize" kwargs[key] = value return kwargs async def connect(self) -> None: assert self._pool is None, "DatabaseBackend is already running" kwargs = self._get_connection_kwargs() self._pool = await aiopg.create_pool( host=self._database_url.hostname, port=self._database_url.port, user=self._database_url.username or getpass.getuser(), password=self._database_url.password, database=self._database_url.database, **kwargs, ) async def disconnect(self) -> None: assert self._pool is not None, "DatabaseBackend is not running" self._pool.close() await self._pool.wait_closed() self._pool = None def connection(self) -> "AiopgConnection": return AiopgConnection(self, self._dialect) class CompilationContext: def __init__(self, context: ExecutionContext): self.context = context class AiopgConnection(ConnectionBackend): def __init__(self, database: AiopgBackend, dialect: Dialect): self._database = database self._dialect = dialect self._connection = None # type: typing.Optional[aiopg.Connection] async def acquire(self) -> None: assert self._connection is None, "Connection is already acquired" assert self._database._pool is not None, "DatabaseBackend is not running" self._connection = await self._database._pool.acquire() async def release(self) -> None: assert self._connection is not None, "Connection is not acquired" assert self._database._pool is not None, "DatabaseBackend is not running" await self._database._pool.release(self._connection) self._connection = None async def fetch_all(self, query: ClauseElement) -> typing.List[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) rows = await cursor.fetchall() metadata = CursorResultMetaData(context, cursor.description) return [ Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) for row in rows ] finally: cursor.close() async def fetch_one(self, query: ClauseElement) -> typing.Optional[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) row = await cursor.fetchone() if row is None: return None metadata = CursorResultMetaData(context, cursor.description) return Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) finally: cursor.close() async def execute(self, query: ClauseElement) -> typing.Any: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) return cursor.lastrowid finally: cursor.close() async def execute_many(self, queries: typing.List[ClauseElement]) -> None: assert self._connection is not None, "Connection is not acquired" cursor = await self._connection.cursor() try: for single_query in queries: single_query, args, context = self._compile(single_query) await cursor.execute(single_query, args) finally: cursor.close() async def iterate( self, query: ClauseElement ) -> typing.AsyncGenerator[typing.Any, None]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) metadata = CursorResultMetaData(context, cursor.description) async for row in cursor: yield Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) finally: cursor.close() def transaction(self) -> TransactionBackend: return AiopgTransaction(self) def _compile( self, query: ClauseElement ) -> typing.Tuple[str, dict, CompilationContext]: compiled = query.compile( dialect=self._dialect, compile_kwargs={"render_postcompile": True} ) execution_context = self._dialect.execution_ctx_cls() execution_context.dialect = self._dialect if not isinstance(query, DDLElement): args = compiled.construct_params() for key, val in args.items(): if key in compiled._bind_processors: args[key] = compiled._bind_processors[key](val) execution_context.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._loose_column_name_matching, ) else: args = {} logger.debug("Query: %s\nArgs: %s", compiled.string, args) return compiled.string, args, CompilationContext(execution_context) @property def raw_connection(self) -> aiopg.connection.Connection: assert self._connection is not None, "Connection is not acquired" return self._connection class AiopgTransaction(TransactionBackend): def __init__(self, connection: AiopgConnection): self._connection = connection self._is_root = False self._savepoint_name = "" async def start( self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any] ) -> None: assert self._connection._connection is not None, "Connection is not acquired" self._is_root = is_root cursor = await self._connection._connection.cursor() if self._is_root: await cursor.execute("BEGIN") else: id = str(uuid.uuid4()).replace("-", "_") self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}" try: await cursor.execute(f"SAVEPOINT {self._savepoint_name}") finally: cursor.close() async def commit(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" cursor = await self._connection._connection.cursor() if self._is_root: await cursor.execute("COMMIT") else: try: await cursor.execute(f"RELEASE SAVEPOINT {self._savepoint_name}") finally: cursor.close() async def rollback(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" cursor = await self._connection._connection.cursor() if self._is_root: await cursor.execute("ROLLBACK") else: try: await cursor.execute(f"ROLLBACK TO SAVEPOINT {self._savepoint_name}") finally: cursor.close() databases-0.5.5/databases/backends/asyncmy.py0000644000175100001710000002415414172351434022017 0ustar runnerdocker00000000000000import getpass import logging import typing import uuid import asyncmy from sqlalchemy.dialects.mysql import pymysql from sqlalchemy.engine.cursor import CursorResultMetaData from sqlalchemy.engine.interfaces import Dialect, ExecutionContext from sqlalchemy.engine.row import Row from sqlalchemy.sql import ClauseElement from sqlalchemy.sql.ddl import DDLElement from databases.core import LOG_EXTRA, DatabaseURL from databases.interfaces import ConnectionBackend, DatabaseBackend, TransactionBackend logger = logging.getLogger("databases") class AsyncMyBackend(DatabaseBackend): def __init__( self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any ) -> None: self._database_url = DatabaseURL(database_url) self._options = options self._dialect = pymysql.dialect(paramstyle="pyformat") self._dialect.supports_native_decimal = True self._pool = None def _get_connection_kwargs(self) -> dict: url_options = self._database_url.options kwargs = {} min_size = url_options.get("min_size") max_size = url_options.get("max_size") pool_recycle = url_options.get("pool_recycle") ssl = url_options.get("ssl") if min_size is not None: kwargs["minsize"] = int(min_size) if max_size is not None: kwargs["maxsize"] = int(max_size) if pool_recycle is not None: kwargs["pool_recycle"] = int(pool_recycle) if ssl is not None: kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()] for key, value in self._options.items(): # Coerce 'min_size' and 'max_size' for consistency. if key == "min_size": key = "minsize" elif key == "max_size": key = "maxsize" kwargs[key] = value return kwargs async def connect(self) -> None: assert self._pool is None, "DatabaseBackend is already running" kwargs = self._get_connection_kwargs() self._pool = await asyncmy.create_pool( host=self._database_url.hostname, port=self._database_url.port or 3306, user=self._database_url.username or getpass.getuser(), password=self._database_url.password, db=self._database_url.database, autocommit=True, **kwargs, ) async def disconnect(self) -> None: assert self._pool is not None, "DatabaseBackend is not running" self._pool.close() await self._pool.wait_closed() self._pool = None def connection(self) -> "AsyncMyConnection": return AsyncMyConnection(self, self._dialect) class CompilationContext: def __init__(self, context: ExecutionContext): self.context = context class AsyncMyConnection(ConnectionBackend): def __init__(self, database: AsyncMyBackend, dialect: Dialect): self._database = database self._dialect = dialect self._connection = None # type: typing.Optional[asyncmy.Connection] async def acquire(self) -> None: assert self._connection is None, "Connection is already acquired" assert self._database._pool is not None, "DatabaseBackend is not running" self._connection = await self._database._pool.acquire() async def release(self) -> None: assert self._connection is not None, "Connection is not acquired" assert self._database._pool is not None, "DatabaseBackend is not running" await self._database._pool.release(self._connection) self._connection = None async def fetch_all(self, query: ClauseElement) -> typing.List[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.cursor() as cursor: try: await cursor.execute(query_str, args) rows = await cursor.fetchall() metadata = CursorResultMetaData(context, cursor.description) return [ Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) for row in rows ] finally: await cursor.close() async def fetch_one(self, query: ClauseElement) -> typing.Optional[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.cursor() as cursor: try: await cursor.execute(query_str, args) row = await cursor.fetchone() if row is None: return None metadata = CursorResultMetaData(context, cursor.description) return Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) finally: await cursor.close() async def execute(self, query: ClauseElement) -> typing.Any: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.cursor() as cursor: try: await cursor.execute(query_str, args) if cursor.lastrowid == 0: return cursor.rowcount return cursor.lastrowid finally: await cursor.close() async def execute_many(self, queries: typing.List[ClauseElement]) -> None: assert self._connection is not None, "Connection is not acquired" async with self._connection.cursor() as cursor: try: for single_query in queries: single_query, args, context = self._compile(single_query) await cursor.execute(single_query, args) finally: await cursor.close() async def iterate( self, query: ClauseElement ) -> typing.AsyncGenerator[typing.Any, None]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.cursor() as cursor: try: await cursor.execute(query_str, args) metadata = CursorResultMetaData(context, cursor.description) async for row in cursor: yield Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) finally: await cursor.close() def transaction(self) -> TransactionBackend: return AsyncMyTransaction(self) def _compile( self, query: ClauseElement ) -> typing.Tuple[str, dict, CompilationContext]: compiled = query.compile( dialect=self._dialect, compile_kwargs={"render_postcompile": True} ) execution_context = self._dialect.execution_ctx_cls() execution_context.dialect = self._dialect if not isinstance(query, DDLElement): args = compiled.construct_params() for key, val in args.items(): if key in compiled._bind_processors: args[key] = compiled._bind_processors[key](val) execution_context.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._loose_column_name_matching, ) else: args = {} query_message = compiled.string.replace(" \n", " ").replace("\n", " ") logger.debug("Query: %s Args: %s", query_message, repr(args), extra=LOG_EXTRA) return compiled.string, args, CompilationContext(execution_context) @property def raw_connection(self) -> asyncmy.connection.Connection: assert self._connection is not None, "Connection is not acquired" return self._connection class AsyncMyTransaction(TransactionBackend): def __init__(self, connection: AsyncMyConnection): self._connection = connection self._is_root = False self._savepoint_name = "" async def start( self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any] ) -> None: assert self._connection._connection is not None, "Connection is not acquired" self._is_root = is_root if self._is_root: await self._connection._connection.begin() else: id = str(uuid.uuid4()).replace("-", "_") self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}" async with self._connection._connection.cursor() as cursor: try: await cursor.execute(f"SAVEPOINT {self._savepoint_name}") finally: await cursor.close() async def commit(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" if self._is_root: await self._connection._connection.commit() else: async with self._connection._connection.cursor() as cursor: try: await cursor.execute(f"RELEASE SAVEPOINT {self._savepoint_name}") finally: await cursor.close() async def rollback(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" if self._is_root: await self._connection._connection.rollback() else: async with self._connection._connection.cursor() as cursor: try: await cursor.execute( f"ROLLBACK TO SAVEPOINT {self._savepoint_name}" ) finally: await cursor.close() databases-0.5.5/databases/backends/mysql.py0000644000175100001710000002334614172351434021503 0ustar runnerdocker00000000000000import getpass import logging import typing import uuid import aiomysql from sqlalchemy.dialects.mysql import pymysql from sqlalchemy.engine.cursor import CursorResultMetaData from sqlalchemy.engine.interfaces import Dialect, ExecutionContext from sqlalchemy.engine.row import Row from sqlalchemy.sql import ClauseElement from sqlalchemy.sql.ddl import DDLElement from databases.core import LOG_EXTRA, DatabaseURL from databases.interfaces import ConnectionBackend, DatabaseBackend, TransactionBackend logger = logging.getLogger("databases") class MySQLBackend(DatabaseBackend): def __init__( self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any ) -> None: self._database_url = DatabaseURL(database_url) self._options = options self._dialect = pymysql.dialect(paramstyle="pyformat") self._dialect.supports_native_decimal = True self._pool = None def _get_connection_kwargs(self) -> dict: url_options = self._database_url.options kwargs = {} min_size = url_options.get("min_size") max_size = url_options.get("max_size") pool_recycle = url_options.get("pool_recycle") ssl = url_options.get("ssl") if min_size is not None: kwargs["minsize"] = int(min_size) if max_size is not None: kwargs["maxsize"] = int(max_size) if pool_recycle is not None: kwargs["pool_recycle"] = int(pool_recycle) if ssl is not None: kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()] for key, value in self._options.items(): # Coerce 'min_size' and 'max_size' for consistency. if key == "min_size": key = "minsize" elif key == "max_size": key = "maxsize" kwargs[key] = value return kwargs async def connect(self) -> None: assert self._pool is None, "DatabaseBackend is already running" kwargs = self._get_connection_kwargs() self._pool = await aiomysql.create_pool( host=self._database_url.hostname, port=self._database_url.port or 3306, user=self._database_url.username or getpass.getuser(), password=self._database_url.password, db=self._database_url.database, autocommit=True, **kwargs, ) async def disconnect(self) -> None: assert self._pool is not None, "DatabaseBackend is not running" self._pool.close() await self._pool.wait_closed() self._pool = None def connection(self) -> "MySQLConnection": return MySQLConnection(self, self._dialect) class CompilationContext: def __init__(self, context: ExecutionContext): self.context = context class MySQLConnection(ConnectionBackend): def __init__(self, database: MySQLBackend, dialect: Dialect): self._database = database self._dialect = dialect self._connection = None # type: typing.Optional[aiomysql.Connection] async def acquire(self) -> None: assert self._connection is None, "Connection is already acquired" assert self._database._pool is not None, "DatabaseBackend is not running" self._connection = await self._database._pool.acquire() async def release(self) -> None: assert self._connection is not None, "Connection is not acquired" assert self._database._pool is not None, "DatabaseBackend is not running" await self._database._pool.release(self._connection) self._connection = None async def fetch_all(self, query: ClauseElement) -> typing.List[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) rows = await cursor.fetchall() metadata = CursorResultMetaData(context, cursor.description) return [ Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) for row in rows ] finally: await cursor.close() async def fetch_one(self, query: ClauseElement) -> typing.Optional[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) row = await cursor.fetchone() if row is None: return None metadata = CursorResultMetaData(context, cursor.description) return Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) finally: await cursor.close() async def execute(self, query: ClauseElement) -> typing.Any: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) if cursor.lastrowid == 0: return cursor.rowcount return cursor.lastrowid finally: await cursor.close() async def execute_many(self, queries: typing.List[ClauseElement]) -> None: assert self._connection is not None, "Connection is not acquired" cursor = await self._connection.cursor() try: for single_query in queries: single_query, args, context = self._compile(single_query) await cursor.execute(single_query, args) finally: await cursor.close() async def iterate( self, query: ClauseElement ) -> typing.AsyncGenerator[typing.Any, None]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) cursor = await self._connection.cursor() try: await cursor.execute(query_str, args) metadata = CursorResultMetaData(context, cursor.description) async for row in cursor: yield Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) finally: await cursor.close() def transaction(self) -> TransactionBackend: return MySQLTransaction(self) def _compile( self, query: ClauseElement ) -> typing.Tuple[str, dict, CompilationContext]: compiled = query.compile( dialect=self._dialect, compile_kwargs={"render_postcompile": True} ) execution_context = self._dialect.execution_ctx_cls() execution_context.dialect = self._dialect if not isinstance(query, DDLElement): args = compiled.construct_params() for key, val in args.items(): if key in compiled._bind_processors: args[key] = compiled._bind_processors[key](val) execution_context.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._loose_column_name_matching, ) else: args = {} query_message = compiled.string.replace(" \n", " ").replace("\n", " ") logger.debug("Query: %s Args: %s", query_message, repr(args), extra=LOG_EXTRA) return compiled.string, args, CompilationContext(execution_context) @property def raw_connection(self) -> aiomysql.connection.Connection: assert self._connection is not None, "Connection is not acquired" return self._connection class MySQLTransaction(TransactionBackend): def __init__(self, connection: MySQLConnection): self._connection = connection self._is_root = False self._savepoint_name = "" async def start( self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any] ) -> None: assert self._connection._connection is not None, "Connection is not acquired" self._is_root = is_root if self._is_root: await self._connection._connection.begin() else: id = str(uuid.uuid4()).replace("-", "_") self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}" cursor = await self._connection._connection.cursor() try: await cursor.execute(f"SAVEPOINT {self._savepoint_name}") finally: await cursor.close() async def commit(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" if self._is_root: await self._connection._connection.commit() else: cursor = await self._connection._connection.cursor() try: await cursor.execute(f"RELEASE SAVEPOINT {self._savepoint_name}") finally: await cursor.close() async def rollback(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" if self._is_root: await self._connection._connection.rollback() else: cursor = await self._connection._connection.cursor() try: await cursor.execute(f"ROLLBACK TO SAVEPOINT {self._savepoint_name}") finally: await cursor.close() databases-0.5.5/databases/backends/postgres.py0000644000175100001710000002743214172351434022204 0ustar runnerdocker00000000000000import logging import typing from collections.abc import Mapping import asyncpg from sqlalchemy.dialects.postgresql import pypostgresql from sqlalchemy.engine.interfaces import Dialect from sqlalchemy.sql import ClauseElement from sqlalchemy.sql.ddl import DDLElement from sqlalchemy.sql.schema import Column from sqlalchemy.types import TypeEngine from databases.core import LOG_EXTRA, DatabaseURL from databases.interfaces import ConnectionBackend, DatabaseBackend, TransactionBackend logger = logging.getLogger("databases") class PostgresBackend(DatabaseBackend): def __init__( self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any ) -> None: self._database_url = DatabaseURL(database_url) self._options = options self._dialect = self._get_dialect() self._pool = None def _get_dialect(self) -> Dialect: dialect = pypostgresql.dialect(paramstyle="pyformat") dialect.implicit_returning = True dialect.supports_native_enum = True dialect.supports_smallserial = True # 9.2+ dialect._backslash_escapes = False dialect.supports_sane_multi_rowcount = True # psycopg 2.0.9+ dialect._has_native_hstore = True dialect.supports_native_decimal = True return dialect def _get_connection_kwargs(self) -> dict: url_options = self._database_url.options kwargs = {} # type: typing.Dict[str, typing.Any] min_size = url_options.get("min_size") max_size = url_options.get("max_size") ssl = url_options.get("ssl") if min_size is not None: kwargs["min_size"] = int(min_size) if max_size is not None: kwargs["max_size"] = int(max_size) if ssl is not None: kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()] kwargs.update(self._options) return kwargs async def connect(self) -> None: assert self._pool is None, "DatabaseBackend is already running" kwargs = dict( host=self._database_url.hostname, port=self._database_url.port, user=self._database_url.username, password=self._database_url.password, database=self._database_url.database, ) kwargs.update(self._get_connection_kwargs()) self._pool = await asyncpg.create_pool(**kwargs) async def disconnect(self) -> None: assert self._pool is not None, "DatabaseBackend is not running" await self._pool.close() self._pool = None def connection(self) -> "PostgresConnection": return PostgresConnection(self, self._dialect) class Record(Mapping): __slots__ = ( "_row", "_result_columns", "_dialect", "_column_map", "_column_map_int", "_column_map_full", ) def __init__( self, row: asyncpg.Record, result_columns: tuple, dialect: Dialect, column_maps: typing.Tuple[ typing.Mapping[typing.Any, typing.Tuple[int, TypeEngine]], typing.Mapping[int, typing.Tuple[int, TypeEngine]], typing.Mapping[str, typing.Tuple[int, TypeEngine]], ], ) -> None: self._row = row self._result_columns = result_columns self._dialect = dialect self._column_map, self._column_map_int, self._column_map_full = column_maps @property def _mapping(self) -> asyncpg.Record: return self._row def keys(self) -> typing.KeysView: import warnings warnings.warn( "The `Row.keys()` method is deprecated to mimic SQLAlchemy behaviour, " "use `Row._mapping.keys()` instead.", DeprecationWarning, ) return self._mapping.keys() def values(self) -> typing.ValuesView: import warnings warnings.warn( "The `Row.values()` method is deprecated to mimic SQLAlchemy behaviour, " "use `Row._mapping.values()` instead.", DeprecationWarning, ) return self._mapping.values() def __getitem__(self, key: typing.Any) -> typing.Any: if len(self._column_map) == 0: # raw query return self._row[key] elif isinstance(key, Column): idx, datatype = self._column_map_full[str(key)] elif isinstance(key, int): idx, datatype = self._column_map_int[key] else: idx, datatype = self._column_map[key] raw = self._row[idx] processor = datatype._cached_result_processor(self._dialect, None) if processor is not None: return processor(raw) return raw def __iter__(self) -> typing.Iterator: return iter(self._row.keys()) def __len__(self) -> int: return len(self._row) def __getattr__(self, name: str) -> typing.Any: return self._mapping.get(name) class PostgresConnection(ConnectionBackend): def __init__(self, database: PostgresBackend, dialect: Dialect): self._database = database self._dialect = dialect self._connection = None # type: typing.Optional[asyncpg.connection.Connection] async def acquire(self) -> None: assert self._connection is None, "Connection is already acquired" assert self._database._pool is not None, "DatabaseBackend is not running" self._connection = await self._database._pool.acquire() async def release(self) -> None: assert self._connection is not None, "Connection is not acquired" assert self._database._pool is not None, "DatabaseBackend is not running" self._connection = await self._database._pool.release(self._connection) self._connection = None async def fetch_all(self, query: ClauseElement) -> typing.List[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, result_columns = self._compile(query) rows = await self._connection.fetch(query_str, *args) dialect = self._dialect column_maps = self._create_column_maps(result_columns) return [Record(row, result_columns, dialect, column_maps) for row in rows] async def fetch_one(self, query: ClauseElement) -> typing.Optional[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, result_columns = self._compile(query) row = await self._connection.fetchrow(query_str, *args) if row is None: return None return Record( row, result_columns, self._dialect, self._create_column_maps(result_columns), ) async def fetch_val( self, query: ClauseElement, column: typing.Any = 0 ) -> typing.Any: # we are not calling self._connection.fetchval here because # it does not convert all the types, e.g. JSON stays string # instead of an object # see also: # https://github.com/encode/databases/pull/131 # https://github.com/encode/databases/pull/132 # https://github.com/encode/databases/pull/246 row = await self.fetch_one(query) if row is None: return None return row[column] async def execute(self, query: ClauseElement) -> typing.Any: assert self._connection is not None, "Connection is not acquired" query_str, args, result_columns = self._compile(query) return await self._connection.fetchval(query_str, *args) async def execute_many(self, queries: typing.List[ClauseElement]) -> None: assert self._connection is not None, "Connection is not acquired" # asyncpg uses prepared statements under the hood, so we just # loop through multiple executes here, which should all end up # using the same prepared statement. for single_query in queries: single_query, args, result_columns = self._compile(single_query) await self._connection.execute(single_query, *args) async def iterate( self, query: ClauseElement ) -> typing.AsyncGenerator[typing.Any, None]: assert self._connection is not None, "Connection is not acquired" query_str, args, result_columns = self._compile(query) column_maps = self._create_column_maps(result_columns) async for row in self._connection.cursor(query_str, *args): yield Record(row, result_columns, self._dialect, column_maps) def transaction(self) -> TransactionBackend: return PostgresTransaction(connection=self) def _compile(self, query: ClauseElement) -> typing.Tuple[str, list, tuple]: compiled = query.compile( dialect=self._dialect, compile_kwargs={"render_postcompile": True} ) if not isinstance(query, DDLElement): compiled_params = sorted(compiled.params.items()) mapping = { key: "$" + str(i) for i, (key, _) in enumerate(compiled_params, start=1) } compiled_query = compiled.string % mapping processors = compiled._bind_processors args = [ processors[key](val) if key in processors else val for key, val in compiled_params ] result_map = compiled._result_columns else: compiled_query = compiled.string args = [] result_map = None query_message = compiled_query.replace(" \n", " ").replace("\n", " ") logger.debug( "Query: %s Args: %s", query_message, repr(tuple(args)), extra=LOG_EXTRA ) return compiled_query, args, result_map @staticmethod def _create_column_maps( result_columns: tuple, ) -> typing.Tuple[ typing.Mapping[typing.Any, typing.Tuple[int, TypeEngine]], typing.Mapping[int, typing.Tuple[int, TypeEngine]], typing.Mapping[str, typing.Tuple[int, TypeEngine]], ]: """ Generate column -> datatype mappings from the column definitions. These mappings are used throughout PostgresConnection methods to initialize Record-s. The underlying DB driver does not do type conversion for us so we have wrap the returned asyncpg.Record-s. :return: Three mappings from different ways to address a column to \ corresponding column indexes and datatypes: \ 1. by column identifier; \ 2. by column index; \ 3. by column name in Column sqlalchemy objects. """ column_map, column_map_int, column_map_full = {}, {}, {} for idx, (column_name, _, column, datatype) in enumerate(result_columns): column_map[column_name] = (idx, datatype) column_map_int[idx] = (idx, datatype) column_map_full[str(column[0])] = (idx, datatype) return column_map, column_map_int, column_map_full @property def raw_connection(self) -> asyncpg.connection.Connection: assert self._connection is not None, "Connection is not acquired" return self._connection class PostgresTransaction(TransactionBackend): def __init__(self, connection: PostgresConnection): self._connection = connection self._transaction = ( None ) # type: typing.Optional[asyncpg.transaction.Transaction] async def start( self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any] ) -> None: assert self._connection._connection is not None, "Connection is not acquired" self._transaction = self._connection._connection.transaction(**extra_options) await self._transaction.start() async def commit(self) -> None: assert self._transaction is not None await self._transaction.commit() async def rollback(self) -> None: assert self._transaction is not None await self._transaction.rollback() databases-0.5.5/databases/backends/sqlite.py0000644000175100001710000002156514172351434021640 0ustar runnerdocker00000000000000import logging import typing import uuid import aiosqlite from sqlalchemy.dialects.sqlite import pysqlite from sqlalchemy.engine.cursor import CursorResultMetaData from sqlalchemy.engine.interfaces import Dialect, ExecutionContext from sqlalchemy.engine.row import Row from sqlalchemy.sql import ClauseElement from sqlalchemy.sql.ddl import DDLElement from databases.core import LOG_EXTRA, DatabaseURL from databases.interfaces import ConnectionBackend, DatabaseBackend, TransactionBackend logger = logging.getLogger("databases") class SQLiteBackend(DatabaseBackend): def __init__( self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any ) -> None: self._database_url = DatabaseURL(database_url) self._options = options self._dialect = pysqlite.dialect(paramstyle="qmark") # aiosqlite does not support decimals self._dialect.supports_native_decimal = False self._pool = SQLitePool(self._database_url, **self._options) async def connect(self) -> None: pass # assert self._pool is None, "DatabaseBackend is already running" # self._pool = await aiomysql.create_pool( # host=self._database_url.hostname, # port=self._database_url.port or 3306, # user=self._database_url.username or getpass.getuser(), # password=self._database_url.password, # db=self._database_url.database, # autocommit=True, # ) async def disconnect(self) -> None: pass # assert self._pool is not None, "DatabaseBackend is not running" # self._pool.close() # await self._pool.wait_closed() # self._pool = None def connection(self) -> "SQLiteConnection": return SQLiteConnection(self._pool, self._dialect) class SQLitePool: def __init__(self, url: DatabaseURL, **options: typing.Any) -> None: self._url = url self._options = options async def acquire(self) -> aiosqlite.Connection: connection = aiosqlite.connect( database=self._url.database, isolation_level=None, **self._options ) await connection.__aenter__() return connection async def release(self, connection: aiosqlite.Connection) -> None: await connection.__aexit__(None, None, None) class CompilationContext: def __init__(self, context: ExecutionContext): self.context = context class SQLiteConnection(ConnectionBackend): def __init__(self, pool: SQLitePool, dialect: Dialect): self._pool = pool self._dialect = dialect self._connection = None # type: typing.Optional[aiosqlite.Connection] async def acquire(self) -> None: assert self._connection is None, "Connection is already acquired" self._connection = await self._pool.acquire() async def release(self) -> None: assert self._connection is not None, "Connection is not acquired" await self._pool.release(self._connection) self._connection = None async def fetch_all(self, query: ClauseElement) -> typing.List[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.execute(query_str, args) as cursor: rows = await cursor.fetchall() metadata = CursorResultMetaData(context, cursor.description) return [ Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) for row in rows ] async def fetch_one(self, query: ClauseElement) -> typing.Optional[typing.Mapping]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.execute(query_str, args) as cursor: row = await cursor.fetchone() if row is None: return None metadata = CursorResultMetaData(context, cursor.description) return Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) async def execute(self, query: ClauseElement) -> typing.Any: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.cursor() as cursor: await cursor.execute(query_str, args) if cursor.lastrowid == 0: return cursor.rowcount return cursor.lastrowid async def execute_many(self, queries: typing.List[ClauseElement]) -> None: assert self._connection is not None, "Connection is not acquired" for single_query in queries: await self.execute(single_query) async def iterate( self, query: ClauseElement ) -> typing.AsyncGenerator[typing.Any, None]: assert self._connection is not None, "Connection is not acquired" query_str, args, context = self._compile(query) async with self._connection.execute(query_str, args) as cursor: metadata = CursorResultMetaData(context, cursor.description) async for row in cursor: yield Row( metadata, metadata._processors, metadata._keymap, Row._default_key_style, row, ) def transaction(self) -> TransactionBackend: return SQLiteTransaction(self) def _compile( self, query: ClauseElement ) -> typing.Tuple[str, list, CompilationContext]: compiled = query.compile( dialect=self._dialect, compile_kwargs={"render_postcompile": True} ) execution_context = self._dialect.execution_ctx_cls() execution_context.dialect = self._dialect args = [] if not isinstance(query, DDLElement): params = compiled.construct_params() for key in compiled.positiontup: raw_val = params[key] if key in compiled._bind_processors: val = compiled._bind_processors[key](raw_val) else: val = raw_val args.append(val) execution_context.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._loose_column_name_matching, ) query_message = compiled.string.replace(" \n", " ").replace("\n", " ") logger.debug( "Query: %s Args: %s", query_message, repr(tuple(args)), extra=LOG_EXTRA ) return compiled.string, args, CompilationContext(execution_context) @property def raw_connection(self) -> aiosqlite.core.Connection: assert self._connection is not None, "Connection is not acquired" return self._connection class SQLiteTransaction(TransactionBackend): def __init__(self, connection: SQLiteConnection): self._connection = connection self._is_root = False self._savepoint_name = "" async def start( self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any] ) -> None: assert self._connection._connection is not None, "Connection is not acquired" self._is_root = is_root if self._is_root: async with self._connection._connection.execute("BEGIN") as cursor: await cursor.close() else: id = str(uuid.uuid4()).replace("-", "_") self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}" async with self._connection._connection.execute( f"SAVEPOINT {self._savepoint_name}" ) as cursor: await cursor.close() async def commit(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" if self._is_root: async with self._connection._connection.execute("COMMIT") as cursor: await cursor.close() else: async with self._connection._connection.execute( f"RELEASE SAVEPOINT {self._savepoint_name}" ) as cursor: await cursor.close() async def rollback(self) -> None: assert self._connection._connection is not None, "Connection is not acquired" if self._is_root: async with self._connection._connection.execute("ROLLBACK") as cursor: await cursor.close() else: async with self._connection._connection.execute( f"ROLLBACK TO SAVEPOINT {self._savepoint_name}" ) as cursor: await cursor.close() databases-0.5.5/databases/core.py0000644000175100001710000004376714172351434017525 0ustar runnerdocker00000000000000import asyncio import contextlib import functools import logging import sys import typing from types import TracebackType from urllib.parse import SplitResult, parse_qsl, unquote, urlsplit from sqlalchemy import text from sqlalchemy.sql import ClauseElement from databases.importer import import_from_string from databases.interfaces import ConnectionBackend, DatabaseBackend, TransactionBackend if sys.version_info >= (3, 7): # pragma: no cover import contextvars as contextvars else: # pragma: no cover import aiocontextvars as contextvars try: # pragma: no cover import click # Extra log info for optional coloured terminal outputs. LOG_EXTRA = { "color_message": "Query: " + click.style("%s", bold=True) + " Args: %s" } CONNECT_EXTRA = { "color_message": "Connected to database " + click.style("%s", bold=True) } DISCONNECT_EXTRA = { "color_message": "Disconnected from database " + click.style("%s", bold=True) } except ImportError: # pragma: no cover LOG_EXTRA = {} CONNECT_EXTRA = {} DISCONNECT_EXTRA = {} logger = logging.getLogger("databases") class Database: SUPPORTED_BACKENDS = { "postgresql": "databases.backends.postgres:PostgresBackend", "postgresql+aiopg": "databases.backends.aiopg:AiopgBackend", "postgres": "databases.backends.postgres:PostgresBackend", "mysql": "databases.backends.mysql:MySQLBackend", "mysql+asyncmy": "databases.backends.asyncmy:AsyncMyBackend", "sqlite": "databases.backends.sqlite:SQLiteBackend", } def __init__( self, url: typing.Union[str, "DatabaseURL"], *, force_rollback: bool = False, **options: typing.Any, ): self.url = DatabaseURL(url) self.options = options self.is_connected = False self._force_rollback = force_rollback backend_str = self._get_backend() backend_cls = import_from_string(backend_str) assert issubclass(backend_cls, DatabaseBackend) self._backend = backend_cls(self.url, **self.options) # Connections are stored as task-local state. self._connection_context = contextvars.ContextVar( "connection_context" ) # type: contextvars.ContextVar # When `force_rollback=True` is used, we use a single global # connection, within a transaction that always rolls back. self._global_connection = None # type: typing.Optional[Connection] self._global_transaction = None # type: typing.Optional[Transaction] async def connect(self) -> None: """ Establish the connection pool. """ if self.is_connected: logger.debug("Already connected, skipping connection") return None await self._backend.connect() logger.info( "Connected to database %s", self.url.obscure_password, extra=CONNECT_EXTRA ) self.is_connected = True if self._force_rollback: assert self._global_connection is None assert self._global_transaction is None self._global_connection = Connection(self._backend) self._global_transaction = self._global_connection.transaction( force_rollback=True ) await self._global_transaction.__aenter__() async def disconnect(self) -> None: """ Close all connections in the connection pool. """ if not self.is_connected: logger.debug("Already disconnected, skipping disconnection") return None if self._force_rollback: assert self._global_connection is not None assert self._global_transaction is not None await self._global_transaction.__aexit__() self._global_transaction = None self._global_connection = None else: self._connection_context = contextvars.ContextVar("connection_context") await self._backend.disconnect() logger.info( "Disconnected from database %s", self.url.obscure_password, extra=DISCONNECT_EXTRA, ) self.is_connected = False async def __aenter__(self) -> "Database": await self.connect() return self async def __aexit__( self, exc_type: typing.Type[BaseException] = None, exc_value: BaseException = None, traceback: TracebackType = None, ) -> None: await self.disconnect() async def fetch_all( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.List[typing.Mapping]: async with self.connection() as connection: return await connection.fetch_all(query, values) async def fetch_one( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.Optional[typing.Mapping]: async with self.connection() as connection: return await connection.fetch_one(query, values) async def fetch_val( self, query: typing.Union[ClauseElement, str], values: dict = None, column: typing.Any = 0, ) -> typing.Any: async with self.connection() as connection: return await connection.fetch_val(query, values, column=column) async def execute( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.Any: async with self.connection() as connection: return await connection.execute(query, values) async def execute_many( self, query: typing.Union[ClauseElement, str], values: list ) -> None: async with self.connection() as connection: return await connection.execute_many(query, values) async def iterate( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.AsyncGenerator[typing.Mapping, None]: async with self.connection() as connection: async for record in connection.iterate(query, values): yield record def _new_connection(self) -> "Connection": connection = Connection(self._backend) self._connection_context.set(connection) return connection def connection(self) -> "Connection": if self._global_connection is not None: return self._global_connection try: return self._connection_context.get() except LookupError: return self._new_connection() def transaction( self, *, force_rollback: bool = False, **kwargs: typing.Any ) -> "Transaction": try: connection = self._connection_context.get() is_root = not connection._transaction_stack if is_root: newcontext = contextvars.copy_context() get_conn = lambda: newcontext.run(self._new_connection) else: get_conn = self.connection except LookupError: get_conn = self.connection return Transaction(get_conn, force_rollback=force_rollback, **kwargs) @contextlib.contextmanager def force_rollback(self) -> typing.Iterator[None]: initial = self._force_rollback self._force_rollback = True try: yield finally: self._force_rollback = initial def _get_backend(self) -> str: return self.SUPPORTED_BACKENDS.get( self.url.scheme, self.SUPPORTED_BACKENDS[self.url.dialect] ) class Connection: def __init__(self, backend: DatabaseBackend) -> None: self._backend = backend self._connection_lock = asyncio.Lock() self._connection = self._backend.connection() self._connection_counter = 0 self._transaction_lock = asyncio.Lock() self._transaction_stack = [] # type: typing.List[Transaction] self._query_lock = asyncio.Lock() async def __aenter__(self) -> "Connection": async with self._connection_lock: self._connection_counter += 1 try: if self._connection_counter == 1: await self._connection.acquire() except Exception as e: self._connection_counter -= 1 raise e return self async def __aexit__( self, exc_type: typing.Type[BaseException] = None, exc_value: BaseException = None, traceback: TracebackType = None, ) -> None: async with self._connection_lock: assert self._connection is not None self._connection_counter -= 1 if self._connection_counter == 0: await self._connection.release() async def fetch_all( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.List[typing.Mapping]: built_query = self._build_query(query, values) async with self._query_lock: return await self._connection.fetch_all(built_query) async def fetch_one( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.Optional[typing.Mapping]: built_query = self._build_query(query, values) async with self._query_lock: return await self._connection.fetch_one(built_query) async def fetch_val( self, query: typing.Union[ClauseElement, str], values: dict = None, column: typing.Any = 0, ) -> typing.Any: built_query = self._build_query(query, values) async with self._query_lock: return await self._connection.fetch_val(built_query, column) async def execute( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.Any: built_query = self._build_query(query, values) async with self._query_lock: return await self._connection.execute(built_query) async def execute_many( self, query: typing.Union[ClauseElement, str], values: list ) -> None: queries = [self._build_query(query, values_set) for values_set in values] async with self._query_lock: await self._connection.execute_many(queries) async def iterate( self, query: typing.Union[ClauseElement, str], values: dict = None ) -> typing.AsyncGenerator[typing.Any, None]: built_query = self._build_query(query, values) async with self.transaction(): async with self._query_lock: async for record in self._connection.iterate(built_query): yield record def transaction( self, *, force_rollback: bool = False, **kwargs: typing.Any ) -> "Transaction": def connection_callable() -> Connection: return self return Transaction(connection_callable, force_rollback, **kwargs) @property def raw_connection(self) -> typing.Any: return self._connection.raw_connection @staticmethod def _build_query( query: typing.Union[ClauseElement, str], values: dict = None ) -> ClauseElement: if isinstance(query, str): query = text(query) return query.bindparams(**values) if values is not None else query elif values: return query.values(**values) return query class Transaction: def __init__( self, connection_callable: typing.Callable[[], Connection], force_rollback: bool, **kwargs: typing.Any, ) -> None: self._connection_callable = connection_callable self._force_rollback = force_rollback self._extra_options = kwargs async def __aenter__(self) -> "Transaction": """ Called when entering `async with database.transaction()` """ await self.start() return self async def __aexit__( self, exc_type: typing.Type[BaseException] = None, exc_value: BaseException = None, traceback: TracebackType = None, ) -> None: """ Called when exiting `async with database.transaction()` """ if exc_type is not None or self._force_rollback: await self.rollback() else: await self.commit() def __await__(self) -> typing.Generator: """ Called if using the low-level `transaction = await database.transaction()` """ return self.start().__await__() def __call__(self, func: typing.Callable) -> typing.Callable: """ Called if using `@database.transaction()` as a decorator. """ @functools.wraps(func) async def wrapper(*args: typing.Any, **kwargs: typing.Any) -> typing.Any: async with self: return await func(*args, **kwargs) return wrapper async def start(self) -> "Transaction": self._connection = self._connection_callable() self._transaction = self._connection._connection.transaction() async with self._connection._transaction_lock: is_root = not self._connection._transaction_stack await self._connection.__aenter__() await self._transaction.start( is_root=is_root, extra_options=self._extra_options ) self._connection._transaction_stack.append(self) return self async def commit(self) -> None: async with self._connection._transaction_lock: assert self._connection._transaction_stack[-1] is self self._connection._transaction_stack.pop() await self._transaction.commit() await self._connection.__aexit__() async def rollback(self) -> None: async with self._connection._transaction_lock: assert self._connection._transaction_stack[-1] is self self._connection._transaction_stack.pop() await self._transaction.rollback() await self._connection.__aexit__() class _EmptyNetloc(str): def __bool__(self) -> bool: return True class DatabaseURL: def __init__(self, url: typing.Union[str, "DatabaseURL"]): if isinstance(url, DatabaseURL): self._url: str = url._url elif isinstance(url, str): self._url = url else: raise TypeError( f"Invalid type for DatabaseURL. Expected str or DatabaseURL, got {type(url)}" ) @property def components(self) -> SplitResult: if not hasattr(self, "_components"): self._components = urlsplit(self._url) return self._components @property def scheme(self) -> str: return self.components.scheme @property def dialect(self) -> str: return self.components.scheme.split("+")[0] @property def driver(self) -> str: if "+" not in self.components.scheme: return "" return self.components.scheme.split("+", 1)[1] @property def userinfo(self) -> typing.Optional[bytes]: if self.components.username: info = self.components.username if self.components.password: info += ":" + self.components.password return info.encode("utf-8") return None @property def username(self) -> typing.Optional[str]: if self.components.username is None: return None return unquote(self.components.username) @property def password(self) -> typing.Optional[str]: if self.components.password is None: return None return unquote(self.components.password) @property def hostname(self) -> typing.Optional[str]: return ( self.components.hostname or self.options.get("host") or self.options.get("unix_sock") ) @property def port(self) -> typing.Optional[int]: return self.components.port @property def netloc(self) -> typing.Optional[str]: return self.components.netloc @property def database(self) -> str: path = self.components.path if path.startswith("/"): path = path[1:] return unquote(path) @property def options(self) -> dict: if not hasattr(self, "_options"): self._options = dict(parse_qsl(self.components.query)) return self._options def replace(self, **kwargs: typing.Any) -> "DatabaseURL": if ( "username" in kwargs or "password" in kwargs or "hostname" in kwargs or "port" in kwargs ): hostname = kwargs.pop("hostname", self.hostname) port = kwargs.pop("port", self.port) username = kwargs.pop("username", self.components.username) password = kwargs.pop("password", self.components.password) netloc = hostname if port is not None: netloc += f":{port}" if username is not None: userpass = username if password is not None: userpass += f":{password}" netloc = f"{userpass}@{netloc}" kwargs["netloc"] = netloc if "database" in kwargs: kwargs["path"] = "/" + kwargs.pop("database") if "dialect" in kwargs or "driver" in kwargs: dialect = kwargs.pop("dialect", self.dialect) driver = kwargs.pop("driver", self.driver) kwargs["scheme"] = f"{dialect}+{driver}" if driver else dialect if not kwargs.get("netloc", self.netloc): # Using an empty string that evaluates as True means we end up # with URLs like `sqlite:///database` instead of `sqlite:/database` kwargs["netloc"] = _EmptyNetloc() components = self.components._replace(**kwargs) return self.__class__(components.geturl()) @property def obscure_password(self) -> str: if self.password: return self.replace(password="********")._url return self._url def __str__(self) -> str: return self._url def __repr__(self) -> str: return f"{self.__class__.__name__}({repr(self.obscure_password)})" def __eq__(self, other: typing.Any) -> bool: return str(self) == str(other) databases-0.5.5/databases/importer.py0000644000175100001710000000212014172351434020410 0ustar runnerdocker00000000000000import importlib import typing class ImportFromStringError(Exception): pass def import_from_string(import_str: str) -> typing.Any: module_str, _, attrs_str = import_str.partition(":") if not module_str or not attrs_str: message = ( 'Import string "{import_str}" must be in format ":".' ) raise ImportFromStringError(message.format(import_str=import_str)) try: module = importlib.import_module(module_str) except ImportError as exc: if exc.name != module_str: raise exc from None message = 'Could not import module "{module_str}".' raise ImportFromStringError(message.format(module_str=module_str)) instance = module try: for attr_str in attrs_str.split("."): instance = getattr(instance, attr_str) except AttributeError as exc: message = 'Attribute "{attrs_str}" not found in module "{module_str}".' raise ImportFromStringError( message.format(attrs_str=attrs_str, module_str=module_str) ) return instance databases-0.5.5/databases/interfaces.py0000644000175100001710000000436014172351434020702 0ustar runnerdocker00000000000000import typing from sqlalchemy.sql import ClauseElement class DatabaseBackend: async def connect(self) -> None: raise NotImplementedError() # pragma: no cover async def disconnect(self) -> None: raise NotImplementedError() # pragma: no cover def connection(self) -> "ConnectionBackend": raise NotImplementedError() # pragma: no cover class ConnectionBackend: async def acquire(self) -> None: raise NotImplementedError() # pragma: no cover async def release(self) -> None: raise NotImplementedError() # pragma: no cover async def fetch_all(self, query: ClauseElement) -> typing.List[typing.Mapping]: raise NotImplementedError() # pragma: no cover async def fetch_one(self, query: ClauseElement) -> typing.Optional[typing.Mapping]: raise NotImplementedError() # pragma: no cover async def fetch_val( self, query: ClauseElement, column: typing.Any = 0 ) -> typing.Any: row = await self.fetch_one(query) return None if row is None else row[column] async def execute(self, query: ClauseElement) -> typing.Any: raise NotImplementedError() # pragma: no cover async def execute_many(self, queries: typing.List[ClauseElement]) -> None: raise NotImplementedError() # pragma: no cover async def iterate( self, query: ClauseElement ) -> typing.AsyncGenerator[typing.Mapping, None]: raise NotImplementedError() # pragma: no cover # mypy needs async iterators to contain a `yield` # https://github.com/python/mypy/issues/5385#issuecomment-407281656 yield True # pragma: no cover def transaction(self) -> "TransactionBackend": raise NotImplementedError() # pragma: no cover @property def raw_connection(self) -> typing.Any: raise NotImplementedError() # pragma: no cover class TransactionBackend: async def start( self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any] ) -> None: raise NotImplementedError() # pragma: no cover async def commit(self) -> None: raise NotImplementedError() # pragma: no cover async def rollback(self) -> None: raise NotImplementedError() # pragma: no cover databases-0.5.5/databases/py.typed0000644000175100001710000000000114172351434017670 0ustar runnerdocker00000000000000 databases-0.5.5/databases.egg-info/0000755000175100001710000000000014172351472017676 5ustar runnerdocker00000000000000databases-0.5.5/databases.egg-info/PKG-INFO0000644000175100001710000001337114172351472021000 0ustar runnerdocker00000000000000Metadata-Version: 2.1 Name: databases Version: 0.5.5 Summary: Async database support for Python. Home-page: https://github.com/encode/databases Author: Tom Christie Author-email: tom@tomchristie.com License: BSD Description: # Databases

Test Suite Package version

Databases gives you simple asyncio support for a range of databases. It allows you to make queries using the powerful [SQLAlchemy Core][sqlalchemy-core] expression language, and provides support for PostgreSQL, MySQL, and SQLite. Databases is suitable for integrating against any async Web framework, such as [Starlette][starlette], [Sanic][sanic], [Responder][responder], [Quart][quart], [aiohttp][aiohttp], [Tornado][tornado], or [FastAPI][fastapi]. **Documentation**: [https://www.encode.io/databases/](https://www.encode.io/databases/) **Requirements**: Python 3.6+ --- ## Installation ```shell $ pip install databases ``` You can install the required database drivers with: ```shell $ pip install databases[postgresql] $ pip install databases[mysql] $ pip install databases[sqlite] ``` Default driver support is provided using one of [asyncpg][asyncpg], [aiomysql][aiomysql], or [aiosqlite][aiosqlite]. You can also use other database drivers supported by `databases`: ```shel $ pip install databases[postgresql+aiopg] $ pip install databases[mysql+asyncmy] ``` Note that if you are using any synchronous SQLAlchemy functions such as `engine.create_all()` or [alembic][alembic] migrations then you still have to install a synchronous DB driver: [psycopg2][psycopg2] for PostgreSQL and [pymysql][pymysql] for MySQL. --- ## Quickstart For this example we'll create a very simple SQLite database to run some queries against. ```shell $ pip install databases[sqlite] $ pip install ipython ``` We can now run a simple example from the console. Note that we want to use `ipython` here, because it supports using `await` expressions directly from the console. ```python # Create a database instance, and connect to it. from databases import Database database = Database('sqlite:///example.db') await database.connect() # Create a table. query = """CREATE TABLE HighScores (id INTEGER PRIMARY KEY, name VARCHAR(100), score INTEGER)""" await database.execute(query=query) # Insert some data. query = "INSERT INTO HighScores(name, score) VALUES (:name, :score)" values = [ {"name": "Daisy", "score": 92}, {"name": "Neil", "score": 87}, {"name": "Carol", "score": 43}, ] await database.execute_many(query=query, values=values) # Run a database query. query = "SELECT * FROM HighScores" rows = await database.fetch_all(query=query) print('High Scores:', rows) ``` Check out the documentation on [making database queries](https://www.encode.io/databases/database_queries/) for examples of how to start using databases together with SQLAlchemy core expressions.

— ⭐️ —

Databases is BSD licensed code. Designed & built in Brighton, England.

[sqlalchemy-core]: https://docs.sqlalchemy.org/en/latest/core/ [sqlalchemy-core-tutorial]: https://docs.sqlalchemy.org/en/latest/core/tutorial.html [alembic]: https://alembic.sqlalchemy.org/en/latest/ [psycopg2]: https://www.psycopg.org/ [pymysql]: https://github.com/PyMySQL/PyMySQL [asyncpg]: https://github.com/MagicStack/asyncpg [aiomysql]: https://github.com/aio-libs/aiomysql [aiosqlite]: https://github.com/jreese/aiosqlite [starlette]: https://github.com/encode/starlette [sanic]: https://github.com/huge-success/sanic [responder]: https://github.com/kennethreitz/responder [quart]: https://gitlab.com/pgjones/quart [aiohttp]: https://github.com/aio-libs/aiohttp [tornado]: https://github.com/tornadoweb/tornado [fastapi]: https://github.com/tiangolo/fastapi Platform: UNKNOWN Classifier: Development Status :: 3 - Alpha Classifier: Environment :: Web Environment Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Topic :: Internet :: WWW/HTTP Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3 :: Only Requires-Python: >=3.6 Description-Content-Type: text/markdown Provides-Extra: postgresql Provides-Extra: mysql Provides-Extra: mysql+asyncmy Provides-Extra: sqlite Provides-Extra: postgresql+aiopg databases-0.5.5/databases.egg-info/SOURCES.txt0000644000175100001710000000077214172351472021570 0ustar runnerdocker00000000000000README.md setup.cfg setup.py databases/__init__.py databases/core.py databases/importer.py databases/interfaces.py databases/py.typed databases.egg-info/PKG-INFO databases.egg-info/SOURCES.txt databases.egg-info/dependency_links.txt databases.egg-info/not-zip-safe databases.egg-info/requires.txt databases.egg-info/top_level.txt databases/backends/__init__.py databases/backends/aiopg.py databases/backends/asyncmy.py databases/backends/mysql.py databases/backends/postgres.py databases/backends/sqlite.pydatabases-0.5.5/databases.egg-info/dependency_links.txt0000644000175100001710000000000114172351472023744 0ustar runnerdocker00000000000000 databases-0.5.5/databases.egg-info/not-zip-safe0000644000175100001710000000000114172351465022126 0ustar runnerdocker00000000000000 databases-0.5.5/databases.egg-info/requires.txt0000644000175100001710000000025614172351472022301 0ustar runnerdocker00000000000000sqlalchemy<1.5,>=1.4 [:python_version < "3.7"] aiocontextvars [mysql] aiomysql [mysql+asyncmy] asyncmy [postgresql] asyncpg [postgresql+aiopg] aiopg [sqlite] aiosqlite databases-0.5.5/databases.egg-info/top_level.txt0000644000175100001710000000003514172351472022426 0ustar runnerdocker00000000000000databases databases/backends databases-0.5.5/setup.cfg0000644000175100001710000000031314172351472016073 0ustar runnerdocker00000000000000[mypy] disallow_untyped_defs = True ignore_missing_imports = True [tool:isort] profile = black combine_as_imports = True [coverage:run] source = databases, tests [egg_info] tag_build = tag_date = 0 databases-0.5.5/setup.py0000644000175100001710000000415714172351434015774 0ustar runnerdocker00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- import os import re from setuptools import setup def get_version(package): """ Return package version as listed in `__version__` in `init.py`. """ with open(os.path.join(package, "__init__.py")) as f: return re.search("__version__ = ['\"]([^'\"]+)['\"]", f.read()).group(1) def get_long_description(): """ Return the README. """ with open("README.md", encoding="utf8") as f: return f.read() def get_packages(package): """ Return root package and all sub-packages. """ return [ dirpath for dirpath, dirnames, filenames in os.walk(package) if os.path.exists(os.path.join(dirpath, "__init__.py")) ] setup( name="databases", version=get_version("databases"), python_requires=">=3.6", url="https://github.com/encode/databases", license="BSD", description="Async database support for Python.", long_description=get_long_description(), long_description_content_type="text/markdown", author="Tom Christie", author_email="tom@tomchristie.com", packages=get_packages("databases"), package_data={"databases": ["py.typed"]}, install_requires=["sqlalchemy>=1.4,<1.5", 'aiocontextvars;python_version<"3.7"'], extras_require={ "postgresql": ["asyncpg"], "mysql": ["aiomysql"], "mysql+asyncmy": ["asyncmy"], "sqlite": ["aiosqlite"], "postgresql+aiopg": ["aiopg"], }, classifiers=[ "Development Status :: 3 - Alpha", "Environment :: Web Environment", "Intended Audience :: Developers", "License :: OSI Approved :: BSD License", "Operating System :: OS Independent", "Topic :: Internet :: WWW/HTTP", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3 :: Only", ], zip_safe=False, )