././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1724451421.742627 django_dbbackup-4.2.1/0000777000000000000000000000000014662205136011552 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320784.0 django_dbbackup-4.2.1/AUTHORS.txt0000666000000000000000000000046714314702420013437 0ustar00Primary Authors: * Mark (Archmonger) * John Hagen (johnthagen) * Michael Shepanski * Anthony Monthe (ZuluPro) * Benjamin Bach (benjaoming) Contributors: * Hannes Hapke * Joe Hu * Marco Braak * Nathan Duthoit * Rich Leland * Toumhi (Bitbucket) * Tobias McNulty * Grant McConnaughey ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/LICENSE.txt0000666000000000000000000000300514314702350013365 0ustar00Copyright (c) 2010, Michael Shepanski All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name django-dbbackup nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/MANIFEST.in0000666000000000000000000000026114662175773013324 0ustar00recursive-include requirements * include requirements.txt include README.rst include LICENSE.txt include dbbackup/VERSION recursive-include dbbackup/tests/testapp/blobs * ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7416265 django_dbbackup-4.2.1/PKG-INFO0000666000000000000000000002234214662205136012652 0ustar00Metadata-Version: 2.1 Name: django-dbbackup Version: 4.2.1 Summary: Management commands to help backup and restore a project database and media. Home-page: https://github.com/jazzband/django-dbbackup Author: Archmonger Author-email: archiethemonger@gmail.com License: BSD Keywords: django,database,media,backup,amazon,s3,dropbox Classifier: Development Status :: 4 - Beta Classifier: Environment :: Web Environment Classifier: Environment :: Console Classifier: Framework :: Django :: 3.2 Classifier: Framework :: Django :: 4.2 Classifier: Framework :: Django :: 5.0 Classifier: Intended Audience :: Developers Classifier: Intended Audience :: System Administrators Classifier: License :: OSI Approved :: BSD License Classifier: Natural Language :: English Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: 3.12 Classifier: Topic :: Database Classifier: Topic :: System :: Archiving Classifier: Topic :: System :: Archiving :: Backup Classifier: Topic :: System :: Archiving :: Compression Requires-Python: >=3.7 Description-Content-Type: text/x-rst License-File: LICENSE.txt License-File: AUTHORS.txt Requires-Dist: django>=3.2 Requires-Dist: pytz Django Database Backup ====================== .. image:: https://github.com/jazzband/django-dbbackup/actions/workflows/build.yml/badge.svg :target: https://github.com/jazzband/django-dbbackup/actions .. image:: https://readthedocs.org/projects/django-dbbackup/badge/?version=stable :target: https://django-dbbackup.readthedocs.io/ :alt: Documentation Status .. image:: https://codecov.io/gh/jazzband/django-dbbackup/branch/master/graph/badge.svg?token=zaYmStcsuX :target: https://codecov.io/gh/jazzband/django-dbbackup .. image:: https://jazzband.co/static/img/badge.svg :target: https://jazzband.co/ :alt: Jazzband This Django application provides management commands to help backup and restore your project database and media files with various storages such as Amazon S3, Dropbox, local file storage or any Django storage. It is made to: - Allow you to secure your backup with GPG signature and encryption - Archive with compression - Deal easily with remote archiving - Keep your development database up to date - Use Crontab or Celery to setup automated backups Docs ==== See our official documentation at `Read The Docs`_. Why use DBBackup ================ This software doesn't reinvent the wheel, in a few words it is a pipe between your Django project and your backup storage. It tries to use the traditional dump & restore mechanisms, apply compression and/or encryption and use the storage system you desire. It gives a simple interface to backup and restore your database or media files. Management Commands =================== dbbackup -------- Backup your database to the specified storage. By default this will backup all databases specified in your settings.py file and will not delete any old backups. You can optionally specify a server name to be included in the backup filename. :: Usage: ./manage.py dbbackup [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -c, --clean Clean up old backup files -d DATABASE, --database=DATABASE Database to backup (default: everything) -s SERVERNAME, --servername=SERVERNAME Specify server name to include in backup filename -z, --compress Compress the backup files -e, --encrypt Encrypt the backup files -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME Specify filename on storage -O OUTPUT_PATH, --output-path=OUTPUT_PATH Specify where to store on local filesystem -x EXCLUDE_TABLES, --exclude-tables=EXCLUDE_TABLES Exclude tables data from backup (-x 'public.table1, public.table2') dbrestore --------- Restore your database from the specified storage. By default this will lookup the latest backup and restore from that. You may optionally specify a servername if you you want to backup a database image that was created from a different server. You may also specify an explicit local file to backup from. :: Usage: ./manage.py dbrestore [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -d DATABASE, --database=DATABASE Database to restore -i INPUT_FILENAME, --input-filename=INPUT_FILENAME Specify filename to backup from -I INPUT_PATH, --input-path=INPUT_PATH Specify path on local filesystem to backup from -s SERVERNAME, --servername=SERVERNAME Use a different servername backup -c, --decrypt Decrypt data before restoring -p PASSPHRASE, --passphrase=PASSPHRASE Passphrase for decrypt file -z, --uncompress Uncompress gzip data before restoring mediabackup ----------- Backup media files by get them one by one, include in a TAR file. :: Usage: ./manage.py mediabackup [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -c, --clean Clean up old backup files -s SERVERNAME, --servername=SERVERNAME Specify server name to include in backup filename -z, --compress Compress the archive -e, --encrypt Encrypt the backup files -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME Specify filename on storage -O OUTPUT_PATH, --output-path=OUTPUT_PATH Specify where to store on local filesystem mediarestore ------------ Restore media files from storage backup to your media storage. :: Usage: ./manage.py mediarestore [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -i INPUT_FILENAME, --input-filename=INPUT_FILENAME Specify filename to backup from -I INPUT_PATH, --input-path=INPUT_PATH Specify path on local filesystem to backup from -e, --decrypt Decrypt data before restoring -p PASSPHRASE, --passphrase=PASSPHRASE Passphrase for decrypt file -z, --uncompress Uncompress gzip data before restoring -r, --replace Replace existing files Tests ===== Tests are stored in `dbbackup.tests` and to run them you must launch: :: python runtests.py In fact, ``runtests.py`` acts as a ``manage.py`` file and all Django commands are available. So you could launch: :: python runtests.py shell to get a Python shell configured with the test project. Also all test command options are available and usable to run only a selection of tests. See `Django test command documentation`_ for more information about it. .. _`Django test command documentation`: https://docs.djangoproject.com/en/stable/topics/testing/overview/#running-tests There are even functional tests: :: ./functional.sh See documentation for details. To run the tests across all supported versions of Django and Python, you can use Tox. Firstly install Tox: :: pip install tox To run the tests just use the command ``tox`` in the command line. If you want to run the tests against just one specific test environment you can run ``tox -e ``. For example, to run the tests with Python3.9 and Django3.2 you would run: :: tox -e py39-django32 The available test environments can be found in ``tox.ini``. Contributing ============ .. image:: https://jazzband.co/static/img/jazzband.svg :target: https://jazzband.co/ :alt: Jazzband This is a `Jazzband `_ project. By contributing you agree to abide by the `Contributor Code of Conduct `_ and follow the `guidelines `_. All contribution are very welcomed, propositions, problems, bugs and enhancement are tracked with `GitHub issues`_ system and patches are submitted via `pull requests`_. We use GitHub Actions as continuous integration tools. .. _`Read The Docs`: https://django-dbbackup.readthedocs.org/ .. _`GitHub issues`: https://github.com/jazzband/django-dbbackup/issues .. _`pull requests`: https://github.com/jazzband/django-dbbackup/pulls .. _Coveralls: https://coveralls.io/github/jazzband/django-dbbackup ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/README.rst0000666000000000000000000001735414662175773013270 0ustar00Django Database Backup ====================== .. image:: https://github.com/jazzband/django-dbbackup/actions/workflows/build.yml/badge.svg :target: https://github.com/jazzband/django-dbbackup/actions .. image:: https://readthedocs.org/projects/django-dbbackup/badge/?version=stable :target: https://django-dbbackup.readthedocs.io/ :alt: Documentation Status .. image:: https://codecov.io/gh/jazzband/django-dbbackup/branch/master/graph/badge.svg?token=zaYmStcsuX :target: https://codecov.io/gh/jazzband/django-dbbackup .. image:: https://jazzband.co/static/img/badge.svg :target: https://jazzband.co/ :alt: Jazzband This Django application provides management commands to help backup and restore your project database and media files with various storages such as Amazon S3, Dropbox, local file storage or any Django storage. It is made to: - Allow you to secure your backup with GPG signature and encryption - Archive with compression - Deal easily with remote archiving - Keep your development database up to date - Use Crontab or Celery to setup automated backups Docs ==== See our official documentation at `Read The Docs`_. Why use DBBackup ================ This software doesn't reinvent the wheel, in a few words it is a pipe between your Django project and your backup storage. It tries to use the traditional dump & restore mechanisms, apply compression and/or encryption and use the storage system you desire. It gives a simple interface to backup and restore your database or media files. Management Commands =================== dbbackup -------- Backup your database to the specified storage. By default this will backup all databases specified in your settings.py file and will not delete any old backups. You can optionally specify a server name to be included in the backup filename. :: Usage: ./manage.py dbbackup [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -c, --clean Clean up old backup files -d DATABASE, --database=DATABASE Database to backup (default: everything) -s SERVERNAME, --servername=SERVERNAME Specify server name to include in backup filename -z, --compress Compress the backup files -e, --encrypt Encrypt the backup files -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME Specify filename on storage -O OUTPUT_PATH, --output-path=OUTPUT_PATH Specify where to store on local filesystem -x EXCLUDE_TABLES, --exclude-tables=EXCLUDE_TABLES Exclude tables data from backup (-x 'public.table1, public.table2') dbrestore --------- Restore your database from the specified storage. By default this will lookup the latest backup and restore from that. You may optionally specify a servername if you you want to backup a database image that was created from a different server. You may also specify an explicit local file to backup from. :: Usage: ./manage.py dbrestore [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -d DATABASE, --database=DATABASE Database to restore -i INPUT_FILENAME, --input-filename=INPUT_FILENAME Specify filename to backup from -I INPUT_PATH, --input-path=INPUT_PATH Specify path on local filesystem to backup from -s SERVERNAME, --servername=SERVERNAME Use a different servername backup -c, --decrypt Decrypt data before restoring -p PASSPHRASE, --passphrase=PASSPHRASE Passphrase for decrypt file -z, --uncompress Uncompress gzip data before restoring mediabackup ----------- Backup media files by get them one by one, include in a TAR file. :: Usage: ./manage.py mediabackup [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -c, --clean Clean up old backup files -s SERVERNAME, --servername=SERVERNAME Specify server name to include in backup filename -z, --compress Compress the archive -e, --encrypt Encrypt the backup files -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME Specify filename on storage -O OUTPUT_PATH, --output-path=OUTPUT_PATH Specify where to store on local filesystem mediarestore ------------ Restore media files from storage backup to your media storage. :: Usage: ./manage.py mediarestore [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -i INPUT_FILENAME, --input-filename=INPUT_FILENAME Specify filename to backup from -I INPUT_PATH, --input-path=INPUT_PATH Specify path on local filesystem to backup from -e, --decrypt Decrypt data before restoring -p PASSPHRASE, --passphrase=PASSPHRASE Passphrase for decrypt file -z, --uncompress Uncompress gzip data before restoring -r, --replace Replace existing files Tests ===== Tests are stored in `dbbackup.tests` and to run them you must launch: :: python runtests.py In fact, ``runtests.py`` acts as a ``manage.py`` file and all Django commands are available. So you could launch: :: python runtests.py shell to get a Python shell configured with the test project. Also all test command options are available and usable to run only a selection of tests. See `Django test command documentation`_ for more information about it. .. _`Django test command documentation`: https://docs.djangoproject.com/en/stable/topics/testing/overview/#running-tests There are even functional tests: :: ./functional.sh See documentation for details. To run the tests across all supported versions of Django and Python, you can use Tox. Firstly install Tox: :: pip install tox To run the tests just use the command ``tox`` in the command line. If you want to run the tests against just one specific test environment you can run ``tox -e ``. For example, to run the tests with Python3.9 and Django3.2 you would run: :: tox -e py39-django32 The available test environments can be found in ``tox.ini``. Contributing ============ .. image:: https://jazzband.co/static/img/jazzband.svg :target: https://jazzband.co/ :alt: Jazzband This is a `Jazzband `_ project. By contributing you agree to abide by the `Contributor Code of Conduct `_ and follow the `guidelines `_. All contribution are very welcomed, propositions, problems, bugs and enhancement are tracked with `GitHub issues`_ system and patches are submitted via `pull requests`_. We use GitHub Actions as continuous integration tools. .. _`Read The Docs`: https://django-dbbackup.readthedocs.org/ .. _`GitHub issues`: https://github.com/jazzband/django-dbbackup/issues .. _`pull requests`: https://github.com/jazzband/django-dbbackup/pulls .. _Coveralls: https://coveralls.io/github/jazzband/django-dbbackup ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.6871126 django_dbbackup-4.2.1/dbbackup/0000777000000000000000000000000014662205136013325 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724450862.0 django_dbbackup-4.2.1/dbbackup/VERSION0000666000000000000000000000000714662204056014372 0ustar004.2.1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/__init__.py0000666000000000000000000000101114662175773015444 0ustar00"""Management commands to help backup and restore a project database and media""" from pathlib import Path import django src_dir = Path(__file__).parent with (src_dir / "VERSION").open() as f: __version__ = f.read().strip() """The full version, including alpha/beta/rc tags.""" VERSION = (x, y, z) = __version__.split(".") VERSION = ".".join(VERSION[:2]) """The X.Y version. Needed for `docs/conf.py`.""" if django.VERSION < (3, 2): default_app_config = "dbbackup.apps.DbbackupConfig" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/apps.py0000666000000000000000000000066614662175773014667 0ustar00"""Apps for DBBackup""" from django.apps import AppConfig from django.utils.translation import gettext_lazy from dbbackup import log class DbbackupConfig(AppConfig): """ Config for DBBackup application. """ name = "dbbackup" label = "dbbackup" verbose_name = gettext_lazy("Backup and restore") default_auto_field = "django.db.models.AutoField" def ready(self): log.load() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/checks.py0000666000000000000000000000620414662175773015156 0ustar00import re from datetime import datetime from django.core.checks import Tags, Warning, register from dbbackup import settings W001 = Warning( "Invalid HOSTNAME parameter", hint="Set a non empty string to this settings.DBBACKUP_HOSTNAME", id="dbbackup.W001", ) W002 = Warning( "Invalid STORAGE parameter", hint="Set a valid path to a storage in settings.DBBACKUP_STORAGE", id="dbbackup.W002", ) W003 = Warning( "Invalid FILENAME_TEMPLATE parameter", hint="Include {datetime} to settings.DBBACKUP_FILENAME_TEMPLATE", id="dbbackup.W003", ) W004 = Warning( "Invalid MEDIA_FILENAME_TEMPLATE parameter", hint="Include {datetime} to settings.DBBACKUP_MEDIA_FILENAME_TEMPLATE", id="dbbackup.W004", ) W005 = Warning( "Invalid DATE_FORMAT parameter", hint="settings.DBBACKUP_DATE_FORMAT can contain only [A-Za-z0-9%_-]", id="dbbackup.W005", ) W006 = Warning( "FAILURE_RECIPIENTS has been deprecated", hint="settings.DBBACKUP_FAILURE_RECIPIENTS is replaced by " "settings.DBBACKUP_ADMINS", id="dbbackup.W006", ) W007 = Warning( "Invalid FILENAME_TEMPLATE parameter", hint="settings.DBBACKUP_FILENAME_TEMPLATE must not contain slashes ('/'). " "Did you mean to change the value for 'location'?", id="dbbackup.W007", ) W008 = Warning( "Invalid MEDIA_FILENAME_TEMPLATE parameter", hint="settings.DBBACKUP_MEDIA_FILENAME_TEMPLATE must not contain slashes ('/')" "Did you mean to change the value for 'location'?", id="dbbackup.W007", ) def check_filename_templates(): return _check_filename_template( settings.FILENAME_TEMPLATE, W007, "db", ) + _check_filename_template( settings.MEDIA_FILENAME_TEMPLATE, W008, "media", ) def _check_filename_template(filename_template, check_code, content_type) -> list: if callable(filename_template): params = { "servername": "localhost", "datetime": datetime.now().strftime(settings.DATE_FORMAT), "databasename": "default", "extension": "dump", "content_type": content_type, } filename_template = filename_template(params) if "/" in filename_template: return [check_code] return [] @register(Tags.compatibility) def check_settings(app_configs, **kwargs): errors = [] if not settings.HOSTNAME: errors.append(W001) if not settings.STORAGE or not isinstance(settings.STORAGE, str): errors.append(W002) if ( not callable(settings.FILENAME_TEMPLATE) and "{datetime}" not in settings.FILENAME_TEMPLATE ): errors.append(W003) if ( not callable(settings.MEDIA_FILENAME_TEMPLATE) and "{datetime}" not in settings.MEDIA_FILENAME_TEMPLATE ): errors.append(W004) if re.search(r"[^A-Za-z0-9%_-]", settings.DATE_FORMAT): errors.append(W005) if getattr(settings, "FAILURE_RECIPIENTS", None) is not None: errors.append(W006) errors += check_filename_templates() return errors ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.6911137 django_dbbackup-4.2.1/dbbackup/db/0000777000000000000000000000000014662205136013712 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/db/__init__.py0000666000000000000000000000000014314702350016003 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/db/base.py0000666000000000000000000001434014662175773015215 0ustar00""" Base database connectors """ import logging import os import shlex from importlib import import_module from subprocess import Popen from tempfile import SpooledTemporaryFile from django.core.files.base import File from dbbackup import settings, utils from . import exceptions logger = logging.getLogger("dbbackup.command") logger.setLevel(logging.DEBUG) CONNECTOR_MAPPING = { "django.db.backends.sqlite3": "dbbackup.db.sqlite.SqliteConnector", "django.db.backends.mysql": "dbbackup.db.mysql.MysqlDumpConnector", "django.db.backends.postgresql": "dbbackup.db.postgresql.PgDumpBinaryConnector", "django.db.backends.postgresql_psycopg2": "dbbackup.db.postgresql.PgDumpBinaryConnector", "django.db.backends.oracle": None, "django_mongodb_engine": "dbbackup.db.mongodb.MongoDumpConnector", "djongo": "dbbackup.db.mongodb.MongoDumpConnector", "django.contrib.gis.db.backends.postgis": "dbbackup.db.postgresql.PgDumpGisConnector", "django.contrib.gis.db.backends.mysql": "dbbackup.db.mysql.MysqlDumpConnector", "django.contrib.gis.db.backends.oracle": None, "django.contrib.gis.db.backends.spatialite": "dbbackup.db.sqlite.SqliteConnector", "django_prometheus.db.backends.postgresql": "dbbackup.db.postgresql.PgDumpBinaryConnector", "django_prometheus.db.backends.sqlite3": "dbbackup.db.sqlite.SqliteConnector", "django_prometheus.db.backends.mysql": "dbbackup.db.mysql.MysqlDumpConnector", "django_prometheus.db.backends.postgis": "dbbackup.db.postgresql.PgDumpGisConnector", } if settings.CUSTOM_CONNECTOR_MAPPING: CONNECTOR_MAPPING.update(settings.CUSTOM_CONNECTOR_MAPPING) def get_connector(database_name=None): """ Get a connector from its database key in settings. """ from django.db import DEFAULT_DB_ALIAS, connections # Get DB database_name = database_name or DEFAULT_DB_ALIAS connection = connections[database_name] engine = connection.settings_dict["ENGINE"] connector_settings = settings.CONNECTORS.get(database_name, {}) connector_path = connector_settings.get("CONNECTOR", CONNECTOR_MAPPING[engine]) connector_module_path = ".".join(connector_path.split(".")[:-1]) module = import_module(connector_module_path) connector_name = connector_path.split(".")[-1] connector = getattr(module, connector_name) return connector(database_name, **connector_settings) class BaseDBConnector: """ Base class for create database connector. This kind of object creates interaction with database and allow backup and restore operations. """ extension = "dump" exclude = [] def __init__(self, database_name=None, **kwargs): from django.db import DEFAULT_DB_ALIAS, connections self.database_name = database_name or DEFAULT_DB_ALIAS self.connection = connections[self.database_name] for attr, value in kwargs.items(): setattr(self, attr.lower(), value) @property def settings(self): """Mix of database and connector settings.""" if not hasattr(self, "_settings"): sett = self.connection.settings_dict.copy() sett.update(settings.CONNECTORS.get(self.database_name, {})) self._settings = sett return self._settings def generate_filename(self, server_name=None): return utils.filename_generate(self.extension, self.database_name, server_name) def create_dump(self): return self._create_dump() def _create_dump(self): """ Override this method to define dump creation. """ raise NotImplementedError("_create_dump not implemented") def restore_dump(self, dump): """ :param dump: Dump file :type dump: file """ return self._restore_dump(dump) def _restore_dump(self, dump): """ Override this method to define dump creation. :param dump: Dump file :type dump: file """ raise NotImplementedError("_restore_dump not implemented") class BaseCommandDBConnector(BaseDBConnector): """ Base class for create database connector based on command line tools. """ dump_prefix = "" dump_suffix = "" restore_prefix = "" restore_suffix = "" use_parent_env = True env = {} dump_env = {} restore_env = {} def run_command(self, command, stdin=None, env=None): """ Launch a shell command line. :param command: Command line to launch :type command: str :param stdin: Standard input of command :type stdin: file :param env: Environment variable used in command :type env: dict :return: Standard output of command :rtype: file """ logger.debug(command) cmd = shlex.split(command) stdout = SpooledTemporaryFile( max_size=settings.TMP_FILE_MAX_SIZE, dir=settings.TMP_DIR ) stderr = SpooledTemporaryFile( max_size=settings.TMP_FILE_MAX_SIZE, dir=settings.TMP_DIR ) full_env = os.environ.copy() if self.use_parent_env else {} full_env.update(self.env) full_env.update(env or {}) try: if isinstance(stdin, File): process = Popen( cmd, stdin=stdin.open("rb"), stdout=stdout, stderr=stderr, env=full_env, ) else: process = Popen( cmd, stdin=stdin, stdout=stdout, stderr=stderr, env=full_env ) process.wait() if process.poll(): stderr.seek(0) raise exceptions.CommandConnectorError( "Error running: {}\n{}".format( command, stderr.read().decode("utf-8") ) ) stdout.seek(0) stderr.seek(0) return stdout, stderr except OSError as err: raise exceptions.CommandConnectorError( f"Error running: {command}\n{str(err)}" ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/db/exceptions.py0000666000000000000000000000050214314702350016434 0ustar00"""Exceptions for database connectors.""" class ConnectorError(Exception): """Base connector error""" class DumpError(ConnectorError): """Error on dump""" class RestoreError(ConnectorError): """Error on restore""" class CommandConnectorError(ConnectorError): """Failing command""" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/db/mongodb.py0000666000000000000000000000406514662175773015733 0ustar00from dbbackup import utils from .base import BaseCommandDBConnector class MongoDumpConnector(BaseCommandDBConnector): """ MongoDB connector, creates dump with ``mongodump`` and restore with ``mongorestore``. """ dump_cmd = "mongodump" restore_cmd = "mongorestore" object_check = True drop = True def _create_dump(self): cmd = f"{self.dump_cmd} --db {self.settings['NAME']}" host = self.settings.get("HOST") or "localhost" port = self.settings.get("PORT") or 27017 cmd += f" --host {host}:{port}" if self.settings.get("USER"): cmd += f" --username {self.settings['USER']}" if self.settings.get("PASSWORD"): cmd += f" --password {utils.get_escaped_command_arg(self.settings['PASSWORD'])}" if self.settings.get("AUTH_SOURCE"): cmd += f" --authenticationDatabase {self.settings['AUTH_SOURCE']}" for collection in self.exclude: cmd += f" --excludeCollection {collection}" cmd += " --archive" cmd = f"{self.dump_prefix} {cmd} {self.dump_suffix}" stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = self.restore_cmd host = self.settings.get("HOST") or "localhost" port = self.settings.get("PORT") or 27017 cmd += f" --host {host}:{port}" if self.settings.get("USER"): cmd += f" --username {self.settings['USER']}" if self.settings.get("PASSWORD"): cmd += f" --password {utils.get_escaped_command_arg(self.settings['PASSWORD'])}" if self.settings.get("AUTH_SOURCE"): cmd += f" --authenticationDatabase {self.settings['AUTH_SOURCE']}" if self.object_check: cmd += " --objcheck" if self.drop: cmd += " --drop" cmd += " --archive" cmd = f"{self.restore_prefix} {cmd} {self.restore_suffix}" return self.run_command(cmd, stdin=dump, env=self.restore_env) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/db/mysql.py0000666000000000000000000000334714662175773015455 0ustar00from dbbackup import utils from .base import BaseCommandDBConnector class MysqlDumpConnector(BaseCommandDBConnector): """ MySQL connector, creates dump with ``mysqldump`` and restore with ``mysql``. """ dump_cmd = "mysqldump" restore_cmd = "mysql" def _create_dump(self): cmd = f"{self.dump_cmd} {self.settings['NAME']} --quick" if self.settings.get("HOST"): cmd += f" --host={self.settings['HOST']}" if self.settings.get("PORT"): cmd += f" --port={self.settings['PORT']}" if self.settings.get("USER"): cmd += f" --user={self.settings['USER']}" if self.settings.get("PASSWORD"): cmd += f" --password={utils.get_escaped_command_arg(self.settings['PASSWORD'])}" for table in self.exclude: cmd += f" --ignore-table={self.settings['NAME']}.{table}" cmd = f"{self.dump_prefix} {cmd} {self.dump_suffix}" stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = f"{self.restore_cmd} {self.settings['NAME']}" if self.settings.get("HOST"): cmd += f" --host={self.settings['HOST']}" if self.settings.get("PORT"): cmd += f" --port={self.settings['PORT']}" if self.settings.get("USER"): cmd += f" --user={self.settings['USER']}" if self.settings.get("PASSWORD"): cmd += f" --password={utils.get_escaped_command_arg(self.settings['PASSWORD'])}" cmd = f"{self.restore_prefix} {cmd} {self.restore_suffix}" stdout, stderr = self.run_command(cmd, stdin=dump, env=self.restore_env) return stdout, stderr ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/db/postgresql.py0000666000000000000000000001060314662175773016504 0ustar00import logging from typing import List, Optional from urllib.parse import quote from .base import BaseCommandDBConnector logger = logging.getLogger("dbbackup.command") def create_postgres_uri(self): host = self.settings.get("HOST") or "localhost" dbname = self.settings.get("NAME") or "" user = quote(self.settings.get("USER") or "") password = self.settings.get("PASSWORD") or "" password = f":{quote(password)}" if password else "" if not user: password = "" else: host = "@" + host port = ":{}".format(self.settings.get("PORT")) if self.settings.get("PORT") else "" dbname = f"--dbname=postgresql://{user}{password}{host}{port}/{dbname}" return dbname class PgDumpConnector(BaseCommandDBConnector): """ PostgreSQL connector, it uses pg_dump`` to create an SQL text file and ``psql`` for restore it. """ extension = "psql" dump_cmd = "pg_dump" restore_cmd = "psql" single_transaction = True drop = True schemas: Optional[List[str]] = [] def _create_dump(self): cmd = f"{self.dump_cmd} " cmd = cmd + create_postgres_uri(self) for table in self.exclude: cmd += f" --exclude-table-data={table}" if self.drop: cmd += " --clean" if self.schemas: # First schema is not prefixed with -n # when using join function so add it manually. cmd += " -n " + " -n ".join(self.schemas) cmd = f"{self.dump_prefix} {cmd} {self.dump_suffix}" stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = f"{self.restore_cmd} " cmd = cmd + create_postgres_uri(self) # without this, psql terminates with an exit value of 0 regardless of errors cmd += " --set ON_ERROR_STOP=on" if self.schemas: cmd += " -n " + " -n ".join(self.schemas) if self.single_transaction: cmd += " --single-transaction" cmd += " {}".format(self.settings["NAME"]) cmd = f"{self.restore_prefix} {cmd} {self.restore_suffix}" stdout, stderr = self.run_command(cmd, stdin=dump, env=self.restore_env) return stdout, stderr class PgDumpGisConnector(PgDumpConnector): """ PostgreGIS connector, same than :class:`PgDumpGisConnector` but enable postgis if not made. """ psql_cmd = "psql" def _enable_postgis(self): cmd = f'{self.psql_cmd} -c "CREATE EXTENSION IF NOT EXISTS postgis;"' cmd += " --username={}".format(self.settings["ADMIN_USER"]) cmd += " --no-password" if self.settings.get("HOST"): cmd += " --host={}".format(self.settings["HOST"]) if self.settings.get("PORT"): cmd += " --port={}".format(self.settings["PORT"]) return self.run_command(cmd) def _restore_dump(self, dump): if self.settings.get("ADMIN_USER"): self._enable_postgis() return super()._restore_dump(dump) class PgDumpBinaryConnector(PgDumpConnector): """ PostgreSQL connector, it uses pg_dump`` to create an SQL text file and ``pg_restore`` for restore it. """ extension = "psql.bin" dump_cmd = "pg_dump" restore_cmd = "pg_restore" single_transaction = True drop = True def _create_dump(self): cmd = f"{self.dump_cmd} " cmd = cmd + create_postgres_uri(self) cmd += " --format=custom" for table in self.exclude: cmd += f" --exclude-table-data={table}" if self.schemas: cmd += " -n " + " -n ".join(self.schemas) cmd = f"{self.dump_prefix} {cmd} {self.dump_suffix}" stdout, _ = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): dbname = create_postgres_uri(self) cmd = f"{self.restore_cmd} {dbname}" if self.single_transaction: cmd += " --single-transaction" if self.drop: cmd += " --clean" if self.schemas: cmd += " -n " + " -n ".join(self.schemas) cmd = f"{self.restore_prefix} {cmd} {self.restore_suffix}" stdout, stderr = self.run_command(cmd, stdin=dump, env=self.restore_env) return stdout, stderr ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449435.0 django_dbbackup-4.2.1/dbbackup/db/sqlite.py0000666000000000000000000000762614662201233015572 0ustar00import warnings from io import BytesIO from shutil import copyfileobj from tempfile import SpooledTemporaryFile from django.db import IntegrityError, OperationalError from .base import BaseDBConnector DUMP_TABLES = """ SELECT "name", "type", "sql" FROM "sqlite_master" WHERE "sql" NOT NULL AND "type" == 'table' ORDER BY "name" """ DUMP_ETC = """ SELECT "name", "type", "sql" FROM "sqlite_master" WHERE "sql" NOT NULL AND "type" IN ('index', 'trigger', 'view') """ class SqliteConnector(BaseDBConnector): """ Create a dump at SQL layer like could make ``.dumps`` in sqlite3. Restore by evaluate the created SQL. """ def _write_dump(self, fileobj): cursor = self.connection.cursor() cursor.execute(DUMP_TABLES) for table_name, _, sql in cursor.fetchall(): if table_name.startswith("sqlite_") or table_name in self.exclude: continue if sql.startswith("CREATE TABLE"): sql = sql.replace("CREATE TABLE", "CREATE TABLE IF NOT EXISTS") # Make SQL commands in 1 line sql = sql.replace("\n ", "") sql = sql.replace("\n)", ")") fileobj.write(f"{sql};\n".encode()) table_name_ident = table_name.replace('"', '""') res = cursor.execute(f'PRAGMA table_info("{table_name_ident}")') column_names = [str(table_info[1]) for table_info in res.fetchall()] q = """SELECT 'INSERT INTO "{0}" VALUES({1})' FROM "{0}";\n""".format( table_name_ident, ",".join( """'||quote("{}")||'""".format(col.replace('"', '""')) for col in column_names ), ) query_res = cursor.execute(q) for row in query_res: fileobj.write(f"{row[0]};\n".encode()) schema_res = cursor.execute(DUMP_ETC) for name, _, sql in schema_res.fetchall(): if sql.startswith("CREATE INDEX"): sql = sql.replace("CREATE INDEX", "CREATE INDEX IF NOT EXISTS") fileobj.write(f"{sql};\n".encode()) cursor.close() def create_dump(self): if not self.connection.is_usable(): self.connection.connect() dump_file = SpooledTemporaryFile(max_size=10 * 1024 * 1024) self._write_dump(dump_file) dump_file.seek(0) return dump_file def restore_dump(self, dump): if not self.connection.is_usable(): self.connection.connect() cursor = self.connection.cursor() sql_command = b"" sql_is_complete = True for line in dump.readlines(): sql_command = sql_command + line line_str = line.decode("UTF-8") if line_str.startswith("INSERT") and not line_str.endswith(");\n"): sql_is_complete = False continue if not sql_is_complete and line_str.endswith(");\n"): sql_is_complete = True if sql_is_complete: try: cursor.execute(sql_command.decode("UTF-8")) except (OperationalError, IntegrityError) as err: warnings.warn(f"Error in db restore: {err}") sql_command = b"" class SqliteCPConnector(BaseDBConnector): """ Create a dump by copy the binary data file. Restore by simply copy to the good location. """ def create_dump(self): path = self.connection.settings_dict["NAME"] dump = BytesIO() with open(path, "rb") as db_file: copyfileobj(db_file, dump) dump.seek(0) return dump def restore_dump(self, dump): path = self.connection.settings_dict["NAME"] with open(path, "wb") as db_file: copyfileobj(dump, db_file) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/log.py0000666000000000000000000000207214662175773014476 0ustar00import logging import django from django.utils.log import AdminEmailHandler class DbbackupAdminEmailHandler(AdminEmailHandler): def emit(self, record): # Monkey patch for old Django versions without send_mail method if django.VERSION < (1, 8): from . import utils django.core.mail.mail_admins = utils.mail_admins super().emit(record) def send_mail(self, subject, message, *args, **kwargs): from . import utils utils.mail_admins( subject, message, *args, connection=self.connection(), **kwargs ) class MailEnabledFilter(logging.Filter): def filter(self, record): from .settings import SEND_EMAIL return SEND_EMAIL def load(): mail_admins_handler = DbbackupAdminEmailHandler(include_html=True) mail_admins_handler.setLevel(logging.ERROR) mail_admins_handler.addFilter(MailEnabledFilter()) logger = logging.getLogger("dbbackup") logger.setLevel(logging.INFO) logger.handlers = [mail_admins_handler] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.6916137 django_dbbackup-4.2.1/dbbackup/management/0000777000000000000000000000000014662205136015441 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/management/__init__.py0000666000000000000000000000000014314702350017532 0ustar00././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1724451421.696114 django_dbbackup-4.2.1/dbbackup/management/commands/0000777000000000000000000000000014662205136017242 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/management/commands/__init__.py0000666000000000000000000000000014314702350021333 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/management/commands/_base.py0000666000000000000000000001071614662175773020707 0ustar00""" Abstract Command. """ import logging import sys from optparse import make_option as optparse_make_option from shutil import copyfileobj import django from django.core.management.base import BaseCommand, CommandError from ...storage import StorageError USELESS_ARGS = ("callback", "callback_args", "callback_kwargs", "metavar") TYPES = { "string": str, "int": int, "long": int, "float": float, "complex": complex, "choice": list, } LOGGING_VERBOSITY = { 0: logging.WARN, 1: logging.INFO, 2: logging.DEBUG, 3: logging.DEBUG, } def make_option(*args, **kwargs): return args, kwargs class BaseDbBackupCommand(BaseCommand): """ Base command class used for create all dbbackup command. """ base_option_list = ( make_option( "--noinput", action="store_false", dest="interactive", default=True, help="Tells Django to NOT prompt the user for input of any kind.", ), make_option( "-q", "--quiet", action="store_true", default=False, help="Tells Django to NOT output other text than errors.", ), ) option_list = () verbosity = 1 quiet = False logger = logging.getLogger("dbbackup.command") def __init__(self, *args, **kwargs): self.option_list = self.base_option_list + self.option_list if django.VERSION < (1, 10): options = tuple( optparse_make_option(*_args, **_kwargs) for _args, _kwargs in self.option_list ) self.option_list = options + BaseCommand.option_list super().__init__(*args, **kwargs) def add_arguments(self, parser): for args, kwargs in self.option_list: kwargs = { k: v for k, v in kwargs.items() if not k.startswith("_") and k not in USELESS_ARGS } parser.add_argument(*args, **kwargs) def _set_logger_level(self): level = 60 if self.quiet else LOGGING_VERBOSITY[int(self.verbosity)] self.logger.setLevel(level) def _ask_confirmation(self): answer = input("Are you sure you want to continue? [Y/n] ") if answer.lower().startswith("n"): self.logger.info("Quitting") sys.exit(0) def read_from_storage(self, path): return self.storage.read_file(path) def write_to_storage(self, file, path): self.logger.info("Writing file to %s", path) self.storage.write_file(file, path) def read_local_file(self, path): """Open file in read mode on local filesystem.""" return open(path, "rb") def write_local_file(self, outputfile, path): """Write file to the desired path.""" self.logger.info("Writing file to %s", path) outputfile.seek(0) with open(path, "wb") as fd: copyfileobj(outputfile, fd) def _get_backup_file(self, database=None, servername=None): if self.path: input_filename = self.path input_file = self.read_local_file(self.path) else: if self.filename: input_filename = self.filename # Fetch the latest backup if filepath not specified else: self.logger.info("Finding latest backup") try: input_filename = self.storage.get_latest_backup( encrypted=self.decrypt, compressed=self.uncompress, content_type=self.content_type, database=database, servername=servername, ) except StorageError as err: raise CommandError(err.args[0]) from err input_file = self.read_from_storage(input_filename) return input_filename, input_file def _cleanup_old_backups(self, database=None, servername=None): """ Cleanup old backups, keeping the number of backups specified by DBBACKUP_CLEANUP_KEEP. """ self.storage.clean_old_backups( encrypted=self.encrypt, compressed=self.compress, content_type=self.content_type, database=database, servername=servername, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/management/commands/dbbackup.py0000666000000000000000000001074314662175773021411 0ustar00""" Command for backup database. """ from django.core.management.base import CommandError from ... import settings, utils from ...db.base import get_connector from ...storage import StorageError, get_storage from ._base import BaseDbBackupCommand, make_option class Command(BaseDbBackupCommand): help = "Backup a database, encrypt and/or compress." content_type = "db" option_list = BaseDbBackupCommand.option_list + ( make_option( "-c", "--clean", dest="clean", action="store_true", default=False, help="Clean up old backup files", ), make_option( "-d", "--database", help="Database(s) to backup specified by key separated by" " commas(default: all)", ), make_option( "-s", "--servername", help="Specify server name to include in backup filename", ), make_option( "-z", "--compress", action="store_true", default=False, help="Compress the backup files", ), make_option( "-e", "--encrypt", action="store_true", default=False, help="Encrypt the backup files", ), make_option( "-o", "--output-filename", default=None, help="Specify filename on storage" ), make_option( "-O", "--output-path", default=None, help="Specify where to store on local filesystem", ), make_option( "-x", "--exclude-tables", default=None, help="Exclude tables from backup" ), make_option( "-n", "--schema", action="append", default=[], help="Specify schema(s) to backup. Can be used multiple times.", ), ) @utils.email_uncaught_exception def handle(self, **options): self.verbosity = options.get("verbosity") self.quiet = options.get("quiet") self._set_logger_level() self.clean = options.get("clean") self.servername = options.get("servername") self.compress = options.get("compress") self.encrypt = options.get("encrypt") self.filename = options.get("output_filename") self.path = options.get("output_path") self.exclude_tables = options.get("exclude_tables") self.storage = get_storage() self.schemas = options.get("schema") self.database = options.get("database") or "" for database_key in self._get_database_keys(): self.connector = get_connector(database_key) if self.connector and self.exclude_tables: self.connector.exclude.extend( list(self.exclude_tables.replace(" ", "").split(",")) ) database = self.connector.settings try: self._save_new_backup(database) if self.clean: self._cleanup_old_backups(database=database_key) except StorageError as err: raise CommandError(err) from err def _get_database_keys(self): return self.database.split(",") if self.database else settings.DATABASES def _save_new_backup(self, database): """ Save a new backup file. """ self.logger.info("Backing Up Database: %s", database["NAME"]) # Get backup, schema and name filename = self.connector.generate_filename(self.servername) if self.schemas: self.connector.schemas = self.schemas outputfile = self.connector.create_dump() # Apply trans if self.compress: compressed_file, filename = utils.compress_file(outputfile, filename) outputfile = compressed_file if self.encrypt: encrypted_file, filename = utils.encrypt_file(outputfile, filename) outputfile = encrypted_file # Set file name filename = self.filename or filename self.logger.debug("Backup size: %s", utils.handle_size(outputfile)) # Store backup outputfile.seek(0) if self.path is None: self.write_to_storage(outputfile, filename) else: self.write_local_file(outputfile, self.path) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449455.0 django_dbbackup-4.2.1/dbbackup/management/commands/dbrestore.py0000666000000000000000000001216214662201257021607 0ustar00""" Restore database. """ from django.conf import settings from django.core.management.base import CommandError from django.db import connection from ... import utils from ...db.base import get_connector from ...storage import StorageError, get_storage from ._base import BaseDbBackupCommand, make_option class Command(BaseDbBackupCommand): help = "Restore a database backup from storage, encrypted and/or compressed." content_type = "db" no_drop = False option_list = BaseDbBackupCommand.option_list + ( make_option("-d", "--database", help="Database to restore"), make_option("-i", "--input-filename", help="Specify filename to backup from"), make_option( "-I", "--input-path", help="Specify path on local filesystem to backup from" ), make_option( "-s", "--servername", help="If backup file is not specified, filter the " "existing ones with the given servername", ), make_option( "-c", "--decrypt", default=False, action="store_true", help="Decrypt data before restoring", ), make_option( "-p", "--passphrase", help="Passphrase for decrypt file", default=None ), make_option( "-z", "--uncompress", action="store_true", default=False, help="Uncompress gzip data before restoring", ), make_option( "-n", "--schema", action="append", default=[], help="Specify schema(s) to restore. Can be used multiple times.", ), make_option( "-r", "--no-drop", action="store_true", default=False, help="Don't clean (drop) the database. This only works with mongodb and postgresql.", ), ) def handle(self, *args, **options): """Django command handler.""" self.verbosity = int(options.get("verbosity")) self.quiet = options.get("quiet") self._set_logger_level() try: connection.close() self.filename = options.get("input_filename") self.path = options.get("input_path") self.servername = options.get("servername") self.decrypt = options.get("decrypt") self.uncompress = options.get("uncompress") self.passphrase = options.get("passphrase") self.interactive = options.get("interactive") self.input_database_name = options.get("database") self.database_name, self.database = self._get_database( self.input_database_name ) self.storage = get_storage() self.no_drop = options.get("no_drop") self.schemas = options.get("schema") self._restore_backup() except StorageError as err: raise CommandError(err) from err def _get_database(self, database_name: str): """Get the database to restore.""" if not database_name: if len(settings.DATABASES) > 1: errmsg = ( "Because this project contains more than one database, you" " must specify the --database option." ) raise CommandError(errmsg) database_name = list(settings.DATABASES.keys())[0] if database_name not in settings.DATABASES: raise CommandError(f"Database {database_name} does not exist.") return database_name, settings.DATABASES[database_name] def _restore_backup(self): """Restore the specified database.""" input_filename, input_file = self._get_backup_file( database=self.input_database_name, servername=self.servername ) self.logger.info( "Restoring backup for database '%s' and server '%s'", self.database_name, self.servername, ) if self.schemas: self.logger.info(f"Restoring schemas: {self.schemas}") self.logger.info(f"Restoring: {input_filename}") if self.decrypt: unencrypted_file, input_filename = utils.unencrypt_file( input_file, input_filename, self.passphrase ) input_file.close() input_file = unencrypted_file if self.uncompress: uncompressed_file, input_filename = utils.uncompress_file( input_file, input_filename ) input_file.close() input_file = uncompressed_file self.logger.info("Restore tempfile created: %s", utils.handle_size(input_file)) if self.interactive: self._ask_confirmation() input_file.seek(0) self.connector = get_connector(self.database_name) if self.schemas: self.connector.schemas = self.schemas self.connector.restore_dump(input_file) self.connector.drop = not self.no_drop ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/management/commands/listbackups.py0000666000000000000000000000424514662175773022162 0ustar00""" List backups. """ from ... import utils from ...storage import get_storage from ._base import BaseDbBackupCommand, make_option ROW_TEMPLATE = "{name:40} {datetime:20}" FILTER_KEYS = ("encrypted", "compressed", "content_type", "database") class Command(BaseDbBackupCommand): option_list = ( make_option("-d", "--database", help="Filter by database name"), make_option( "-z", "--compressed", help="Exclude non-compressed", action="store_true", default=None, dest="compressed", ), make_option( "-Z", "--not-compressed", help="Exclude compressed", action="store_false", default=None, dest="compressed", ), make_option( "-e", "--encrypted", help="Exclude non-encrypted", action="store_true", default=None, dest="encrypted", ), make_option( "-E", "--not-encrypted", help="Exclude encrypted", action="store_false", default=None, dest="encrypted", ), make_option( "-c", "--content-type", help="Filter by content type 'db' or 'media'" ), ) def handle(self, **options): self.quiet = options.get("quiet") self.storage = get_storage() files_attr = self.get_backup_attrs(options) if not self.quiet: title = ROW_TEMPLATE.format(name="Name", datetime="Datetime") self.stdout.write(title) for file_attr in files_attr: row = ROW_TEMPLATE.format(**file_attr) self.stdout.write(row) def get_backup_attrs(self, options): filters = {k: v for k, v in options.items() if k in FILTER_KEYS} filenames = self.storage.list_backups(**filters) return [ { "datetime": utils.filename_to_date(filename).strftime("%x %X"), "name": filename, } for filename in filenames ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/management/commands/mediabackup.py0000666000000000000000000001030114662175773022071 0ustar00""" Save media files. """ import os import tarfile from django.core.management.base import CommandError from ... import utils from ...storage import StorageError, get_storage, get_storage_class from ._base import BaseDbBackupCommand, make_option class Command(BaseDbBackupCommand): help = """Backup media files, gather all in a tarball and encrypt or compress.""" content_type = "media" option_list = BaseDbBackupCommand.option_list + ( make_option( "-c", "--clean", help="Clean up old backup files", action="store_true", default=False, ), make_option( "-s", "--servername", help="Specify server name to include in backup filename", ), make_option( "-z", "--compress", help="Compress the archive", action="store_true", default=False, ), make_option( "-e", "--encrypt", help="Encrypt the backup files", action="store_true", default=False, ), make_option( "-o", "--output-filename", default=None, help="Specify filename on storage" ), make_option( "-O", "--output-path", default=None, help="Specify where to store on local filesystem", ), ) @utils.email_uncaught_exception def handle(self, **options): self.verbosity = options.get("verbosity") self.quiet = options.get("quiet") self._set_logger_level() self.encrypt = options.get("encrypt", False) self.compress = options.get("compress", False) self.servername = options.get("servername") self.filename = options.get("output_filename") self.path = options.get("output_path") try: self.media_storage = get_storage_class()() self.storage = get_storage() self.backup_mediafiles() if options.get("clean"): self._cleanup_old_backups(servername=self.servername) except StorageError as err: raise CommandError(err) from err def _explore_storage(self): """Generator of all files contained in media storage.""" path = "" dirs = [path] while dirs: path = dirs.pop() subdirs, files = self.media_storage.listdir(path) for media_filename in files: yield os.path.join(path, media_filename) dirs.extend([os.path.join(path, subdir) for subdir in subdirs]) def _create_tar(self, name): """Create TAR file.""" fileobj = utils.create_spooled_temporary_file() mode = "w:gz" if self.compress else "w" tar_file = tarfile.open(name=name, fileobj=fileobj, mode=mode) for media_filename in self._explore_storage(): tarinfo = tarfile.TarInfo(media_filename) media_file = self.media_storage.open(media_filename) tarinfo.size = len(media_file) tar_file.addfile(tarinfo, media_file) # Close the TAR for writing tar_file.close() return fileobj def backup_mediafiles(self): """ Create backup file and write it to storage. """ # Check for filename option if self.filename: filename = self.filename else: extension = f"tar{'.gz' if self.compress else ''}" filename = utils.filename_generate( extension, servername=self.servername, content_type=self.content_type ) tarball = self._create_tar(filename) # Apply trans if self.encrypt: encrypted_file = utils.encrypt_file(tarball, filename) tarball, filename = encrypted_file self.logger.debug("Backup size: %s", utils.handle_size(tarball)) # Store backup tarball.seek(0) if self.path is None: self.write_to_storage(tarball, filename) else: self.write_local_file(tarball, self.path) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/management/commands/mediarestore.py0000666000000000000000000000742214662175773022321 0ustar00""" Restore media files. """ import tarfile from ... import utils from ...storage import get_storage, get_storage_class from ._base import BaseDbBackupCommand, make_option class Command(BaseDbBackupCommand): help = """Restore a media backup from storage, encrypted and/or compressed.""" content_type = "media" option_list = ( make_option( "-i", "--input-filename", action="store", help="Specify filename to backup from", ), make_option( "-I", "--input-path", help="Specify path on local filesystem to backup from" ), make_option( "-s", "--servername", help="If backup file is not specified, filter the existing ones with the " "given servername", ), make_option( "-e", "--decrypt", default=False, action="store_true", help="Decrypt data before restoring", ), make_option( "-p", "--passphrase", default=None, help="Passphrase for decrypt file" ), make_option( "-z", "--uncompress", action="store_true", help="Uncompress gzip data before restoring", ), make_option( "-r", "--replace", help="Replace existing files", action="store_true" ), ) def handle(self, *args, **options): """Django command handler.""" self.verbosity = int(options.get("verbosity")) self.quiet = options.get("quiet") self._set_logger_level() self.servername = options.get("servername") self.decrypt = options.get("decrypt") self.uncompress = options.get("uncompress") self.filename = options.get("input_filename") self.path = options.get("input_path") self.replace = options.get("replace") self.passphrase = options.get("passphrase") self.interactive = options.get("interactive") self.storage = get_storage() self.media_storage = get_storage_class()() self._restore_backup() def _upload_file(self, name, media_file): if self.media_storage.exists(name): if not self.replace: return self.media_storage.delete(name) self.logger.info("%s deleted", name) self.media_storage.save(name, media_file) self.logger.info("%s uploaded", name) def _restore_backup(self): self.logger.info("Restoring backup for media files") input_filename, input_file = self._get_backup_file(servername=self.servername) self.logger.info("Restoring: %s", input_filename) if self.decrypt: unencrypted_file, input_filename = utils.unencrypt_file( input_file, input_filename, self.passphrase ) input_file.close() input_file = unencrypted_file self.logger.debug("Backup size: %s", utils.handle_size(input_file)) if self.interactive: self._ask_confirmation() input_file.seek(0) tar_file = ( tarfile.open(fileobj=input_file, mode="r:gz") if self.uncompress else tarfile.open(fileobj=input_file, mode="r:") ) # Restore file 1 by 1 for media_file_info in tar_file: if media_file_info.path == "media": continue # Don't copy root directory media_file = tar_file.extractfile(media_file_info) if media_file is None: continue # Skip directories name = media_file_info.path.replace("media/", "") self._upload_file(name, media_file) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/settings.py0000666000000000000000000000526214662175773015561 0ustar00# DO NOT IMPORT THIS BEFORE django.configure() has been run! import socket import tempfile from django.conf import settings DATABASES = getattr(settings, "DBBACKUP_DATABASES", list(settings.DATABASES.keys())) # Fake host HOSTNAME = getattr(settings, "DBBACKUP_HOSTNAME", socket.gethostname()) # Directory to use for temporary files TMP_DIR = getattr(settings, "DBBACKUP_TMP_DIR", tempfile.gettempdir()) TMP_FILE_MAX_SIZE = getattr(settings, "DBBACKUP_TMP_FILE_MAX_SIZE", 10 * 1024 * 1024) TMP_FILE_READ_SIZE = getattr(settings, "DBBACKUP_TMP_FILE_READ_SIZE", 1024 * 1000) # Number of old backup files to keep CLEANUP_KEEP = getattr(settings, "DBBACKUP_CLEANUP_KEEP", 10) CLEANUP_KEEP_MEDIA = getattr(settings, "DBBACKUP_CLEANUP_KEEP_MEDIA", CLEANUP_KEEP) CLEANUP_KEEP_FILTER = getattr(settings, "DBBACKUP_CLEANUP_KEEP_FILTER", lambda x: False) MEDIA_PATH = getattr(settings, "DBBACKUP_MEDIA_PATH", settings.MEDIA_ROOT) DATE_FORMAT = getattr(settings, "DBBACKUP_DATE_FORMAT", "%Y-%m-%d-%H%M%S") FILENAME_TEMPLATE = getattr( settings, "DBBACKUP_FILENAME_TEMPLATE", "{databasename}-{servername}-{datetime}.{extension}", ) MEDIA_FILENAME_TEMPLATE = getattr( settings, "DBBACKUP_MEDIA_FILENAME_TEMPLATE", "{servername}-{datetime}.{extension}" ) GPG_ALWAYS_TRUST = getattr(settings, "DBBACKUP_GPG_ALWAYS_TRUST", False) GPG_RECIPIENT = GPG_ALWAYS_TRUST = getattr(settings, "DBBACKUP_GPG_RECIPIENT", None) STORAGE = getattr(settings, "DBBACKUP_STORAGE", None) STORAGE_OPTIONS = getattr(settings, "DBBACKUP_STORAGE_OPTIONS", {}) # https://docs.djangoproject.com/en/5.1/ref/settings/#std-setting-STORAGES STORAGES_DBBACKUP_ALIAS = "dbbackup" DJANGO_STORAGES = getattr(settings, "STORAGES", {}) django_dbbackup_storage = DJANGO_STORAGES.get(STORAGES_DBBACKUP_ALIAS, {}) if not STORAGE: STORAGE = ( django_dbbackup_storage.get("BACKEND") or "django.core.files.storage.FileSystemStorage" ) if not STORAGE_OPTIONS: STORAGE_OPTIONS = django_dbbackup_storage.get("OPTIONS") or STORAGE_OPTIONS CONNECTORS = getattr(settings, "DBBACKUP_CONNECTORS", {}) CUSTOM_CONNECTOR_MAPPING = getattr(settings, "DBBACKUP_CONNECTOR_MAPPING", {}) DEFAULT_AUTO_FIELD = "django.db.models.AutoField" # Mail SEND_EMAIL = getattr(settings, "DBBACKUP_SEND_EMAIL", True) SERVER_EMAIL = getattr(settings, "DBBACKUP_SERVER_EMAIL", settings.SERVER_EMAIL) FAILURE_RECIPIENTS = getattr(settings, "DBBACKUP_FAILURE_RECIPIENTS", None) if FAILURE_RECIPIENTS is None: ADMINS = getattr(settings, "DBBACKUP_ADMIN", settings.ADMINS) else: ADMINS = FAILURE_RECIPIENTS EMAIL_SUBJECT_PREFIX = getattr(settings, "DBBACKUP_EMAIL_SUBJECT_PREFIX", "[dbbackup] ") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/storage.py0000666000000000000000000002370314662175773015365 0ustar00""" Utils for handle files. """ import logging from django.core.exceptions import ImproperlyConfigured from . import settings, utils def get_storage(path=None, options=None): """ Get the specified storage configured with options. :param path: Path in Python dot style to module containing the storage class. If empty settings.DBBACKUP_STORAGE will be used. :type path: ``str`` :param options: Parameters for configure the storage, if empty settings.DBBACKUP_STORAGE_OPTIONS will be used. :type options: ``dict`` :return: Storage configured :rtype: :class:`.Storage` """ path = path or settings.STORAGE options = options or settings.STORAGE_OPTIONS if not path: raise ImproperlyConfigured( "You must specify a storage class using " "DBBACKUP_STORAGE settings." ) return Storage(path, **options) class StorageError(Exception): pass class FileNotFound(StorageError): pass class Storage: """ This object make high-level storage operations for upload/download or list and filter files. It uses a Django storage object for low-level operations. """ @property def logger(self): if not hasattr(self, "_logger"): self._logger = logging.getLogger("dbbackup.storage") return self._logger def __init__(self, storage_path=None, **options): """ Initialize a Django Storage instance with given options. :param storage_path: Path to a Django Storage class with dot style If ``None``, ``settings.DBBACKUP_STORAGE`` will be used. :type storage_path: str """ self._storage_path = storage_path or settings.STORAGE options = options.copy() options.update(settings.STORAGE_OPTIONS) options = {key.lower(): value for key, value in options.items()} self.storageCls = get_storage_class(self._storage_path) self.storage = self.storageCls(**options) self.name = self.storageCls.__name__ def __str__(self): return f"dbbackup-{self.storage.__str__()}" def delete_file(self, filepath): self.logger.debug("Deleting file %s", filepath) self.storage.delete(name=filepath) def list_directory(self, path=""): return self.storage.listdir(path)[1] def write_file(self, filehandle, filename): self.logger.debug("Writing file %s", filename) self.storage.save(name=filename, content=filehandle) def read_file(self, filepath): self.logger.debug("Reading file %s", filepath) file_ = self.storage.open(name=filepath, mode="rb") if not getattr(file_, "name", None): file_.name = filepath return file_ def list_backups( self, encrypted=None, compressed=None, content_type=None, database=None, servername=None, ): """ List stored files except given filter. If filter is None, it won't be used. ``content_type`` must be ``'db'`` for database backups or ``'media'`` for media backups. :param encrypted: Filter by encrypted or not :type encrypted: ``bool`` or ``None`` :param compressed: Filter by compressed or not :type compressed: ``bool`` or ``None`` :param content_type: Filter by media or database backup, must be ``'db'`` or ``'media'`` :type content_type: ``str`` or ``None`` :param database: Filter by source database's name :type: ``str`` or ``None`` :param servername: Filter by source server's name :type: ``str`` or ``None`` :returns: List of files :rtype: ``list`` of ``str`` """ if content_type not in ("db", "media", None): msg = "Bad content_type %s, must be 'db', 'media', or None" % (content_type) raise TypeError(msg) # TODO: Make better filter for include only backups files = [f for f in self.list_directory() if utils.filename_to_datestring(f)] if encrypted is not None: files = [f for f in files if (".gpg" in f) == encrypted] if compressed is not None: files = [f for f in files if (".gz" in f) == compressed] if content_type == "media": files = [f for f in files if ".tar" in f] elif content_type == "db": files = [f for f in files if ".tar" not in f] if database: files = [f for f in files if database in f] if servername: files = [f for f in files if servername in f] return files def get_latest_backup( self, encrypted=None, compressed=None, content_type=None, database=None, servername=None, ): """ Return the latest backup file name. :param encrypted: Filter by encrypted or not :type encrypted: ``bool`` or ``None`` :param compressed: Filter by compressed or not :type compressed: ``bool`` or ``None`` :param content_type: Filter by media or database backup, must be ``'db'`` or ``'media'`` :type content_type: ``str`` or ``None`` :param database: Filter by source database's name :type: ``str`` or ``None`` :param servername: Filter by source server's name :type: ``str`` or ``None`` :returns: Most recent file :rtype: ``str`` :raises: FileNotFound: If no backup file is found """ files = self.list_backups( encrypted=encrypted, compressed=compressed, content_type=content_type, database=database, servername=servername, ) if not files: raise FileNotFound("There's no backup file available.") return max(files, key=utils.filename_to_date) def get_older_backup( self, encrypted=None, compressed=None, content_type=None, database=None, servername=None, ): """ Return the older backup's file name. :param encrypted: Filter by encrypted or not :type encrypted: ``bool`` or ``None`` :param compressed: Filter by compressed or not :type compressed: ``bool`` or ``None`` :param content_type: Filter by media or database backup, must be ``'db'`` or ``'media'`` :type content_type: ``str`` or ``None`` :param database: Filter by source database's name :type: ``str`` or ``None`` :param servername: Filter by source server's name :type: ``str`` or ``None`` :returns: Older file :rtype: ``str`` :raises: FileNotFound: If no backup file is found """ files = self.list_backups( encrypted=encrypted, compressed=compressed, content_type=content_type, database=database, servername=servername, ) if not files: raise FileNotFound("There's no backup file available.") return min(files, key=utils.filename_to_date) def clean_old_backups( self, encrypted=None, compressed=None, content_type=None, database=None, servername=None, keep_number=None, ): """ Delete olders backups and hold the number defined. :param encrypted: Filter by encrypted or not :type encrypted: ``bool`` or ``None`` :param compressed: Filter by compressed or not :type compressed: ``bool`` or ``None`` :param content_type: Filter by media or database backup, must be ``'db'`` or ``'media'`` :type content_type: ``str`` or ``None`` :param database: Filter by source database's name :type: ``str`` or ``None`` :param servername: Filter by source server's name :type: ``str`` or ``None`` :param keep_number: Number of files to keep, other will be deleted :type keep_number: ``int`` or ``None`` """ if keep_number is None: keep_number = ( settings.CLEANUP_KEEP if content_type == "db" else settings.CLEANUP_KEEP_MEDIA ) keep_filter = settings.CLEANUP_KEEP_FILTER files = self.list_backups( encrypted=encrypted, compressed=compressed, content_type=content_type, database=database, servername=servername, ) files = sorted(files, key=utils.filename_to_date, reverse=True) files_to_delete = [fi for i, fi in enumerate(files) if i >= keep_number] for filename in files_to_delete: if keep_filter(filename): continue self.delete_file(filename) def get_storage_class(path=None): """ Return the configured storage class. :param path: Path in Python dot style to module containing the storage class. If empty, the default storage class will be used. :type path: str or None :returns: Storage class :rtype: :class:`django.core.files.storage.Storage` """ from django.utils.module_loading import import_string if path: # this is a workaround to keep compatibility with Django >= 5.1 (django.core.files.storage.get_storage_class is removed) return import_string(path) try: from django.core.files.storage import DefaultStorage return DefaultStorage except Exception: from django.core.files.storage import get_storage_class return get_storage_class() ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.6996138 django_dbbackup-4.2.1/dbbackup/tests/0000777000000000000000000000000014662205136014467 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/__init__.py0000666000000000000000000000000014314702350016560 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7031193 django_dbbackup-4.2.1/dbbackup/tests/commands/0000777000000000000000000000000014662205136016270 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/commands/__init__.py0000666000000000000000000000000014314702350020361 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/commands/test_base.py0000666000000000000000000001431514662175773020634 0ustar00""" Tests for base command class. """ import logging import os from io import BytesIO from unittest.mock import patch from django.core.files import File from django.test import TestCase from dbbackup.management.commands._base import BaseDbBackupCommand from dbbackup.storage import get_storage from dbbackup.tests.utils import DEV_NULL, HANDLED_FILES class BaseDbBackupCommandSetLoggerLevelTest(TestCase): def setUp(self): self.command = BaseDbBackupCommand() def test_0_level(self): self.command.verbosity = 0 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.WARNING) def test_1_level(self): self.command.verbosity = 1 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.INFO) def test_2_level(self): self.command.verbosity = 2 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.DEBUG) def test_3_level(self): self.command.verbosity = 3 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.DEBUG) def test_quiet(self): self.command.quiet = True self.command._set_logger_level() self.assertGreater(self.command.logger.level, logging.ERROR) class BaseDbBackupCommandMethodsTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.command = BaseDbBackupCommand() self.command.storage = get_storage() def test_read_from_storage(self): HANDLED_FILES["written_files"].append(["foo", File(BytesIO(b"bar"))]) file_ = self.command.read_from_storage("foo") self.assertEqual(file_.read(), b"bar") def test_write_to_storage(self): self.command.write_to_storage(BytesIO(b"foo"), "bar") self.assertEqual(HANDLED_FILES["written_files"][0][0], "bar") def test_read_local_file(self): # setUp self.command.path = "/tmp/foo.bak" open(self.command.path, "w").close() # Test self.command.read_local_file(self.command.path) # tearDown os.remove(self.command.path) def test_write_local_file(self): fd, path = File(BytesIO(b"foo")), "/tmp/foo.bak" self.command.write_local_file(fd, path) self.assertTrue(os.path.exists(path)) # tearDown os.remove(path) def test_ask_confirmation(self): # Yes with patch("dbbackup.management.commands._base.input", return_value="y"): self.command._ask_confirmation() with patch("dbbackup.management.commands._base.input", return_value="Y"): self.command._ask_confirmation() with patch("dbbackup.management.commands._base.input", return_value=""): self.command._ask_confirmation() with patch("dbbackup.management.commands._base.input", return_value="foo"): self.command._ask_confirmation() # No with patch("dbbackup.management.commands._base.input", return_value="n"): with self.assertRaises(SystemExit): self.command._ask_confirmation() with patch("dbbackup.management.commands._base.input", return_value="N"): with self.assertRaises(SystemExit): self.command._ask_confirmation() with patch("dbbackup.management.commands._base.input", return_value="No"): with self.assertRaises(SystemExit): self.command._ask_confirmation() class BaseDbBackupCommandCleanupOldBackupsTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.command = BaseDbBackupCommand() self.command.stdout = DEV_NULL self.command.encrypt = False self.command.compress = False self.command.servername = "foo-server" self.command.storage = get_storage() HANDLED_FILES["written_files"] = [ (f, None) for f in [ "fooserver-2015-02-06-042810.tar", "fooserver-2015-02-07-042810.tar", "fooserver-2015-02-08-042810.tar", "foodb-fooserver-2015-02-06-042810.dump", "foodb-fooserver-2015-02-07-042810.dump", "foodb-fooserver-2015-02-08-042810.dump", "bardb-fooserver-2015-02-06-042810.dump", "bardb-fooserver-2015-02-07-042810.dump", "bardb-fooserver-2015-02-08-042810.dump", "hamdb-hamserver-2015-02-06-042810.dump", "hamdb-hamserver-2015-02-07-042810.dump", "hamdb-hamserver-2015-02-08-042810.dump", ] ] @patch("dbbackup.settings.CLEANUP_KEEP", 1) def test_clean_db(self): self.command.content_type = "db" self.command.database = "foodb" self.command._cleanup_old_backups(database="foodb") self.assertEqual(2, len(HANDLED_FILES["deleted_files"])) self.assertNotIn( "foodb-fooserver-2015-02-08-042810.dump", HANDLED_FILES["deleted_files"] ) @patch("dbbackup.settings.CLEANUP_KEEP", 1) def test_clean_other_db(self): self.command.content_type = "db" self.command._cleanup_old_backups(database="bardb") self.assertEqual(2, len(HANDLED_FILES["deleted_files"])) self.assertNotIn( "bardb-fooserver-2015-02-08-042810.dump", HANDLED_FILES["deleted_files"] ) @patch("dbbackup.settings.CLEANUP_KEEP", 1) def test_clean_other_server_db(self): self.command.content_type = "db" self.command._cleanup_old_backups(database="bardb") self.assertEqual(2, len(HANDLED_FILES["deleted_files"])) self.assertNotIn( "bardb-fooserver-2015-02-08-042810.dump", HANDLED_FILES["deleted_files"] ) @patch("dbbackup.settings.CLEANUP_KEEP_MEDIA", 1) def test_clean_media(self): self.command.content_type = "media" self.command._cleanup_old_backups() self.assertEqual(2, len(HANDLED_FILES["deleted_files"])) self.assertNotIn( "foo-server-2015-02-08-042810.tar", HANDLED_FILES["deleted_files"] ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/commands/test_dbbackup.py0000666000000000000000000000625414662175773021500 0ustar00""" Tests for dbbackup command. """ import os from unittest.mock import patch from django.test import TestCase from dbbackup.db.base import get_connector from dbbackup.management.commands.dbbackup import Command as DbbackupCommand from dbbackup.storage import get_storage from dbbackup.tests.utils import DEV_NULL, TEST_DATABASE, add_public_gpg, clean_gpg_keys @patch("dbbackup.settings.GPG_RECIPIENT", "test@test") @patch("sys.stdout", DEV_NULL) class DbbackupCommandSaveNewBackupTest(TestCase): def setUp(self): self.command = DbbackupCommand() self.command.servername = "foo-server" self.command.encrypt = False self.command.compress = False self.command.database = TEST_DATABASE["NAME"] self.command.storage = get_storage() self.command.connector = get_connector() self.command.stdout = DEV_NULL self.command.filename = None self.command.path = None self.command.schemas = [] def tearDown(self): clean_gpg_keys() def test_func(self): self.command._save_new_backup(TEST_DATABASE) def test_compress(self): self.command.compress = True self.command._save_new_backup(TEST_DATABASE) def test_encrypt(self): add_public_gpg() self.command.encrypt = True self.command._save_new_backup(TEST_DATABASE) def test_path(self): self.command.path = "/tmp/foo.bak" self.command._save_new_backup(TEST_DATABASE) self.assertTrue(os.path.exists(self.command.path)) # tearDown os.remove(self.command.path) def test_schema(self): self.command.schemas = ["public"] result = self.command._save_new_backup(TEST_DATABASE) self.assertIsNone(result) @patch("dbbackup.settings.DATABASES", ["db-from-settings"]) def test_get_database_keys(self): with self.subTest("use --database from CLI"): self.command.database = "db-from-cli" self.assertEqual(self.command._get_database_keys(), ["db-from-cli"]) with self.subTest("fallback to DBBACKUP_DATABASES"): self.command.database = "" self.assertEqual(self.command._get_database_keys(), ["db-from-settings"]) @patch("dbbackup.settings.GPG_RECIPIENT", "test@test") @patch("sys.stdout", DEV_NULL) @patch("dbbackup.db.sqlite.SqliteConnector.create_dump") @patch("dbbackup.utils.handle_size", returned_value=4.2) class DbbackupCommandSaveNewMongoBackupTest(TestCase): def setUp(self): self.command = DbbackupCommand() self.command.servername = "foo-server" self.command.encrypt = False self.command.compress = False self.command.storage = get_storage() self.command.stdout = DEV_NULL self.command.filename = None self.command.path = None self.command.connector = get_connector("default") self.command.schemas = [] def tearDown(self): clean_gpg_keys() def test_func(self, mock_run_commands, mock_handle_size): self.command._save_new_backup(TEST_DATABASE) self.assertTrue(mock_run_commands.called) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/commands/test_dbrestore.py0000666000000000000000000001625214662175773021715 0ustar00""" Tests for dbrestore command. """ from shutil import copyfileobj from tempfile import mktemp from unittest.mock import patch from django.conf import settings from django.core.files import File from django.core.management.base import CommandError from django.test import TestCase from dbbackup import utils from dbbackup.db.base import get_connector from dbbackup.db.mongodb import MongoDumpConnector from dbbackup.db.postgresql import PgDumpConnector from dbbackup.management.commands.dbrestore import Command as DbrestoreCommand from dbbackup.settings import HOSTNAME from dbbackup.storage import get_storage from dbbackup.tests.utils import ( DEV_NULL, HANDLED_FILES, TARED_FILE, TEST_DATABASE, TEST_MONGODB, add_private_gpg, clean_gpg_keys, get_dump, get_dump_name, ) @patch("dbbackup.management.commands._base.input", return_value="y") class DbrestoreCommandRestoreBackupTest(TestCase): def setUp(self): self.command = DbrestoreCommand() self.command.stdout = DEV_NULL self.command.uncompress = False self.command.decrypt = False self.command.backup_extension = "bak" self.command.filename = "foofile" self.command.database = TEST_DATABASE self.command.passphrase = None self.command.interactive = True self.command.storage = get_storage() self.command.servername = HOSTNAME self.command.input_database_name = None self.command.database_name = "default" self.command.connector = get_connector("default") self.command.schemas = [] HANDLED_FILES.clean() def tearDown(self): clean_gpg_keys() def test_no_filename(self, *args): # Prepare backup HANDLED_FILES["written_files"].append( (utils.filename_generate("default"), File(get_dump())) ) # Check self.command.path = None self.command.filename = None self.command._restore_backup() def test_no_backup_found(self, *args): self.command.path = None self.command.filename = None with self.assertRaises(CommandError): self.command._restore_backup() def test_uncompress(self, *args): self.command.path = None compressed_file, self.command.filename = utils.compress_file( get_dump(), get_dump_name() ) HANDLED_FILES["written_files"].append( (self.command.filename, File(compressed_file)) ) self.command.uncompress = True self.command._restore_backup() @patch("dbbackup.utils.getpass", return_value=None) def test_decrypt(self, *args): self.command.path = None self.command.decrypt = True encrypted_file, self.command.filename = utils.encrypt_file( get_dump(), get_dump_name() ) HANDLED_FILES["written_files"].append( (self.command.filename, File(encrypted_file)) ) self.command._restore_backup() def test_path(self, *args): temp_dump = get_dump() dump_path = mktemp() with open(dump_path, "wb") as dump: copyfileobj(temp_dump, dump) self.command.path = dump.name self.command._restore_backup() self.command.decrypt = False self.command.filepath = get_dump_name() HANDLED_FILES["written_files"].append((self.command.filepath, get_dump())) self.command._restore_backup() @patch("dbbackup.management.commands.dbrestore.get_connector") @patch("dbbackup.db.base.BaseDBConnector.restore_dump") def test_schema(self, mock_restore_dump, mock_get_connector, *args): """Schema is only used for postgresql.""" mock_get_connector.return_value = PgDumpConnector() mock_restore_dump.return_value = True mock_file = File(get_dump()) HANDLED_FILES["written_files"].append((self.command.filename, mock_file)) with self.assertLogs("dbbackup.command", "INFO") as cm: # Without self.command.path = None self.command._restore_backup() self.assertEqual(self.command.connector.schemas, []) # With self.command.path = None self.command.schemas = ["public"] self.command._restore_backup() self.assertEqual(self.command.connector.schemas, ["public"]) self.assertIn( "INFO:dbbackup.command:Restoring schemas: ['public']", cm.output, ) # With multiple self.command.path = None self.command.schemas = ["public", "other"] self.command._restore_backup() self.assertEqual(self.command.connector.schemas, ["public", "other"]) self.assertIn( "INFO:dbbackup.command:Restoring schemas: ['public', 'other']", cm.output, ) mock_get_connector.assert_called_with("default") mock_restore_dump.assert_called_with(mock_file) class DbrestoreCommandGetDatabaseTest(TestCase): def setUp(self): self.command = DbrestoreCommand() def test_give_db_name(self): name, db = self.command._get_database("default") self.assertEqual(name, "default") self.assertEqual(db, settings.DATABASES["default"]) def test_no_given_db(self): name, db = self.command._get_database(None) self.assertEqual(name, "default") self.assertEqual(db, settings.DATABASES["default"]) @patch("django.conf.settings.DATABASES", {"db1": {}, "db2": {}}) def test_no_given_db_multidb(self): with self.assertRaises(CommandError): self.command._get_database({}) @patch("dbbackup.management.commands._base.input", return_value="y") @patch( "dbbackup.management.commands.dbrestore.get_connector", return_value=MongoDumpConnector(), ) @patch("dbbackup.db.mongodb.MongoDumpConnector.restore_dump") class DbMongoRestoreCommandRestoreBackupTest(TestCase): def setUp(self): self.command = DbrestoreCommand() self.command.stdout = DEV_NULL self.command.uncompress = False self.command.decrypt = False self.command.backup_extension = "bak" self.command.path = None self.command.filename = "foofile" self.command.database = TEST_MONGODB self.command.passphrase = None self.command.interactive = True self.command.storage = get_storage() self.command.connector = MongoDumpConnector() self.command.database_name = "mongo" self.command.input_database_name = None self.command.servername = HOSTNAME self.command.schemas = [] HANDLED_FILES.clean() add_private_gpg() def test_mongo_settings_backup_command(self, mock_runcommands, *args): self.command.storage.file_read = TARED_FILE self.command.filename = TARED_FILE HANDLED_FILES["written_files"].append((TARED_FILE, open(TARED_FILE, "rb"))) self.command._restore_backup() self.assertTrue(mock_runcommands.called) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/commands/test_listbackups.py0000666000000000000000000000765314662175773022255 0ustar00from io import StringIO from unittest.mock import patch from django.core.management import execute_from_command_line from django.test import TestCase from dbbackup.management.commands.listbackups import Command as ListbackupsCommand from dbbackup.storage import get_storage from dbbackup.tests.utils import HANDLED_FILES class ListbackupsCommandTest(TestCase): def setUp(self): self.command = ListbackupsCommand() self.command.storage = get_storage() HANDLED_FILES["written_files"] = [ (f, None) for f in [ "2015-02-06-042810.bak", "2015-02-07-042810.bak", "2015-02-08-042810.bak", ] ] def test_get_backup_attrs(self): options = {} attrs = self.command.get_backup_attrs(options) self.assertEqual(len(HANDLED_FILES["written_files"]), len(attrs)) class ListbackupsCommandArgComputingTest(TestCase): def setUp(self): HANDLED_FILES["written_files"] = [ (f, None) for f in [ "2015-02-06-042810_foo.db", "2015-02-06-042810_foo.db.gz", "2015-02-06-042810_foo.db.gpg", "2015-02-06-042810_foo.db.gz.gpg", "2015-02-06-042810_foo.tar", "2015-02-06-042810_foo.tar.gz", "2015-02-06-042810_foo.tar.gpg", "2015-02-06-042810_foo.tar.gz.gpg", "2015-02-06-042810_bar.db", "2015-02-06-042810_bar.db.gz", "2015-02-06-042810_bar.db.gpg", "2015-02-06-042810_bar.db.gz.gpg", "2015-02-06-042810_bar.tar", "2015-02-06-042810_bar.tar.gz", "2015-02-06-042810_bar.tar.gpg", "2015-02-06-042810_bar.tar.gz.gpg", ] ] def test_list(self): execute_from_command_line(["", "listbackups"]) def test_filter_encrypted(self): stdout = StringIO() with patch("sys.stdout", stdout): execute_from_command_line(["", "listbackups", "--encrypted", "-q"]) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn(".gpg", line) def test_filter_not_encrypted(self): stdout = StringIO() with patch("sys.stdout", stdout): execute_from_command_line(["", "listbackups", "--not-encrypted", "-q"]) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertNotIn(".gpg", line) def test_filter_compressed(self): stdout = StringIO() with patch("sys.stdout", stdout): execute_from_command_line(["", "listbackups", "--compressed", "-q"]) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn(".gz", line) def test_filter_not_compressed(self): stdout = StringIO() with patch("sys.stdout", stdout): execute_from_command_line(["", "listbackups", "--not-compressed", "-q"]) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertNotIn(".gz", line) def test_filter_db(self): stdout = StringIO() with patch("sys.stdout", stdout): execute_from_command_line(["", "listbackups", "--content-type", "db", "-q"]) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn(".db", line) def test_filter_media(self): stdout = StringIO() with patch("sys.stdout", stdout): execute_from_command_line( ["", "listbackups", "--content-type", "media", "-q"] ) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn(".tar", line) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/commands/test_mediabackup.py0000666000000000000000000000515714662175773022173 0ustar00""" Tests for mediabackup command. """ import contextlib import os import tempfile from django.test import TestCase from dbbackup.management.commands.mediabackup import Command as DbbackupCommand from dbbackup.storage import get_storage, get_storage_class from dbbackup.tests.utils import DEV_NULL, HANDLED_FILES, add_public_gpg class MediabackupBackupMediafilesTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.command = DbbackupCommand() self.command.servername = "foo-server" self.command.storage = get_storage() self.command.stdout = DEV_NULL self.command.compress = False self.command.encrypt = False self.command.path = None self.command.media_storage = get_storage_class()() self.command.filename = None def tearDown(self): if self.command.path is not None: with contextlib.suppress(OSError): os.remove(self.command.path) def test_func(self): self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES["written_files"])) def test_compress(self): self.command.compress = True self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES["written_files"])) self.assertTrue(HANDLED_FILES["written_files"][0][0].endswith(".gz")) def test_encrypt(self): self.command.encrypt = True add_public_gpg() self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES["written_files"])) outputfile = HANDLED_FILES["written_files"][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b"-----BEGIN PGP MESSAGE-----")) def test_compress_and_encrypt(self): self.command.compress = True self.command.encrypt = True add_public_gpg() self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES["written_files"])) outputfile = HANDLED_FILES["written_files"][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b"-----BEGIN PGP MESSAGE-----")) def test_write_local_file(self): self.command.path = tempfile.mktemp() self.command.backup_mediafiles() self.assertTrue(os.path.exists(self.command.path)) self.assertEqual(0, len(HANDLED_FILES["written_files"])) def test_output_filename(self): self.command.filename = "my_new_name.tar" self.command.backup_mediafiles() self.assertEqual(HANDLED_FILES["written_files"][0][0], self.command.filename) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7041192 django_dbbackup-4.2.1/dbbackup/tests/functional/0000777000000000000000000000000014662205136016631 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/functional/__init__.py0000666000000000000000000000000014314702350020722 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/functional/test_commands.py0000666000000000000000000002265114662175773022066 0ustar00import os import tempfile from unittest.mock import patch from django.conf import settings from django.core.management import execute_from_command_line from django.test import TransactionTestCase as TestCase from dbbackup.tests.testapp import models from dbbackup.tests.utils import ( HANDLED_FILES, TEST_DATABASE, add_private_gpg, add_public_gpg, clean_gpg_keys, ) class DbBackupCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() open(TEST_DATABASE["NAME"], "a").close() self.instance = models.CharModel.objects.create(field="foo") def tearDown(self): clean_gpg_keys() def test_database(self): argv = ["", "dbbackup", "--database=default"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] # Test file content outputfile.seek(0) self.assertTrue(outputfile.read()) def test_encrypt(self): argv = ["", "dbbackup", "--encrypt"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] self.assertTrue(filename.endswith(".gpg")) # Test file content outputfile = HANDLED_FILES["written_files"][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b"-----BEGIN PGP MESSAGE-----")) def test_compress(self): argv = ["", "dbbackup", "--compress"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] self.assertTrue(filename.endswith(".gz")) def test_compress_and_encrypt(self): argv = ["", "dbbackup", "--compress", "--encrypt"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] self.assertTrue(filename.endswith(".gz.gpg")) # Test file content outputfile = HANDLED_FILES["written_files"][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b"-----BEGIN PGP MESSAGE-----")) @patch("dbbackup.management.commands._base.input", return_value="y") class DbRestoreCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() add_private_gpg() open(TEST_DATABASE["NAME"], "a").close() self.instance = models.CharModel.objects.create(field="foo") def tearDown(self): clean_gpg_keys() def test_restore(self, *args): # Create backup execute_from_command_line(["", "dbbackup"]) self.instance.delete() # Restore execute_from_command_line(["", "dbrestore"]) restored = models.CharModel.objects.all().exists() self.assertTrue(restored) @patch("dbbackup.utils.getpass", return_value=None) def test_encrypted(self, *args): # Create backup execute_from_command_line(["", "dbbackup", "--encrypt"]) self.instance.delete() # Restore execute_from_command_line(["", "dbrestore", "--decrypt"]) restored = models.CharModel.objects.all().exists() self.assertTrue(restored) def test_compressed(self, *args): # Create backup execute_from_command_line(["", "dbbackup", "--compress"]) self.instance.delete() # Restore execute_from_command_line(["", "dbrestore", "--uncompress"]) restored = models.CharModel.objects.all().exists() self.assertTrue(restored) def test_no_backup_available(self, *args): with self.assertRaises(SystemExit): execute_from_command_line(["", "dbrestore"]) @patch("dbbackup.utils.getpass", return_value=None) def test_available_but_not_encrypted(self, *args): # Create backup execute_from_command_line(["", "dbbackup"]) # Restore with self.assertRaises(SystemExit): execute_from_command_line(["", "dbrestore", "--decrypt"]) def test_available_but_not_compressed(self, *args): # Create backup execute_from_command_line(["", "dbbackup"]) # Restore with self.assertRaises(SystemExit): execute_from_command_line(["", "dbrestore", "--uncompress"]) def test_specify_db(self, *args): # Create backup execute_from_command_line(["", "dbbackup", "--database", "default"]) # Test wrong name with self.assertRaises(SystemExit): execute_from_command_line(["", "dbrestore", "--database", "foo"]) # Restore execute_from_command_line(["", "dbrestore", "--database", "default"]) @patch("dbbackup.utils.getpass", return_value=None) def test_compressed_and_encrypted(self, *args): # Create backup execute_from_command_line(["", "dbbackup", "--compress", "--encrypt"]) self.instance.delete() # Restore execute_from_command_line(["", "dbrestore", "--uncompress", "--decrypt"]) restored = models.CharModel.objects.all().exists() self.assertTrue(restored) class MediaBackupCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() def tearDown(self): clean_gpg_keys() def test_encrypt(self): argv = ["", "mediabackup", "--encrypt"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] self.assertTrue(".gpg" in filename) # Test file content outputfile = HANDLED_FILES["written_files"][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b"-----BEGIN PGP MESSAGE-----")) def test_compress(self): argv = ["", "mediabackup", "--compress"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] self.assertTrue(".gz" in filename) @patch("dbbackup.utils.getpass", return_value=None) def test_compress_and_encrypted(self, getpass_mock): argv = ["", "mediabackup", "--compress", "--encrypt"] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES["written_files"])) filename, outputfile = HANDLED_FILES["written_files"][0] self.assertTrue(".gpg" in filename) self.assertTrue(".gz" in filename) # Test file content outputfile = HANDLED_FILES["written_files"][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b"-----BEGIN PGP MESSAGE-----")) @patch("dbbackup.management.commands._base.input", return_value="y") class MediaRestoreCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() add_private_gpg() def tearDown(self): clean_gpg_keys() self._emtpy_media() def _create_file(self, name=None): name = name or tempfile._RandomNameSequence().next() path = os.path.join(settings.MEDIA_ROOT, name) with open(path, "a+b") as fd: fd.write(b"foo") def _emtpy_media(self): for fi in os.listdir(settings.MEDIA_ROOT): os.remove(os.path.join(settings.MEDIA_ROOT, fi)) def _is_restored(self): return bool(os.listdir(settings.MEDIA_ROOT)) def test_restore(self, *args): # Create backup self._create_file("foo") execute_from_command_line(["", "mediabackup"]) self._emtpy_media() # Restore execute_from_command_line(["", "mediarestore"]) self.assertTrue(self._is_restored()) @patch("dbbackup.utils.getpass", return_value=None) def test_encrypted(self, *args): # Create backup self._create_file("foo") execute_from_command_line(["", "mediabackup", "--encrypt"]) self._emtpy_media() # Restore execute_from_command_line(["", "mediarestore", "--decrypt"]) self.assertTrue(self._is_restored()) def test_compressed(self, *args): # Create backup self._create_file("foo") execute_from_command_line(["", "mediabackup", "--compress"]) self._emtpy_media() # Restore execute_from_command_line(["", "mediarestore", "--uncompress"]) self.assertTrue(self._is_restored()) def test_no_backup_available(self, *args): with self.assertRaises(SystemExit): execute_from_command_line(["", "mediarestore"]) @patch("dbbackup.utils.getpass", return_value=None) def test_available_but_not_encrypted(self, *args): # Create backup execute_from_command_line(["", "mediabackup"]) # Restore with self.assertRaises(SystemExit): execute_from_command_line(["", "mediarestore", "--decrypt"]) def test_available_but_not_compressed(self, *args): # Create backup execute_from_command_line(["", "mediabackup"]) # Restore with self.assertRaises(SystemExit): execute_from_command_line(["", "mediarestore", "--uncompress"]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449435.0 django_dbbackup-4.2.1/dbbackup/tests/settings.py0000666000000000000000000000547314662201233016704 0ustar00""" Configuration and launcher for dbbackup tests. """ import os import sys import tempfile from dotenv import load_dotenv test = len(sys.argv) <= 1 or sys.argv[1] == "test" if not test: load_dotenv() DEBUG = False BASE_DIR = os.path.dirname(os.path.abspath(__file__)) TESTAPP_DIR = os.path.join(BASE_DIR, "testapp/") BLOB_DIR = os.path.join(TESTAPP_DIR, "blobs/") ADMINS = (("ham", "foo@bar"),) ALLOWED_HOSTS = ["*"] MIDDLEWARE_CLASSES = () ROOT_URLCONF = "dbbackup.tests.testapp.urls" SECRET_KEY = "it's a secret to everyone" SITE_ID = 1 MEDIA_ROOT = os.environ.get("MEDIA_ROOT") or tempfile.mkdtemp() INSTALLED_APPS = ( "dbbackup", "dbbackup.tests.testapp", ) DEFAULT_AUTO_FIELD = "django.db.models.AutoField" DATABASES = { "default": { "ENGINE": os.environ.get("DB_ENGINE", "django.db.backends.sqlite3"), "NAME": os.environ.get("DB_NAME", ":memory:"), "USER": os.environ.get("DB_USER"), "PASSWORD": os.environ.get("DB_PASSWORD"), "HOST": os.environ.get("DB_HOST"), } } if os.environ.get("CONNECTOR"): CONNECTOR = {"CONNECTOR": os.environ["CONNECTOR"]} DBBACKUP_CONNECTORS = {"default": CONNECTOR} CACHES = { "default": { "BACKEND": "django.core.cache.backends.locmem.LocMemCache", } } SERVER_EMAIL = "dbbackup@test.org" DBBACKUP_GPG_RECIPIENT = "test@test" DBBACKUP_GPG_ALWAYS_TRUST = (True,) DBBACKUP_STORAGE = os.environ.get("STORAGE", "dbbackup.tests.utils.FakeStorage") DBBACKUP_STORAGE_OPTIONS = dict( [ keyvalue.split("=") for keyvalue in os.environ.get("STORAGE_OPTIONS", "").split(",") if keyvalue ] ) # For testing the new storages setting introduced in Django 4.2 STORAGES = { "default": { "BACKEND": "django.core.files.storage.FileSystemStorage", "OPTIONS": {}, }, "dbbackup": { "BACKEND": DBBACKUP_STORAGE, "OPTIONS": DBBACKUP_STORAGE_OPTIONS, }, } LOGGING = { "version": 1, "disable_existing_loggers": False, "root": {"handlers": ["console"], "level": "DEBUG"}, "handlers": { "console": { "level": os.getenv("DJANGO_LOG_LEVEL", "INFO"), "class": "logging.StreamHandler", "formatter": "simple", } }, "formatters": { "verbose": { "format": "[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s", "datefmt": "%d/%b/%Y %H:%M:%S", }, "simple": {"format": "%(levelname)s %(message)s"}, }, "loggers": { "django.db.backends": { # uncomment to see all queries # 'level': 'DEBUG', "handlers": ["console"], } }, } # let there be silence DEFAULT_AUTO_FIELD = "django.db.models.AutoField" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_checks.py0000666000000000000000000000765614662175773017373 0ustar00from unittest.mock import patch from django.test import TestCase try: from dbbackup import checks from dbbackup.apps import DbbackupConfig except ImportError: checks = None def test_func(*args, **kwargs): return "foo" class ChecksTest(TestCase): def setUp(self): if checks is None: self.skipTest("Test framework has been released in Django 1.7") def test_check(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch("dbbackup.checks.settings.HOSTNAME", "") def test_hostname_invalid(self): expected_errors = [checks.W001] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.STORAGE", "") def test_hostname_storage(self): expected_errors = [checks.W002] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.FILENAME_TEMPLATE", test_func) def test_filename_template_is_callable(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch("dbbackup.checks.settings.FILENAME_TEMPLATE", "{datetime}.bak") def test_filename_template_is_string(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch("dbbackup.checks.settings.FILENAME_TEMPLATE", "foo.bak") def test_filename_template_no_date(self): expected_errors = [checks.W003] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE", test_func) def test_media_filename_template_is_callable(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch("dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE", "{datetime}.bak") def test_media_filename_template_is_string(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch("dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE", "foo.bak") def test_media_filename_template_no_date(self): expected_errors = [checks.W004] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.DATE_FORMAT", "foo@net.pt") def test_date_format_warning(self): expected_errors = [checks.W005] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.FAILURE_RECIPIENTS", "foo@net.pt") def test_Failure_recipients_warning(self): expected_errors = [checks.W006] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.FILENAME_TEMPLATE", "foo/bar-{datetime}.ext") def test_db_filename_template_with_slash(self): expected_errors = [checks.W007] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.FILENAME_TEMPLATE", lambda _: "foo/bar") def test_db_filename_template_callable_with_slash(self): expected_errors = [checks.W007] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE", "foo/bar-{datetime}.ext") def test_media_filename_template_with_slash(self): expected_errors = [checks.W008] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch("dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE", lambda _: "foo/bar") def test_media_filename_template_callable_with_slash(self): expected_errors = [checks.W008] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1724451421.708119 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/0000777000000000000000000000000014662205136017703 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/__init__.py0000666000000000000000000000000014314702350021774 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/test_base.py0000666000000000000000000000644414662175773022253 0ustar00import os from tempfile import SpooledTemporaryFile from django.test import TestCase from dbbackup.db import exceptions from dbbackup.db.base import BaseCommandDBConnector, BaseDBConnector, get_connector class GetConnectorTest(TestCase): def test_get_connector(self): connector = get_connector() self.assertIsInstance(connector, BaseDBConnector) class BaseDBConnectorTest(TestCase): def test_init(self): BaseDBConnector() def test_settings(self): connector = BaseDBConnector() connector.settings def test_generate_filename(self): connector = BaseDBConnector() connector.generate_filename() class BaseCommandDBConnectorTest(TestCase): def test_run_command(self): connector = BaseCommandDBConnector() stdout, stderr = connector.run_command("echo 123") self.assertEqual(stdout.read(), b"123\n") self.assertEqual(stderr.read(), b"") def test_run_command_error(self): connector = BaseCommandDBConnector() with self.assertRaises(exceptions.CommandConnectorError): connector.run_command("echa 123") def test_run_command_stdin(self): connector = BaseCommandDBConnector() stdin = SpooledTemporaryFile() stdin.write(b"foo") stdin.seek(0) # Run stdout, stderr = connector.run_command("cat", stdin=stdin) self.assertEqual(stdout.read(), b"foo") self.assertFalse(stderr.read()) def test_run_command_with_env(self): connector = BaseCommandDBConnector() # Empty env stdout, stderr = connector.run_command("env") self.assertTrue(stdout.read()) # env from self.env connector.env = {"foo": "bar"} stdout, stderr = connector.run_command("env") self.assertIn(b"foo=bar\n", stdout.read()) # method override global env stdout, stderr = connector.run_command("env", env={"foo": "ham"}) self.assertIn(b"foo=ham\n", stdout.read()) # get a var from parent env os.environ["bar"] = "foo" stdout, stderr = connector.run_command("env") self.assertIn(b"bar=foo\n", stdout.read()) # Conf overrides parendt env connector.env = {"bar": "bar"} stdout, stderr = connector.run_command("env") self.assertIn(b"bar=bar\n", stdout.read()) # method overrides all stdout, stderr = connector.run_command("env", env={"bar": "ham"}) self.assertIn(b"bar=ham\n", stdout.read()) def test_run_command_with_parent_env(self): connector = BaseCommandDBConnector(use_parent_env=False) # Empty env stdout, stderr = connector.run_command("env") self.assertFalse(stdout.read()) # env from self.env connector.env = {"foo": "bar"} stdout, stderr = connector.run_command("env") self.assertEqual(stdout.read(), b"foo=bar\n") # method override global env stdout, stderr = connector.run_command("env", env={"foo": "ham"}) self.assertEqual(stdout.read(), b"foo=ham\n") # no var from parent env os.environ["bar"] = "foo" stdout, stderr = connector.run_command("env") self.assertNotIn(b"bar=foo\n", stdout.read()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/test_mongodb.py0000666000000000000000000001051614662175773022761 0ustar00from io import BytesIO from unittest.mock import patch from django.test import TestCase from dbbackup.db.mongodb import MongoDumpConnector @patch( "dbbackup.db.mongodb.MongoDumpConnector.run_command", return_value=(BytesIO(b"foo"), BytesIO()), ) class MongoDumpConnectorTest(TestCase): def test_create_dump(self, mock_dump_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b"foo") # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_user(self, mock_dump_cmd): connector = MongoDumpConnector() # Without connector.settings.pop("USER", None) connector.create_dump() self.assertNotIn(" --user ", mock_dump_cmd.call_args[0][0]) # With connector.settings["USER"] = "foo" connector.create_dump() self.assertIn(" --username foo", mock_dump_cmd.call_args[0][0]) def test_create_dump_password(self, mock_dump_cmd): connector = MongoDumpConnector() # Without connector.settings.pop("PASSWORD", None) connector.create_dump() self.assertNotIn(" --password ", mock_dump_cmd.call_args[0][0]) # With connector.settings["PASSWORD"] = "foo" connector.create_dump() self.assertIn(" --password foo", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.mongodb.MongoDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch( "dbbackup.db.mongodb.MongoDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.settings.pop("USER", None) connector.restore_dump(dump) self.assertNotIn(" --username ", mock_restore_cmd.call_args[0][0]) # With connector.settings["USER"] = "foo" connector.restore_dump(dump) self.assertIn(" --username foo", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.mongodb.MongoDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_password(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.settings.pop("PASSWORD", None) connector.restore_dump(dump) self.assertNotIn(" --password ", mock_restore_cmd.call_args[0][0]) # With connector.settings["PASSWORD"] = "foo" connector.restore_dump(dump) self.assertIn(" --password foo", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.mongodb.MongoDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_object_check(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.object_check = False connector.restore_dump(dump) self.assertNotIn("--objcheck", mock_restore_cmd.call_args[0][0]) # With connector.object_check = True connector.restore_dump(dump) self.assertIn(" --objcheck", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.mongodb.MongoDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_drop(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.drop = False connector.restore_dump(dump) self.assertNotIn("--drop", mock_restore_cmd.call_args[0][0]) # With connector.drop = True connector.restore_dump(dump) self.assertIn(" --drop", mock_restore_cmd.call_args[0][0]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/test_mysql.py0000666000000000000000000001354114662175773022502 0ustar00from io import BytesIO from unittest.mock import patch from django.test import TestCase from dbbackup.db.mysql import MysqlDumpConnector @patch( "dbbackup.db.mysql.MysqlDumpConnector.run_command", return_value=(BytesIO(b"foo"), BytesIO()), ) class MysqlDumpConnectorTest(TestCase): def test_create_dump(self, mock_dump_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b"foo") # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_host(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop("HOST", None) connector.create_dump() self.assertNotIn(" --host=", mock_dump_cmd.call_args[0][0]) # With connector.settings["HOST"] = "foo" connector.create_dump() self.assertIn(" --host=foo", mock_dump_cmd.call_args[0][0]) def test_create_dump_port(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop("PORT", None) connector.create_dump() self.assertNotIn(" --port=", mock_dump_cmd.call_args[0][0]) # With connector.settings["PORT"] = 42 connector.create_dump() self.assertIn(" --port=42", mock_dump_cmd.call_args[0][0]) def test_create_dump_user(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop("USER", None) connector.create_dump() self.assertNotIn(" --user=", mock_dump_cmd.call_args[0][0]) # With connector.settings["USER"] = "foo" connector.create_dump() self.assertIn(" --user=foo", mock_dump_cmd.call_args[0][0]) def test_create_dump_password(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop("PASSWORD", None) connector.create_dump() self.assertNotIn(" --password=", mock_dump_cmd.call_args[0][0]) # With connector.settings["PASSWORD"] = "foo" connector.create_dump() self.assertIn(" --password=foo", mock_dump_cmd.call_args[0][0]) def test_create_dump_exclude(self, mock_dump_cmd): connector = MysqlDumpConnector() connector.settings["NAME"] = "db" # Without connector.create_dump() self.assertNotIn(" --ignore-table=", mock_dump_cmd.call_args[0][0]) # With connector.exclude = ("foo",) connector.create_dump() self.assertIn(" --ignore-table=db.foo", mock_dump_cmd.call_args[0][0]) # With several connector.exclude = ("foo", "bar") connector.create_dump() self.assertIn(" --ignore-table=db.foo", mock_dump_cmd.call_args[0][0]) self.assertIn(" --ignore-table=db.bar", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.mysql.MysqlDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch( "dbbackup.db.mysql.MysqlDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_host(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop("HOST", None) connector.restore_dump(dump) self.assertNotIn(" --host=foo", mock_restore_cmd.call_args[0][0]) # With connector.settings["HOST"] = "foo" connector.restore_dump(dump) self.assertIn(" --host=foo", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.mysql.MysqlDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_port(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop("PORT", None) connector.restore_dump(dump) self.assertNotIn(" --port=", mock_restore_cmd.call_args[0][0]) # With connector.settings["PORT"] = 42 connector.restore_dump(dump) self.assertIn(" --port=42", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.mysql.MysqlDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop("USER", None) connector.restore_dump(dump) self.assertNotIn(" --user=", mock_restore_cmd.call_args[0][0]) # With connector.settings["USER"] = "foo" connector.restore_dump(dump) self.assertIn(" --user=foo", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.mysql.MysqlDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_password(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop("PASSWORD", None) connector.restore_dump(dump) self.assertNotIn(" --password=", mock_restore_cmd.call_args[0][0]) # With connector.settings["PASSWORD"] = "foo" connector.restore_dump(dump) self.assertIn(" --password=foo", mock_restore_cmd.call_args[0][0]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/test_postgresql.py0000666000000000000000000003251414662175773023541 0ustar00from io import BytesIO from unittest.mock import patch from django.test import TestCase from dbbackup.db.postgresql import ( PgDumpBinaryConnector, PgDumpConnector, PgDumpGisConnector, ) @patch( "dbbackup.db.postgresql.PgDumpConnector.run_command", return_value=(BytesIO(b"foo"), BytesIO()), ) class PgDumpConnectorTest(TestCase): def setUp(self): self.connector = PgDumpConnector() self.connector.settings["ENGINE"] = "django.db.backends.postgresql" self.connector.settings["NAME"] = "dbname" self.connector.settings["HOST"] = "hostname" def test_user_password_uses_special_characters(self, mock_dump_cmd): self.connector.settings["PASSWORD"] = "@!" self.connector.settings["USER"] = "@" self.connector.create_dump() self.assertIn( "postgresql://%40:%40%21@hostname/dbname", mock_dump_cmd.call_args[0][0] ) def test_create_dump(self, mock_dump_cmd): dump = self.connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b"foo") # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_without_host(self, mock_dump_cmd): # this is allowed now: https://github.com/jazzband/django-dbbackup/issues/520 self.connector.settings.pop("HOST", None) self.connector.create_dump() def test_password_but_no_user(self, mock_dump_cmd): self.connector.settings.pop("USER", None) self.connector.settings["PASSWORD"] = "hello" self.connector.create_dump() self.assertIn("postgresql://hostname/dbname", mock_dump_cmd.call_args[0][0]) def test_create_dump_host(self, mock_dump_cmd): # With self.connector.settings["HOST"] = "foo" self.connector.create_dump() self.assertIn("postgresql://foo/dbname", mock_dump_cmd.call_args[0][0]) def test_create_dump_port(self, mock_dump_cmd): # Without self.connector.settings.pop("PORT", None) self.connector.create_dump() self.assertIn("postgresql://hostname/dbname", mock_dump_cmd.call_args[0][0]) # With self.connector.settings["PORT"] = 42 self.connector.create_dump() self.assertIn("postgresql://hostname:42/dbname", mock_dump_cmd.call_args[0][0]) def test_create_dump_user(self, mock_dump_cmd): # Without self.connector.settings.pop("USER", None) self.connector.create_dump() self.assertIn("postgresql://hostname/dbname", mock_dump_cmd.call_args[0][0]) # With self.connector.settings["USER"] = "foo" self.connector.create_dump() self.assertIn("postgresql://foo@hostname/dbname", mock_dump_cmd.call_args[0][0]) def test_create_dump_exclude(self, mock_dump_cmd): # Without self.connector.create_dump() self.assertNotIn(" --exclude-table-data=", mock_dump_cmd.call_args[0][0]) # With self.connector.exclude = ("foo",) self.connector.create_dump() self.assertIn(" --exclude-table-data=foo", mock_dump_cmd.call_args[0][0]) # With several self.connector.exclude = ("foo", "bar") self.connector.create_dump() self.assertIn(" --exclude-table-data=foo", mock_dump_cmd.call_args[0][0]) self.assertIn(" --exclude-table-data=bar", mock_dump_cmd.call_args[0][0]) def test_create_dump_drop(self, mock_dump_cmd): # Without self.connector.drop = False self.connector.create_dump() self.assertNotIn(" --clean", mock_dump_cmd.call_args[0][0]) # With self.connector.drop = True self.connector.create_dump() self.assertIn(" --clean", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.postgresql.PgDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): dump = self.connector.create_dump() self.connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch( "dbbackup.db.postgresql.PgDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): dump = self.connector.create_dump() # Without self.connector.settings.pop("USER", None) self.connector.restore_dump(dump) self.assertIn("postgresql://hostname/dbname", mock_restore_cmd.call_args[0][0]) self.assertNotIn(" --username=", mock_restore_cmd.call_args[0][0]) # With self.connector.settings["USER"] = "foo" self.connector.restore_dump(dump) self.assertIn( "postgresql://foo@hostname/dbname", mock_restore_cmd.call_args[0][0] ) def test_create_dump_schema(self, mock_dump_cmd): # Without self.connector.create_dump() self.assertNotIn(" -n ", mock_dump_cmd.call_args[0][0]) # With self.connector.schemas = ["public"] self.connector.create_dump() self.assertIn(" -n public", mock_dump_cmd.call_args[0][0]) # With several self.connector.schemas = ["public", "foo"] self.connector.create_dump() self.assertIn(" -n public", mock_dump_cmd.call_args[0][0]) self.assertIn(" -n foo", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.postgresql.PgDumpConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_schema(self, mock_dump_cmd, mock_restore_cmd): # Without dump = self.connector.create_dump() self.connector.restore_dump(dump) self.assertNotIn(" -n ", mock_restore_cmd.call_args[0][0]) # With self.connector.schemas = ["public"] dump = self.connector.create_dump() self.connector.restore_dump(dump) self.assertIn(" -n public", mock_restore_cmd.call_args[0][0]) # With several self.connector.schemas = ["public", "foo"] dump = self.connector.create_dump() self.connector.restore_dump(dump) self.assertIn(" -n public", mock_restore_cmd.call_args[0][0]) self.assertIn(" -n foo", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.postgresql.PgDumpBinaryConnector.run_command", return_value=(BytesIO(b"foo"), BytesIO()), ) class PgDumpBinaryConnectorTest(TestCase): def setUp(self): self.connector = PgDumpBinaryConnector() self.connector.settings["HOST"] = "hostname" self.connector.settings["ENGINE"] = "django.db.backends.postgresql" self.connector.settings["NAME"] = "dbname" def test_create_dump(self, mock_dump_cmd): dump = self.connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b"foo") # Test cmd self.assertTrue(mock_dump_cmd.called) self.assertIn("--format=custom", mock_dump_cmd.call_args[0][0]) def test_create_dump_exclude(self, mock_dump_cmd): # Without self.connector.create_dump() self.assertNotIn(" --exclude-table-data=", mock_dump_cmd.call_args[0][0]) # With self.connector.exclude = ("foo",) self.connector.create_dump() self.assertIn(" --exclude-table-data=foo", mock_dump_cmd.call_args[0][0]) # With several self.connector.exclude = ("foo", "bar") self.connector.create_dump() self.assertIn(" --exclude-table-data=foo", mock_dump_cmd.call_args[0][0]) self.assertIn(" --exclude-table-data=bar", mock_dump_cmd.call_args[0][0]) def test_create_dump_drop(self, mock_dump_cmd): # Without self.connector.drop = False self.connector.create_dump() self.assertNotIn(" --clean", mock_dump_cmd.call_args[0][0]) # Binary drop at restore level self.connector.drop = True self.connector.create_dump() self.assertNotIn(" --clean", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.postgresql.PgDumpBinaryConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): dump = self.connector.create_dump() self.connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) def test_create_dump_schema(self, mock_dump_cmd): # Without self.connector.create_dump() self.assertNotIn(" -n ", mock_dump_cmd.call_args[0][0]) # With self.connector.schemas = ["public"] self.connector.create_dump() self.assertIn(" -n public", mock_dump_cmd.call_args[0][0]) # With several self.connector.schemas = ["public", "foo"] self.connector.create_dump() self.assertIn(" -n public", mock_dump_cmd.call_args[0][0]) self.assertIn(" -n foo", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.postgresql.PgDumpBinaryConnector.run_command", return_value=(BytesIO(), BytesIO()), ) def test_restore_dump_schema(self, mock_dump_cmd, mock_restore_cmd): # Without dump = self.connector.create_dump() self.connector.restore_dump(dump) self.assertNotIn(" -n ", mock_restore_cmd.call_args[0][0]) # With self.connector.schemas = ["public"] dump = self.connector.create_dump() self.connector.restore_dump(dump) self.assertIn(" -n public", mock_restore_cmd.call_args[0][0]) # With several self.connector.schemas = ["public", "foo"] dump = self.connector.create_dump() self.connector.restore_dump(dump) self.assertIn(" -n public", mock_restore_cmd.call_args[0][0]) self.assertIn(" -n foo", mock_restore_cmd.call_args[0][0]) @patch( "dbbackup.db.postgresql.PgDumpGisConnector.run_command", return_value=(BytesIO(b"foo"), BytesIO()), ) class PgDumpGisConnectorTest(TestCase): def setUp(self): self.connector = PgDumpGisConnector() self.connector.settings["HOST"] = "hostname" @patch( "dbbackup.db.postgresql.PgDumpGisConnector.run_command", return_value=(BytesIO(b"foo"), BytesIO()), ) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): dump = self.connector.create_dump() # Without ADMINUSER self.connector.settings.pop("ADMIN_USER", None) self.connector.restore_dump(dump) self.assertTrue(mock_restore_cmd.called) # With self.connector.settings["ADMIN_USER"] = "foo" self.connector.restore_dump(dump) self.assertTrue(mock_restore_cmd.called) def test_enable_postgis(self, mock_dump_cmd): self.connector.settings["ADMIN_USER"] = "foo" self.connector._enable_postgis() self.assertIn( '"CREATE EXTENSION IF NOT EXISTS postgis;"', mock_dump_cmd.call_args[0][0] ) self.assertIn("--username=foo", mock_dump_cmd.call_args[0][0]) def test_enable_postgis_host(self, mock_dump_cmd): self.connector.settings["ADMIN_USER"] = "foo" # Without self.connector.settings.pop("HOST", None) self.connector._enable_postgis() self.assertNotIn(" --host=", mock_dump_cmd.call_args[0][0]) # With self.connector.settings["HOST"] = "foo" self.connector._enable_postgis() self.assertIn(" --host=foo", mock_dump_cmd.call_args[0][0]) def test_enable_postgis_port(self, mock_dump_cmd): self.connector.settings["ADMIN_USER"] = "foo" # Without self.connector.settings.pop("PORT", None) self.connector._enable_postgis() self.assertNotIn(" --port=", mock_dump_cmd.call_args[0][0]) # With self.connector.settings["PORT"] = 42 self.connector._enable_postgis() self.assertIn(" --port=42", mock_dump_cmd.call_args[0][0]) @patch( "dbbackup.db.base.Popen", **{ "return_value.wait.return_value": True, "return_value.poll.return_value": False, }, ) class PgDumpConnectorRunCommandTest(TestCase): def test_run_command(self, mock_popen): connector = PgDumpConnector() connector.settings["HOST"] = "hostname" connector.create_dump() self.assertEqual(mock_popen.call_args[0][0][0], "pg_dump") def test_run_command_with_password(self, mock_popen): connector = PgDumpConnector() connector.settings["HOST"] = "hostname" connector.settings["PASSWORD"] = "foo" connector.create_dump() self.assertEqual(mock_popen.call_args[0][0][0], "pg_dump") def test_run_command_with_password_and_other(self, mock_popen): connector = PgDumpConnector(env={"foo": "bar"}) connector.settings["HOST"] = "hostname" connector.settings["PASSWORD"] = "foo" connector.create_dump() self.assertEqual(mock_popen.call_args[0][0][0], "pg_dump") self.assertIn("foo", mock_popen.call_args[1]["env"]) self.assertEqual("bar", mock_popen.call_args[1]["env"]["foo"]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449435.0 django_dbbackup-4.2.1/dbbackup/tests/test_connectors/test_sqlite.py0000666000000000000000000000430114662201233022605 0ustar00from io import BytesIO from unittest.mock import mock_open, patch from django.db import connection from django.test import TestCase from dbbackup.db.sqlite import SqliteConnector, SqliteCPConnector from dbbackup.tests.testapp.models import CharModel, TextModel class SqliteConnectorTest(TestCase): def test_write_dump(self): dump_file = BytesIO() connector = SqliteConnector() connector._write_dump(dump_file) dump_file.seek(0) for line in dump_file: self.assertTrue(line.strip().endswith(b";")) def test_create_dump(self): connector = SqliteConnector() dump = connector.create_dump() self.assertTrue(dump.read()) def test_create_dump_with_unicode(self): CharModel.objects.create(field="\xe9") connector = SqliteConnector() dump = connector.create_dump() self.assertTrue(dump.read()) def test_create_dump_with_newline(self): TextModel.objects.create( field=f'INSERT ({"foo" * 5000}\nbar\n WHERE \nbaz IS\n "great" );\n' ) connector = SqliteConnector() dump = connector.create_dump() self.assertTrue(dump.read()) def test_restore_dump(self): TextModel.objects.create(field="T\nf\nw\nnl") connector = SqliteConnector() dump = connector.create_dump() connector.restore_dump(dump) def test_create_dump_with_virtual_tables(self): with connection.cursor() as c: c.execute("CREATE VIRTUAL TABLE lookup USING fts5(field)") connector = SqliteConnector() dump = connector.create_dump() self.assertTrue(dump.read()) @patch("dbbackup.db.sqlite.open", mock_open(read_data=b"foo"), create=True) class SqliteCPConnectorTest(TestCase): def test_create_dump(self): connector = SqliteCPConnector() dump = connector.create_dump() dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b"foo") def test_restore_dump(self): connector = SqliteCPConnector() dump = connector.create_dump() connector.restore_dump(dump) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_log.py0000666000000000000000000001060314662175773016676 0ustar00import logging from unittest.mock import patch import django from django.core import mail from django.test import TestCase from testfixtures import log_capture from dbbackup import log class LoggerDefaultTestCase(TestCase): @log_capture() def test_root(self, captures): logger = logging.getLogger() logger.debug("a noise") logger.info("a message") logger.warning("a warning") logger.error("an error") logger.critical("a critical error") captures.check( ("root", "DEBUG", "a noise"), ("root", "INFO", "a message"), ("root", "WARNING", "a warning"), ("root", "ERROR", "an error"), ("root", "CRITICAL", "a critical error"), ) @log_capture() def test_django(self, captures): logger = logging.getLogger("django") logger.debug("a noise") logger.info("a message") logger.warning("a warning") logger.error("an error") logger.critical("a critical error") if django.VERSION < (1, 9): captures.check( ("django", "DEBUG", "a noise"), ("django", "INFO", "a message"), ("django", "WARNING", "a warning"), ("django", "ERROR", "an error"), ("django", "CRITICAL", "a critical error"), ) else: captures.check( ("django", "INFO", "a message"), ("django", "WARNING", "a warning"), ("django", "ERROR", "an error"), ("django", "CRITICAL", "a critical error"), ) @log_capture() def test_dbbackup(self, captures): logger = logging.getLogger("dbbackup") logger.debug("a noise") logger.info("a message") logger.warning("a warning") logger.error("an error") logger.critical("a critical error") captures.check( ("dbbackup", "INFO", "a message"), ("dbbackup", "WARNING", "a warning"), ("dbbackup", "ERROR", "an error"), ("dbbackup", "CRITICAL", "a critical error"), ) @log_capture() def test_dbbackup_storage(self, captures): logger = logging.getLogger("dbbackup.storage") logger.debug("a noise") logger.info("a message") logger.warning("a warning") logger.error("an error") logger.critical("a critical error") captures.check( ("dbbackup.storage", "INFO", "a message"), ("dbbackup.storage", "WARNING", "a warning"), ("dbbackup.storage", "ERROR", "an error"), ("dbbackup.storage", "CRITICAL", "a critical error"), ) @log_capture() def test_other_module(self, captures): logger = logging.getLogger("os.path") logger.debug("a noise") logger.info("a message") logger.warning("a warning") logger.error("an error") logger.critical("a critical error") captures.check( ("os.path", "DEBUG", "a noise"), ("os.path", "INFO", "a message"), ("os.path", "WARNING", "a warning"), ("os.path", "ERROR", "an error"), ("os.path", "CRITICAL", "a critical error"), ) class DbbackupAdminEmailHandlerTest(TestCase): def setUp(self): self.logger = logging.getLogger("dbbackup") @patch("dbbackup.settings.SEND_EMAIL", True) def test_send_mail(self): # Test mail error msg = "Super msg" self.logger.error(msg) self.assertEqual(mail.outbox[0].subject, "[dbbackup] ERROR: Super msg") # Test don't mail below self.logger.warning(msg) self.assertEqual(len(mail.outbox), 1) @patch("dbbackup.settings.SEND_EMAIL", False) def test_send_mail_is_false(self): msg = "Super msg" self.logger.error(msg) self.assertEqual(len(mail.outbox), 0) class MailEnabledFilterTest(TestCase): @patch("dbbackup.settings.SEND_EMAIL", True) def test_filter_is_true(self): filter_ = log.MailEnabledFilter() self.assertTrue(filter_.filter("foo")) @patch("dbbackup.settings.SEND_EMAIL", False) def test_filter_is_false(self): filter_ = log.MailEnabledFilter() self.assertFalse(filter_.filter("foo")) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_storage.py0000666000000000000000000002052714662175773017567 0ustar00from unittest.mock import patch from django.test import TestCase from dbbackup import utils from dbbackup.storage import Storage, get_storage, get_storage_class from dbbackup.tests.utils import HANDLED_FILES, FakeStorage DEFAULT_STORAGE_PATH = "django.core.files.storage.FileSystemStorage" STORAGE_OPTIONS = {"location": "/tmp"} class Get_StorageTest(TestCase): @patch("dbbackup.settings.STORAGE", DEFAULT_STORAGE_PATH) @patch("dbbackup.settings.STORAGE_OPTIONS", STORAGE_OPTIONS) def test_func(self, *args): self.assertIsInstance(get_storage(), Storage) def test_set_path(self): fake_storage_path = "dbbackup.tests.utils.FakeStorage" storage = get_storage(fake_storage_path) self.assertIsInstance(storage.storage, FakeStorage) @patch("dbbackup.settings.STORAGE", DEFAULT_STORAGE_PATH) def test_set_options(self, *args): storage = get_storage(options=STORAGE_OPTIONS) self.assertIn( storage.storage.__module__, # TODO: Remove "django.core.files.storage" case when dropping support for Django < 4.2. ("django.core.files.storage", "django.core.files.storage.filesystem"), ) def test_get_storage_class(self): storage_class = get_storage_class(DEFAULT_STORAGE_PATH) self.assertIn( storage_class.__module__, ("django.core.files.storage", "django.core.files.storage.filesystem"), ) self.assertIn(storage_class.__name__, ("FileSystemStorage", "DefaultStorage")) storage_class = get_storage_class("dbbackup.tests.utils.FakeStorage") self.assertEqual(storage_class.__module__, "dbbackup.tests.utils") self.assertEqual(storage_class.__name__, "FakeStorage") def test_default_storage_class(self): storage_class = get_storage_class() self.assertIn( storage_class.__module__, ("django.core.files.storage", "django.core.files.storage.filesystem"), ) self.assertIn(storage_class.__name__, ("FileSystemStorage", "DefaultStorage")) def test_invalid_storage_class_path(self): with self.assertRaises(ImportError): get_storage_class("invalid.path.to.StorageClass") def test_storages_settings(self): from .settings import STORAGES self.assertIsInstance(STORAGES, dict) self.assertEqual( STORAGES["dbbackup"]["BACKEND"], "dbbackup.tests.utils.FakeStorage" ) from dbbackup.settings import DJANGO_STORAGES, STORAGE self.assertIsInstance(DJANGO_STORAGES, dict) self.assertEqual(DJANGO_STORAGES, STORAGES) self.assertEqual(STORAGES["dbbackup"]["BACKEND"], STORAGE) storage = get_storage() self.assertEqual(storage.storage.__class__.__module__, "dbbackup.tests.utils") self.assertEqual(storage.storage.__class__.__name__, "FakeStorage") def test_storages_settings_options(self): from dbbackup.settings import STORAGE_OPTIONS from .settings import STORAGES self.assertEqual(STORAGES["dbbackup"]["OPTIONS"], STORAGE_OPTIONS) class StorageTest(TestCase): def setUp(self): self.storageCls = Storage self.storageCls.name = "foo" self.storage = Storage() class StorageListBackupsTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.storage = get_storage() # foodb files HANDLED_FILES["written_files"] += [ (utils.filename_generate(ext, "foodb"), None) for ext in ("db", "db.gz", "db.gpg", "db.gz.gpg") ] HANDLED_FILES["written_files"] += [ (utils.filename_generate(ext, "hamdb", "fooserver"), None) for ext in ("db", "db.gz", "db.gpg", "db.gz.gpg") ] # Media file HANDLED_FILES["written_files"] += [ (utils.filename_generate(ext, None, None, "media"), None) for ext in ("tar", "tar.gz", "tar.gpg", "tar.gz.gpg") ] HANDLED_FILES["written_files"] += [ (utils.filename_generate(ext, "bardb", "barserver"), None) for ext in ("db", "db.gz", "db.gpg", "db.gz.gpg") ] # barserver files HANDLED_FILES["written_files"] += [("file_without_date", None)] def test_nofilter(self): files = self.storage.list_backups() self.assertEqual(len(HANDLED_FILES["written_files"]) - 1, len(files)) for file in files: self.assertNotEqual("file_without_date", file) def test_encrypted(self): files = self.storage.list_backups(encrypted=True) for file in files: self.assertIn(".gpg", file) def test_compressed(self): files = self.storage.list_backups(compressed=True) for file in files: self.assertIn(".gz", file) def test_not_encrypted(self): files = self.storage.list_backups(encrypted=False) for file in files: self.assertNotIn(".gpg", file) def test_not_compressed(self): files = self.storage.list_backups(compressed=False) for file in files: self.assertNotIn(".gz", file) def test_content_type_db(self): files = self.storage.list_backups(content_type="db") for file in files: self.assertIn(".db", file) def test_database(self): files = self.storage.list_backups(database="foodb") for file in files: self.assertIn("foodb", file) self.assertNotIn("bardb", file) self.assertNotIn("hamdb", file) def test_servername(self): files = self.storage.list_backups(servername="fooserver") for file in files: self.assertIn("fooserver", file) self.assertNotIn("barserver", file) files = self.storage.list_backups(servername="barserver") for file in files: self.assertIn("barserver", file) self.assertNotIn("fooserver", file) def test_content_type_media(self): files = self.storage.list_backups(content_type="media") for file in files: self.assertIn(".tar", file) # def test_servername(self): # files = self.storage.list_backups(servername='barserver') # for file in files: # self.assertIn('barserver', file) class StorageGetLatestTest(TestCase): def setUp(self): self.storage = get_storage() HANDLED_FILES["written_files"] = [ (f, None) for f in [ "2015-02-06-042810.bak", "2015-02-07-042810.bak", "2015-02-08-042810.bak", ] ] def tearDown(self): HANDLED_FILES.clean() def test_func(self): filename = self.storage.get_latest_backup() self.assertEqual(filename, "2015-02-08-042810.bak") class StorageGetMostRecentTest(TestCase): def setUp(self): self.storage = get_storage() HANDLED_FILES["written_files"] = [ (f, None) for f in [ "2015-02-06-042810.bak", "2015-02-07-042810.bak", "2015-02-08-042810.bak", ] ] def tearDown(self): HANDLED_FILES.clean() def test_func(self): filename = self.storage.get_older_backup() self.assertEqual(filename, "2015-02-06-042810.bak") def keep_only_even_files(filename): from dbbackup.utils import filename_to_date return filename_to_date(filename).day % 2 == 0 class StorageCleanOldBackupsTest(TestCase): def setUp(self): self.storage = get_storage() HANDLED_FILES.clean() HANDLED_FILES["written_files"] = [ (f, None) for f in [ "2015-02-06-042810.bak", "2015-02-07-042810.bak", "2015-02-08-042810.bak", ] ] def test_func(self): self.storage.clean_old_backups(keep_number=1) self.assertEqual(2, len(HANDLED_FILES["deleted_files"])) @patch("dbbackup.settings.CLEANUP_KEEP_FILTER", keep_only_even_files) def test_keep_filter(self): self.storage.clean_old_backups(keep_number=1) self.assertListEqual(["2015-02-07-042810.bak"], HANDLED_FILES["deleted_files"]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/test_utils.py0000666000000000000000000002221114662175773017253 0ustar00import os import tempfile from datetime import datetime from io import StringIO from unittest.mock import patch import django import pytz from django.core import mail from django.test import TestCase from dbbackup import settings, utils from dbbackup.tests.utils import ( COMPRESSED_FILE, ENCRYPTED_FILE, add_private_gpg, add_public_gpg, callable_for_filename_template, clean_gpg_keys, ) class Bytes_To_StrTest(TestCase): def test_get_gb(self): value = utils.bytes_to_str(byteVal=2**31) self.assertEqual(value, "2.0 GiB") def test_0_decimal(self): value = utils.bytes_to_str(byteVal=1.01, decimals=0) self.assertEqual(value, "1 B") def test_2_decimal(self): value = utils.bytes_to_str(byteVal=1.01, decimals=2) self.assertEqual(value, "1.01 B") class Handle_SizeTest(TestCase): def test_func(self): filehandle = StringIO("Test string") value = utils.handle_size(filehandle=filehandle) self.assertEqual(value, "11.0 B") class MailAdminsTest(TestCase): def test_func(self): subject = "foo subject" msg = "bar message" utils.mail_admins(subject, msg) self.assertEqual(len(mail.outbox), 1) sent_mail = mail.outbox[0] expected_subject = f"{settings.EMAIL_SUBJECT_PREFIX}{subject}" expected_to = settings.ADMINS[0][1] expected_from = settings.SERVER_EMAIL self.assertEqual(sent_mail.subject, expected_subject) self.assertEqual(sent_mail.body, msg) self.assertEqual(sent_mail.to[0], expected_to) self.assertEqual(sent_mail.from_email, expected_from) @patch("dbbackup.settings.ADMINS", None) def test_no_admin(self): subject = "foo subject" msg = "bar message" self.assertIsNone(utils.mail_admins(subject, msg)) self.assertEqual(len(mail.outbox), 0) class Email_Uncaught_ExceptionTest(TestCase): def test_success(self): def func(): pass utils.email_uncaught_exception(func) self.assertEqual(len(mail.outbox), 0) @patch("dbbackup.settings.SEND_EMAIL", False) def test_raise_error_without_mail(self): def func(): raise Exception("Foo") with self.assertRaises(Exception): utils.email_uncaught_exception(func)() self.assertEqual(len(mail.outbox), 0) @patch("dbbackup.settings.SEND_EMAIL", True) @patch("dbbackup.settings.FAILURE_RECIPIENTS", ["foo@bar"]) def test_raise_with_mail(self): def func(): raise Exception("Foo") with self.assertRaises(Exception): utils.email_uncaught_exception(func)() self.assertEqual(len(mail.outbox), 1) error_mail = mail.outbox[0] self.assertEqual(["foo@bar"], error_mail.to) self.assertIn('Exception("Foo")', error_mail.subject) if django.VERSION >= (1, 7): self.assertIn('Exception("Foo")', error_mail.body) class Encrypt_FileTest(TestCase): def setUp(self): self.path = tempfile.mktemp() with open(self.path, "a") as fd: fd.write("foo") add_public_gpg() def tearDown(self): os.remove(self.path) clean_gpg_keys() def test_func(self, *args): with open(self.path) as fd: encrypted_file, filename = utils.encrypt_file( inputfile=fd, filename="foo.txt" ) encrypted_file.seek(0) self.assertTrue(encrypted_file.read()) class Unencrypt_FileTest(TestCase): def setUp(self): add_private_gpg() def tearDown(self): clean_gpg_keys() @patch("dbbackup.utils.input", return_value=None) @patch("dbbackup.utils.getpass", return_value=None) def test_unencrypt(self, *args): inputfile = open(ENCRYPTED_FILE, "r+b") uncryptfile, filename = utils.unencrypt_file(inputfile, "foofile.gpg") uncryptfile.seek(0) self.assertEqual(b"foo\n", uncryptfile.read()) class Compress_FileTest(TestCase): def setUp(self): self.path = tempfile.mktemp() with open(self.path, "a+b") as fd: fd.write(b"foo") def tearDown(self): os.remove(self.path) def test_func(self, *args): with open(self.path) as fd: compressed_file, filename = utils.encrypt_file( inputfile=fd, filename="foo.txt" ) class Uncompress_FileTest(TestCase): def test_func(self): inputfile = open(COMPRESSED_FILE, "rb") fd, filename = utils.uncompress_file(inputfile, "foo.gz") fd.seek(0) self.assertEqual(fd.read(), b"foo\n") class Create_Spooled_Temporary_FileTest(TestCase): def setUp(self): self.path = tempfile.mktemp() with open(self.path, "a") as fd: fd.write("foo") def tearDown(self): os.remove(self.path) def test_func(self, *args): utils.create_spooled_temporary_file(filepath=self.path) class TimestampTest(TestCase): def test_naive_value(self): with self.settings(USE_TZ=False): timestamp = utils.timestamp(datetime(2015, 8, 15, 8, 15, 12, 0)) self.assertEqual(timestamp, "2015-08-15-081512") def test_aware_value(self): with self.settings(USE_TZ=True) and self.settings(TIME_ZONE="Europe/Rome"): timestamp = utils.timestamp( datetime(2015, 8, 15, 8, 15, 12, 0, tzinfo=pytz.utc) ) self.assertEqual(timestamp, "2015-08-15-101512") class Datefmt_To_Regex(TestCase): def test_patterns(self): now = datetime.now() for datefmt, regex in utils.PATTERN_MATCHNG: date_string = datetime.strftime(now, datefmt) regex = utils.datefmt_to_regex(datefmt) match = regex.match(date_string) self.assertTrue(match) self.assertEqual(match.groups()[0], date_string) def test_complex_pattern(self): now = datetime.now() datefmt = "Foo%a_%A-%w-%d-%b-%B_%m_%y_%Y-%H-%I-%M_%S_%f_%j-%U-%W-Bar" date_string = datetime.strftime(now, datefmt) regex = utils.datefmt_to_regex(datefmt) self.assertTrue(regex.pattern.startswith("(Foo")) self.assertTrue(regex.pattern.endswith("Bar)")) match = regex.match(date_string) self.assertTrue(match) self.assertEqual(match.groups()[0], date_string) class Filename_To_DatestringTest(TestCase): def test_func(self): now = datetime.now() datefmt = settings.DATE_FORMAT filename = f"{datetime.strftime(now, datefmt)}-foo.gz.gpg" datestring = utils.filename_to_datestring(filename, datefmt) self.assertIn(datestring, filename) def test_generated_filename(self): filename = utils.filename_generate("bak", "default") datestring = utils.filename_to_datestring(filename) self.assertIn(datestring, filename) class Filename_To_DateTest(TestCase): def test_func(self): now = datetime.now() datefmt = settings.DATE_FORMAT filename = f"{datetime.strftime(now, datefmt)}-foo.gz.gpg" date = utils.filename_to_date(filename, datefmt) self.assertEqual(date.timetuple()[:5], now.timetuple()[:5]) def test_generated_filename(self): filename = utils.filename_generate("bak", "default") utils.filename_to_date(filename) @patch("dbbackup.settings.HOSTNAME", "test") class Filename_GenerateTest(TestCase): @patch( "dbbackup.settings.FILENAME_TEMPLATE", "---{databasename}--{servername}-{datetime}.{extension}", ) def test_func(self, *args): extension = "foo" generated_name = utils.filename_generate(extension) self.assertTrue("--" not in generated_name) self.assertFalse(generated_name.startswith("-")) def test_db(self, *args): extension = "foo" generated_name = utils.filename_generate(extension) self.assertTrue(generated_name.startswith(settings.HOSTNAME)) self.assertTrue(generated_name.endswith(extension)) def test_media(self, *args): extension = "foo" generated_name = utils.filename_generate(extension, content_type="media") self.assertTrue(generated_name.startswith(settings.HOSTNAME)) self.assertTrue(generated_name.endswith(extension)) @patch("django.utils.timezone.settings.USE_TZ", True) def test_tz_true(self): filename = utils.filename_generate("bak", "default") datestring = utils.filename_to_datestring(filename) self.assertIn(datestring, filename) @patch("dbbackup.settings.FILENAME_TEMPLATE", callable_for_filename_template) def test_template_is_callable(self, *args): extension = "foo" generated_name = utils.filename_generate(extension) self.assertTrue(generated_name.endswith("foo")) class QuoteCommandArg(TestCase): def test_arg_with_space(self): assert utils.get_escaped_command_arg("foo bar") == "'foo bar'" ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7101183 django_dbbackup-4.2.1/dbbackup/tests/testapp/0000777000000000000000000000000014662205136016147 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/__init__.py0000666000000000000000000000000014314702350020240 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7131195 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/0000777000000000000000000000000014662205136017250 5ustar00././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1724451421.714121 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/gpg/0000777000000000000000000000000014662205136020025 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/gpg/pubring.gpg0000666000000000000000000000122514314702350022164 0ustar00˜U®¿†øÊ±ÌÁÛ(8ªÊ9 æÐC£78Ö4¡WKccœ?6{OJªiSCS+³æaÓÍi‚¢¿âà6÷LÑÝÅæHÈ€!?¡«wÿèö”Ï16Ž–k¦+ƒŽ¿¹?ï$œˆk”z½ẽ>âLý4Ð.Zl#‘ù_…|6Ce„2 ƒ´Tester ˆ¸"U®¿†  € ÷Ñ»ðcýé€hÿq,g©¡»ä·£ƒ‚èj~D[úœ×£: ’»ãËuÌT2? Œ_ü"p–óϵ–Õz‚¤ôcKD‡ƒÈdHaB9¥éÃÔZ@[|K˜±Y) ƒà _EËkûVw'µ_ü#yg˜OõÃó¯vp²¢w~ûT¬Ü·†p¢«L °¸U®¿†àñïÌYTe™›T¬ùOl#vIþñãŸý޳Ôâï¼½ÓËf£SsIJǷ÷ÅZQa²Øë„Fÿq)A‡©Ä¸’¨¤e ˜‡€5úÅÍO©ÛзÌûa·Pù5`¯]¿–ñP”ß" Ä„´æÖ€kóˆŸ U®¿† ÷Ñ»ðcýé/èü µƒ†* ºsÿU¨Ü?~Õ²ÿu°¸zú¦ìã¢pìÝ{mÃõqý3]ÓÀêô÷I!Enõ‚j*€¼¦Íï3W×#å$r<‰þìך?!Þë\X/\ kÌŽ~wj4 “S“)1'Lc°G»W®‰à (úÆ9ÛúèˆPZ°././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/gpg/secring.gpg0000666000000000000000000000245514314702350022156 0ustar00•ØU®¿†øÊ±ÌÁÛ(8ªÊ9 æÐC£78Ö4¡WKccœ?6{OJªiSCS+³æaÓÍi‚¢¿âà6÷LÑÝÅæHÈ€!?¡«wÿèö”Ï16Ž–k¦+ƒŽ¿¹?ï$œˆk”z½ẽ>âLý4Ð.Zl#‘ù_…|6Ce„2 ƒþ Qši½îÇdACQ^_<½ÀÎwZ¶0Ãì:%óbéM[¹lО@Âthâ>æ*·s½F}­YÙ„®dcÈ”!Zçúöò9š¸ÀGrËð¶Ñ5Ø„ïìÅå…_A y…7Þ° f}2V§æ×cFäúsikrÕe£Ü&}^©ùcºMá–¶·dÆ€ÁD´”ùªùÏ;Bi–"Ëà$ßcTTmOï?äwgêßû’)“¦2‹ÍöÆr_—´O.c_o[ÿb鄸Hq^KM„ˆúâpP«'“¶ƒRvGJ‚*ã&ÑëUØzZ‹ú¾ÑRD©‰s¸Ð‚íI8 …ÎCùý¼©C|+£ ÷ “±6»ù¬õC–…¶°ÙFwCŒî~#ì8á®o€Gž—u}µÛÒXš¬²ŽŽ#vâ ¡¤”´Tester ˆ¸"U®¿†  € ÷Ñ»ðcýé€hÿq,g©¡»ä·£ƒ‚èj~D[úœ×£: ’»ãËuÌT2? Œ_ü"p–óϵ–Õz‚¤ôcKD‡ƒÈdHaB9¥éÃÔZ@[|K˜±Y) ƒà _EËkûVw'µ_ü#yg˜OõÃó¯vp²¢w~ûT¬Ü·†p¢«L °ØU®¿†àñïÌYTe™›T¬ùOl#vIþñãŸý޳Ôâï¼½ÓËf£SsIJǷ÷ÅZQa²Øë„Fÿq)A‡©Ä¸’¨¤e ˜‡€5úÅÍO©ÛзÌûa·Pù5`¯]¿–ñP”ß" Ä„´æÖ€kóü…~¸èÁ ¡•üÄØFŸl:Hä¦Íô…\MåÞ&b]ÉÏ©3€×C¿æÏ.ü«ß‹¾Ó‚!‚/š—&uMÄCô†ùX¡p‹O.›Aclûß%ÅÄwÚ7Úòîf#<4cu˜'+¬§4‚4>“ õLY@Ú¬k´òûÀAí"C´%Ï¡o+Õâ‰õ’£¤:ñ¢«•˦£CD=ÖóeQ@»"Gtå®ÿ{-—3zØev•{¶ÿ–ÄAò×kúQŠM¹¾©Ä²­(‡à:©y¹÷•}•WõxêcáCæ]ãr­­.VعíPjÂâëh\ÀѾΖΓ3ü ($•eºÉ gÆè“X¼nüÖ2Ö©Ü»pxäiä“7Åe­Â»"Û¦ú…ŠöìÊðËö>^]wÕ„ó÷âú=ªˆŸ U®¿† ÷Ñ»ðcýé/èü µƒ†* ºsÿU¨Ü?~Õ²ÿu°¸zú¦ìã¢pìÝ{mÃõqý3]ÓÀêô÷I!Enõ‚j*€¼¦Íï3W×#å$r<‰þìך?!Þë\X/\ kÌŽ~wj4 “S“)1'Lc°G»W®‰à (úÆ9ÛúèˆPZ°././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/test.gz0000666000000000000000000000003014314702350020554 0ustar00‹ä2±U342æý‚Z././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/test.txt.gpg0000666000000000000000000000032714314702350021540 0ustar00„Œ,ݬ·³²Qsþ2ÄþV!A(òÕÈ:ïÂK-IŒg¶•ýäXg¥7¥M¤“Á__Ž£ô/­ ùB3—‰e£ÊáãQcjESþv5 ÈHÒü}ú Êc:UiœÏéä õúd§Ü@Øêu¯°°‡Ù˜øwfðòÏ7 ¯RÙ_¶÷i¯dY ª;ÒGå›sX³Õ­Qµø#kx=<ñ±U§bÛ-<=×¼'¶«äá3ýÏ®‹Ußç:ìÌNCWÌ„©°•ôN ?ƒ¥¹¤ÌE././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/test.txt.gz0000666000000000000000000000004114314702350021374 0ustar00‹•U±Utest.txtKËÏç¨e2~././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/test.txt.gz.gpg0000666000000000000000000000035514314702350022160 0ustar00„Œ,ݬ·³²Qsý=«¨xi½?á—ÿ+ìEžs‚Ÿš?•¼FÉnüGw/ÂV§w*,ÊP˜Ü`CZ(VŠÁxc Þo7CoæZÞ’®Õ-H••å8H|s~!dè¼J™¬°è}Áu'Mô¨8wû"›¦§lù?Êv¶¡RÒv{=Ä{PŽÒ]"c‡ÊjG©÷§¦~D+/xá]†éi€ pSܼü~`Ü#ƒ\öœ«glc2Ü—ÚG<À€é$·O¨¨ÌZbrÖÊö͇¡äˆx YèÃ-CùqÁ‚.át·tA././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/blobs/test.txt.tar0000666000000000000000000000004114314702350021542 0ustar00‹•U±Utest.txtKËÏç¨e2~././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1724451421.715118 django_dbbackup-4.2.1/dbbackup/tests/testapp/management/0000777000000000000000000000000014662205136020263 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/management/__init__.py0000666000000000000000000000000014314702350022354 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7161195 django_dbbackup-4.2.1/dbbackup/tests/testapp/management/commands/0000777000000000000000000000000014662205136022064 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/management/commands/__init__.py0000666000000000000000000000000014314702350024155 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/management/commands/count.py0000666000000000000000000000040714662175773023604 0ustar00from django.core.management.base import BaseCommand from dbbackup.tests.testapp.models import CharModel class Command(BaseCommand): help = "Count things" def handle(self, **options): self.stdout.write(str(CharModel.objects.count())) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/management/commands/feed.py0000666000000000000000000000043014662175773023353 0ustar00from django.core.management.base import BaseCommand from dbbackup.tests.testapp.models import CharModel class Command(BaseCommand): help = "Count things" def handle(self, **options): for st in "abcde": CharModel.objects.create(field=st) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7186275 django_dbbackup-4.2.1/dbbackup/tests/testapp/migrations/0000777000000000000000000000000014662205136020323 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449435.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/migrations/0001_initial.py0000666000000000000000000000422114662201233022757 0ustar00from django.db import migrations, models class Migration(migrations.Migration): initial = True dependencies = [] operations = [ migrations.CreateModel( name="CharModel", fields=[ ( "id", models.AutoField( verbose_name="ID", serialize=False, auto_created=True, primary_key=True, ), ), ("field", models.CharField(max_length=10)), ], ), migrations.CreateModel( name="FileModel", fields=[ ( "id", models.AutoField( verbose_name="ID", serialize=False, auto_created=True, primary_key=True, ), ), ("field", models.FileField(upload_to=".")), ], ), migrations.CreateModel( name="ForeignKeyModel", fields=[ ( "id", models.AutoField( verbose_name="ID", serialize=False, auto_created=True, primary_key=True, ), ), ( "field", models.ForeignKey(to="testapp.CharModel", on_delete=models.CASCADE), ), ], ), migrations.CreateModel( name="ManyToManyModel", fields=[ ( "id", models.AutoField( verbose_name="ID", serialize=False, auto_created=True, primary_key=True, ), ), ("field", models.ManyToManyField(to="testapp.CharModel")), ], ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449435.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/migrations/0002_textmodel.py0000666000000000000000000000127314662201233023340 0ustar00# Generated by Django 4.0.1 on 2022-04-27 22:36 from django.db import migrations, models class Migration(migrations.Migration): dependencies = [ ("testapp", "0001_initial"), ] operations = [ migrations.CreateModel( name="TextModel", fields=[ ( "id", models.AutoField( auto_created=True, primary_key=True, serialize=False, verbose_name="ID", ), ), ("field", models.TextField()), ], ), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1664320744.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/migrations/__init__.py0000666000000000000000000000000014314702350022414 0ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724449435.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/models.py0000666000000000000000000000071314662201233017777 0ustar00from django.db import models class CharModel(models.Model): field = models.CharField(max_length=10) class TextModel(models.Model): field = models.TextField() class ForeignKeyModel(models.Model): field = models.ForeignKey(CharModel, on_delete=models.CASCADE) class ManyToManyModel(models.Model): field = models.ManyToManyField(CharModel) class FileModel(models.Model): field = models.FileField(upload_to=".") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/urls.py0000666000000000000000000000010614662175773017520 0ustar00urlpatterns = ( # url(r'^admin/', include(admin.site.urls)), ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/testapp/views.py0000666000000000000000000000003314662175773017667 0ustar00# Create your views here. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/tests/utils.py0000666000000000000000000000671614662175773016230 0ustar00import contextlib import os import subprocess from django.conf import settings from django.core.files import File from django.core.files.storage import Storage from django.utils import timezone from dbbackup.db.base import get_connector BASE_FILE = os.path.join(settings.BLOB_DIR, "test.txt") ENCRYPTED_FILE = os.path.join(settings.BLOB_DIR, "test.txt.gpg") COMPRESSED_FILE = os.path.join(settings.BLOB_DIR, "test.txt.gz") TARED_FILE = os.path.join(settings.BLOB_DIR, "test.txt.tar") ENCRYPTED_COMPRESSED_FILE = os.path.join(settings.BLOB_DIR, "test.txt.gz.gpg") TEST_DATABASE = { "ENGINE": "django.db.backends.sqlite3", "NAME": "/tmp/foo.db", "USER": "foo", "PASSWORD": "bar", "HOST": "foo", "PORT": 122, } TEST_MONGODB = { "ENGINE": "django_mongodb_engine", "NAME": "mongo_test", "USER": "foo", "PASSWORD": "bar", "HOST": "foo", "PORT": 122, } TEST_DATABASE = settings.DATABASES["default"] GPG_PRIVATE_PATH = os.path.join(settings.BLOB_DIR, "gpg/secring.gpg") GPG_PUBLIC_PATH = os.path.join(settings.BLOB_DIR, "gpg/pubring.gpg") GPG_FINGERPRINT = "7438 8D4E 02AF C011 4E2F 1E79 F7D1 BBF0 1F63 FDE9" DEV_NULL = open(os.devnull, "w") class handled_files(dict): """ Dict for gather information about fake storage and clean between tests. You should use the constant instance ``HANDLED_FILES`` and clean it before tests. """ def __init__(self): super().__init__() self.clean() def clean(self): self["written_files"] = [] self["deleted_files"] = [] HANDLED_FILES = handled_files() class FakeStorage(Storage): name = "FakeStorage" def exists(self, name): return name in HANDLED_FILES["written_files"] def get_available_name(self, name, max_length=None): return name[:max_length] def get_valid_name(self, name): return name def listdir(self, path): return ([], [f[0] for f in HANDLED_FILES["written_files"]]) def accessed_time(self, name): return timezone.now() created_time = modified_time = accessed_time def _open(self, name, mode="rb"): file_ = [f[1] for f in HANDLED_FILES["written_files"] if f[0] == name][0] file_.seek(0) return file_ def _save(self, name, content): HANDLED_FILES["written_files"].append((name, File(content))) return name def delete(self, name): HANDLED_FILES["deleted_files"].append(name) def clean_gpg_keys(): with contextlib.suppress(Exception): cmd = "gpg --batch --yes --delete-key '%s'" % GPG_FINGERPRINT subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) with contextlib.suppress(Exception): cmd = "gpg --batch --yes --delete-secrect-key '%s'" % GPG_FINGERPRINT subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) def add_private_gpg(): cmd = f"gpg --import {GPG_PRIVATE_PATH}".split() subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) def add_public_gpg(): cmd = f"gpg --import {GPG_PUBLIC_PATH}".split() subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) def callable_for_filename_template(datetime, **kwargs): return f"{datetime}_foo" def get_dump(database=TEST_DATABASE): return get_connector().create_dump() def get_dump_name(database=None): database = database or TEST_DATABASE return get_connector().generate_filename() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/dbbackup/utils.py0000666000000000000000000003120414662175773015054 0ustar00""" Utility functions for dbbackup. """ import gzip import logging import os import re import sys import tempfile import traceback from datetime import datetime from functools import wraps from getpass import getpass from shlex import quote from shutil import copyfileobj from django.core.mail import EmailMultiAlternatives from django.db import connection from django.http import HttpRequest from django.utils import timezone from . import settings FAKE_HTTP_REQUEST = HttpRequest() FAKE_HTTP_REQUEST.META["SERVER_NAME"] = "" FAKE_HTTP_REQUEST.META["SERVER_PORT"] = "" FAKE_HTTP_REQUEST.META["HTTP_HOST"] = settings.HOSTNAME FAKE_HTTP_REQUEST.path = "/DJANGO-DBBACKUP-EXCEPTION" BYTES = ( ("PiB", 1125899906842624.0), ("TiB", 1099511627776.0), ("GiB", 1073741824.0), ("MiB", 1048576.0), ("KiB", 1024.0), ("B", 1.0), ) REG_FILENAME_CLEAN = re.compile(r"-+") class EncryptionError(Exception): pass class DecryptionError(Exception): pass def bytes_to_str(byteVal, decimals=1): """ Convert bytes to a human readable string. :param byteVal: Value to convert in bytes :type byteVal: int or float :param decimal: Number of decimal to display :type decimal: int :returns: Number of byte with the best unit of measure :rtype: str """ for unit, byte in BYTES: if byteVal >= byte: if decimals == 0: return f"{int(round(byteVal / byte, 0))} {unit}" return f"{round(byteVal / byte, decimals)} {unit}" return f"{byteVal} B" def handle_size(filehandle): """ Get file's size to a human readable string. :param filehandle: File to handle :type filehandle: file :returns: File's size with the best unit of measure :rtype: str """ if hasattr(filehandle, "size"): return bytes_to_str(filehandle.size) filehandle.seek(0, 2) return bytes_to_str(filehandle.tell()) def mail_admins( subject, message, fail_silently=False, connection=None, html_message=None ): """Sends a message to the admins, as defined by the DBBACKUP_ADMINS setting.""" if not settings.ADMINS: return mail = EmailMultiAlternatives( f"{settings.EMAIL_SUBJECT_PREFIX}{subject}", message, settings.SERVER_EMAIL, [a[1] for a in settings.ADMINS], connection=connection, ) if html_message: mail.attach_alternative(html_message, "text/html") mail.send(fail_silently=fail_silently) def email_uncaught_exception(func): """ Function decorator for send email with uncaught exceptions to admins. Email is sent to ``settings.DBBACKUP_FAILURE_RECIPIENTS`` (``settings.ADMINS`` if not defined). The message contains a traceback of error. """ @wraps(func) def wrapper(*args, **kwargs): try: func(*args, **kwargs) except Exception: logger = logging.getLogger("dbbackup") exc_type, exc_value, tb = sys.exc_info() tb_str = "".join(traceback.format_tb(tb)) msg = f"{exc_type.__name__}: {exc_value}\n{tb_str}" logger.error(msg) raise finally: connection.close() return wrapper def create_spooled_temporary_file(filepath=None, fileobj=None): """ Create a spooled temporary file. if ``filepath`` or ``fileobj`` is defined its content will be copied into temporary file. :param filepath: Path of input file :type filepath: str :param fileobj: Input file object :type fileobj: file :returns: Spooled temporary file :rtype: :class:`tempfile.SpooledTemporaryFile` """ spooled_file = tempfile.SpooledTemporaryFile( max_size=settings.TMP_FILE_MAX_SIZE, dir=settings.TMP_DIR ) if filepath: fileobj = open(filepath, "r+b") if fileobj is not None: fileobj.seek(0) copyfileobj(fileobj, spooled_file, settings.TMP_FILE_READ_SIZE) return spooled_file def encrypt_file(inputfile, filename): """ Encrypt input file using GPG and remove .gpg extension to its name. :param inputfile: File to encrypt :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :returns: Tuple with file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ import gnupg tempdir = tempfile.mkdtemp(dir=settings.TMP_DIR) try: filename = f"{filename}.gpg" filepath = os.path.join(tempdir, filename) try: inputfile.seek(0) always_trust = settings.GPG_ALWAYS_TRUST g = gnupg.GPG() result = g.encrypt_file( inputfile, output=filepath, recipients=settings.GPG_RECIPIENT, always_trust=always_trust, ) inputfile.close() if not result: msg = f"Encryption failed; status: {result.status}" raise EncryptionError(msg) return create_spooled_temporary_file(filepath), filename finally: if os.path.exists(filepath): os.remove(filepath) finally: os.rmdir(tempdir) def unencrypt_file(inputfile, filename, passphrase=None): """ Unencrypt input file using GPG and remove .gpg extension to its name. :param inputfile: File to encrypt :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :param passphrase: Passphrase of GPG key, if equivalent to False, it will be asked to user. If user answer an empty pass, no passphrase will be used. :type passphrase: ``str`` or ``None`` :returns: Tuple with file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ import gnupg def get_passphrase(passphrase=passphrase): return passphrase or getpass("Input Passphrase: ") or None temp_dir = tempfile.mkdtemp(dir=settings.TMP_DIR) try: new_basename = os.path.basename(filename).replace(".gpg", "") temp_filename = os.path.join(temp_dir, new_basename) try: inputfile.seek(0) g = gnupg.GPG() result = g.decrypt_file( fileobj_or_path=inputfile, passphrase=get_passphrase(), output=temp_filename, ) if not result: raise DecryptionError("Decryption failed; status: %s" % result.status) outputfile = create_spooled_temporary_file(temp_filename) finally: if os.path.exists(temp_filename): os.remove(temp_filename) finally: os.rmdir(temp_dir) return outputfile, new_basename def compress_file(inputfile, filename): """ Compress input file using gzip and change its name. :param inputfile: File to compress :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :returns: Tuple with compressed file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ outputfile = create_spooled_temporary_file() new_filename = f"{filename}.gz" zipfile = gzip.GzipFile(filename=filename, fileobj=outputfile, mode="wb") try: inputfile.seek(0) copyfileobj(inputfile, zipfile, settings.TMP_FILE_READ_SIZE) finally: zipfile.close() return outputfile, new_filename def uncompress_file(inputfile, filename): """ Uncompress this file using gzip and change its name. :param inputfile: File to compress :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :returns: Tuple with file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ zipfile = gzip.GzipFile(fileobj=inputfile, mode="rb") try: inputfile.seek(0) outputfile = create_spooled_temporary_file(fileobj=zipfile) finally: zipfile.close() new_basename = os.path.basename(filename).replace(".gz", "") return outputfile, new_basename def timestamp(value): """ Return the timestamp of a datetime.datetime object. :param value: a datetime object :type value: datetime.datetime :return: the timestamp :rtype: str """ value = value if timezone.is_naive(value) else timezone.localtime(value) return value.strftime(settings.DATE_FORMAT) def filename_details(filepath): # TODO: What was this function made for ? return "" PATTERN_MATCHNG = ( ("%a", r"[A-Z][a-z]+"), ("%A", r"[A-Z][a-z]+"), ("%w", r"\d"), ("%d", r"\d{2}"), ("%b", r"[A-Z][a-z]+"), ("%B", r"[A-Z][a-z]+"), ("%m", r"\d{2}"), ("%y", r"\d{2}"), ("%Y", r"\d{4}"), ("%H", r"\d{2}"), ("%I", r"\d{2}"), # ('%p', r'(?AM|PM|am|pm)'), ("%M", r"\d{2}"), ("%S", r"\d{2}"), ("%f", r"\d{6}"), # ('%z', r'\+\d{4}'), # ('%Z', r'(?|UTC|EST|CST)'), ("%j", r"\d{3}"), ("%U", r"\d{2}"), ("%W", r"\d{2}"), # ('%c', r'[A-Z][a-z]+ [A-Z][a-z]{2} \d{2} \d{2}:\d{2}:\d{2} \d{4}'), # ('%x', r'd{2}/d{2}/d{4}'), # ('%X', r'd{2}:d{2}:d{2}'), # ('%%', r'%'), ) def datefmt_to_regex(datefmt): """ Convert a strftime format string to a regex. :param datefmt: strftime format string :type datefmt: ``str`` :returns: Equivalent regex :rtype: ``re.compite`` """ new_string = datefmt for pat, reg in PATTERN_MATCHNG: new_string = new_string.replace(pat, reg) return re.compile(f"({new_string})") def filename_to_datestring(filename, datefmt=None): """ Return the date part of a file name. :param datefmt: strftime format string, ``settings.DATE_FORMAT`` is used if is ``None`` :type datefmt: ``str`` or ``None`` :returns: Date part or nothing if not found :rtype: ``str`` or ``NoneType`` """ datefmt = datefmt or settings.DATE_FORMAT regex = datefmt_to_regex(datefmt) search = regex.search(filename) if search: return search.groups()[0] def filename_to_date(filename, datefmt=None): """ Return a datetime from a file name. :param datefmt: strftime format string, ``settings.DATE_FORMAT`` is used if is ``None`` :type datefmt: ``str`` or ``NoneType`` :returns: Date guessed or nothing if no date found :rtype: ``datetime.datetime`` or ``NoneType`` """ datefmt = datefmt or settings.DATE_FORMAT datestring = filename_to_datestring(filename, datefmt) if datestring is not None: return datetime.strptime(datestring, datefmt) def filename_generate( extension, database_name="", servername=None, content_type="db", wildcard=None ): """ Create a new backup filename. :param extension: Extension of backup file :type extension: ``str`` :param database_name: If it is database backup specify its name :type database_name: ``str`` :param servername: Specify server name or by default ``settings.DBBACKUP_HOSTNAME`` :type servername: ``str`` :param content_type: Content type to backup, ``'media'`` or ``'db'`` :type content_type: ``str`` :param wildcard: Replace datetime with this wilecard regex :type content_type: ``str`` :returns: Computed file name :rtype: ``str` """ if content_type == "db": if "/" in database_name: database_name = os.path.basename(database_name) if "." in database_name: database_name = database_name.split(".")[0] template = settings.FILENAME_TEMPLATE elif content_type == "media": template = settings.MEDIA_FILENAME_TEMPLATE else: template = settings.FILENAME_TEMPLATE params = { "servername": servername or settings.HOSTNAME, "datetime": wildcard or datetime.now().strftime(settings.DATE_FORMAT), "databasename": database_name, "extension": extension, "content_type": content_type, } if callable(template): filename = template(**params) else: filename = template.format(**params) filename = REG_FILENAME_CLEAN.sub("-", filename) filename = filename[1:] if filename.startswith("-") else filename return filename def get_escaped_command_arg(arg): return quote(arg) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7411273 django_dbbackup-4.2.1/django_dbbackup.egg-info/0000777000000000000000000000000014662205136016341 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724451421.0 django_dbbackup-4.2.1/django_dbbackup.egg-info/PKG-INFO0000666000000000000000000002234214662205135017440 0ustar00Metadata-Version: 2.1 Name: django-dbbackup Version: 4.2.1 Summary: Management commands to help backup and restore a project database and media. Home-page: https://github.com/jazzband/django-dbbackup Author: Archmonger Author-email: archiethemonger@gmail.com License: BSD Keywords: django,database,media,backup,amazon,s3,dropbox Classifier: Development Status :: 4 - Beta Classifier: Environment :: Web Environment Classifier: Environment :: Console Classifier: Framework :: Django :: 3.2 Classifier: Framework :: Django :: 4.2 Classifier: Framework :: Django :: 5.0 Classifier: Intended Audience :: Developers Classifier: Intended Audience :: System Administrators Classifier: License :: OSI Approved :: BSD License Classifier: Natural Language :: English Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: 3.12 Classifier: Topic :: Database Classifier: Topic :: System :: Archiving Classifier: Topic :: System :: Archiving :: Backup Classifier: Topic :: System :: Archiving :: Compression Requires-Python: >=3.7 Description-Content-Type: text/x-rst License-File: LICENSE.txt License-File: AUTHORS.txt Requires-Dist: django>=3.2 Requires-Dist: pytz Django Database Backup ====================== .. image:: https://github.com/jazzband/django-dbbackup/actions/workflows/build.yml/badge.svg :target: https://github.com/jazzband/django-dbbackup/actions .. image:: https://readthedocs.org/projects/django-dbbackup/badge/?version=stable :target: https://django-dbbackup.readthedocs.io/ :alt: Documentation Status .. image:: https://codecov.io/gh/jazzband/django-dbbackup/branch/master/graph/badge.svg?token=zaYmStcsuX :target: https://codecov.io/gh/jazzband/django-dbbackup .. image:: https://jazzband.co/static/img/badge.svg :target: https://jazzband.co/ :alt: Jazzband This Django application provides management commands to help backup and restore your project database and media files with various storages such as Amazon S3, Dropbox, local file storage or any Django storage. It is made to: - Allow you to secure your backup with GPG signature and encryption - Archive with compression - Deal easily with remote archiving - Keep your development database up to date - Use Crontab or Celery to setup automated backups Docs ==== See our official documentation at `Read The Docs`_. Why use DBBackup ================ This software doesn't reinvent the wheel, in a few words it is a pipe between your Django project and your backup storage. It tries to use the traditional dump & restore mechanisms, apply compression and/or encryption and use the storage system you desire. It gives a simple interface to backup and restore your database or media files. Management Commands =================== dbbackup -------- Backup your database to the specified storage. By default this will backup all databases specified in your settings.py file and will not delete any old backups. You can optionally specify a server name to be included in the backup filename. :: Usage: ./manage.py dbbackup [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -c, --clean Clean up old backup files -d DATABASE, --database=DATABASE Database to backup (default: everything) -s SERVERNAME, --servername=SERVERNAME Specify server name to include in backup filename -z, --compress Compress the backup files -e, --encrypt Encrypt the backup files -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME Specify filename on storage -O OUTPUT_PATH, --output-path=OUTPUT_PATH Specify where to store on local filesystem -x EXCLUDE_TABLES, --exclude-tables=EXCLUDE_TABLES Exclude tables data from backup (-x 'public.table1, public.table2') dbrestore --------- Restore your database from the specified storage. By default this will lookup the latest backup and restore from that. You may optionally specify a servername if you you want to backup a database image that was created from a different server. You may also specify an explicit local file to backup from. :: Usage: ./manage.py dbrestore [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -d DATABASE, --database=DATABASE Database to restore -i INPUT_FILENAME, --input-filename=INPUT_FILENAME Specify filename to backup from -I INPUT_PATH, --input-path=INPUT_PATH Specify path on local filesystem to backup from -s SERVERNAME, --servername=SERVERNAME Use a different servername backup -c, --decrypt Decrypt data before restoring -p PASSPHRASE, --passphrase=PASSPHRASE Passphrase for decrypt file -z, --uncompress Uncompress gzip data before restoring mediabackup ----------- Backup media files by get them one by one, include in a TAR file. :: Usage: ./manage.py mediabackup [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -c, --clean Clean up old backup files -s SERVERNAME, --servername=SERVERNAME Specify server name to include in backup filename -z, --compress Compress the archive -e, --encrypt Encrypt the backup files -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME Specify filename on storage -O OUTPUT_PATH, --output-path=OUTPUT_PATH Specify where to store on local filesystem mediarestore ------------ Restore media files from storage backup to your media storage. :: Usage: ./manage.py mediarestore [options] Options: --noinput Tells Django to NOT prompt the user for input of any kind. -q, --quiet Tells Django to NOT output other text than errors. -i INPUT_FILENAME, --input-filename=INPUT_FILENAME Specify filename to backup from -I INPUT_PATH, --input-path=INPUT_PATH Specify path on local filesystem to backup from -e, --decrypt Decrypt data before restoring -p PASSPHRASE, --passphrase=PASSPHRASE Passphrase for decrypt file -z, --uncompress Uncompress gzip data before restoring -r, --replace Replace existing files Tests ===== Tests are stored in `dbbackup.tests` and to run them you must launch: :: python runtests.py In fact, ``runtests.py`` acts as a ``manage.py`` file and all Django commands are available. So you could launch: :: python runtests.py shell to get a Python shell configured with the test project. Also all test command options are available and usable to run only a selection of tests. See `Django test command documentation`_ for more information about it. .. _`Django test command documentation`: https://docs.djangoproject.com/en/stable/topics/testing/overview/#running-tests There are even functional tests: :: ./functional.sh See documentation for details. To run the tests across all supported versions of Django and Python, you can use Tox. Firstly install Tox: :: pip install tox To run the tests just use the command ``tox`` in the command line. If you want to run the tests against just one specific test environment you can run ``tox -e ``. For example, to run the tests with Python3.9 and Django3.2 you would run: :: tox -e py39-django32 The available test environments can be found in ``tox.ini``. Contributing ============ .. image:: https://jazzband.co/static/img/jazzband.svg :target: https://jazzband.co/ :alt: Jazzband This is a `Jazzband `_ project. By contributing you agree to abide by the `Contributor Code of Conduct `_ and follow the `guidelines `_. All contribution are very welcomed, propositions, problems, bugs and enhancement are tracked with `GitHub issues`_ system and patches are submitted via `pull requests`_. We use GitHub Actions as continuous integration tools. .. _`Read The Docs`: https://django-dbbackup.readthedocs.org/ .. _`GitHub issues`: https://github.com/jazzband/django-dbbackup/issues .. _`pull requests`: https://github.com/jazzband/django-dbbackup/pulls .. _Coveralls: https://coveralls.io/github/jazzband/django-dbbackup ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724451421.0 django_dbbackup-4.2.1/django_dbbackup.egg-info/SOURCES.txt0000666000000000000000000000507514662205135020233 0ustar00AUTHORS.txt LICENSE.txt MANIFEST.in README.rst pyproject.toml requirements.txt setup.py dbbackup/VERSION dbbackup/__init__.py dbbackup/apps.py dbbackup/checks.py dbbackup/log.py dbbackup/settings.py dbbackup/storage.py dbbackup/utils.py dbbackup/db/__init__.py dbbackup/db/base.py dbbackup/db/exceptions.py dbbackup/db/mongodb.py dbbackup/db/mysql.py dbbackup/db/postgresql.py dbbackup/db/sqlite.py dbbackup/management/__init__.py dbbackup/management/commands/__init__.py dbbackup/management/commands/_base.py dbbackup/management/commands/dbbackup.py dbbackup/management/commands/dbrestore.py dbbackup/management/commands/listbackups.py dbbackup/management/commands/mediabackup.py dbbackup/management/commands/mediarestore.py dbbackup/tests/__init__.py dbbackup/tests/settings.py dbbackup/tests/test_checks.py dbbackup/tests/test_log.py dbbackup/tests/test_storage.py dbbackup/tests/test_utils.py dbbackup/tests/utils.py dbbackup/tests/commands/__init__.py dbbackup/tests/commands/test_base.py dbbackup/tests/commands/test_dbbackup.py dbbackup/tests/commands/test_dbrestore.py dbbackup/tests/commands/test_listbackups.py dbbackup/tests/commands/test_mediabackup.py dbbackup/tests/functional/__init__.py dbbackup/tests/functional/test_commands.py dbbackup/tests/test_connectors/__init__.py dbbackup/tests/test_connectors/test_base.py dbbackup/tests/test_connectors/test_mongodb.py dbbackup/tests/test_connectors/test_mysql.py dbbackup/tests/test_connectors/test_postgresql.py dbbackup/tests/test_connectors/test_sqlite.py dbbackup/tests/testapp/__init__.py dbbackup/tests/testapp/models.py dbbackup/tests/testapp/urls.py dbbackup/tests/testapp/views.py dbbackup/tests/testapp/blobs/test.gz dbbackup/tests/testapp/blobs/test.txt.gpg dbbackup/tests/testapp/blobs/test.txt.gz dbbackup/tests/testapp/blobs/test.txt.gz.gpg dbbackup/tests/testapp/blobs/test.txt.tar dbbackup/tests/testapp/blobs/gpg/pubring.gpg dbbackup/tests/testapp/blobs/gpg/secring.gpg dbbackup/tests/testapp/management/__init__.py dbbackup/tests/testapp/management/commands/__init__.py dbbackup/tests/testapp/management/commands/count.py dbbackup/tests/testapp/management/commands/feed.py dbbackup/tests/testapp/migrations/0001_initial.py dbbackup/tests/testapp/migrations/0002_textmodel.py dbbackup/tests/testapp/migrations/__init__.py django_dbbackup.egg-info/PKG-INFO django_dbbackup.egg-info/SOURCES.txt django_dbbackup.egg-info/dependency_links.txt django_dbbackup.egg-info/not-zip-safe django_dbbackup.egg-info/requires.txt django_dbbackup.egg-info/top_level.txt requirements/build.txt requirements/dev.txt requirements/docs.txt requirements/tests.txt././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724451421.0 django_dbbackup-4.2.1/django_dbbackup.egg-info/dependency_links.txt0000666000000000000000000000000114662205135022406 0ustar00 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724451421.0 django_dbbackup-4.2.1/django_dbbackup.egg-info/not-zip-safe0000666000000000000000000000000214662205135020567 0ustar00 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724451421.0 django_dbbackup-4.2.1/django_dbbackup.egg-info/requires.txt0000666000000000000000000000002114662205135020731 0ustar00django>=3.2 pytz ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724451421.0 django_dbbackup-4.2.1/django_dbbackup.egg-info/top_level.txt0000666000000000000000000000001114662205135021062 0ustar00dbbackup ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/pyproject.toml0000666000000000000000000000020014662175773014473 0ustar00[tool.black] target-version = ['py37'] extend-exclude = 'migrations' [tool.isort] profile = 'black' skip = 'migrations' ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1724451421.7406268 django_dbbackup-4.2.1/requirements/0000777000000000000000000000000014662205136014275 5ustar00././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/requirements/build.txt0000666000000000000000000000005514662175773016152 0ustar00build setuptools tox>=4.0.0 twine wheel ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/requirements/dev.txt0000666000000000000000000000004414662175773015627 0ustar00black flake8 isort pylint rope ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/requirements/docs.txt0000666000000000000000000000011514662175773016000 0ustar00. docutils python-dotenv sphinx sphinx-django-command sphinx-rtd-theme ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/requirements/tests.txt0000666000000000000000000000021714662175773016215 0ustar00coverage django-storages flake8 pep8 psycopg2 pylint python-dotenv python-gnupg>=0.5.0 pytz testfixtures tox>=4.0.0 tox-gh-actions ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/requirements.txt0000666000000000000000000000002314662175773015046 0ustar00django>=3.2 pytz ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1724451421.742627 django_dbbackup-4.2.1/setup.cfg0000666000000000000000000000005214662205136013370 0ustar00[egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1724447739.0 django_dbbackup-4.2.1/setup.py0000666000000000000000000000441614662175773013306 0ustar00#!/usr/bin/env python from pathlib import Path from setuptools import find_packages, setup root_dir = Path(__file__).parent src_dir = root_dir / "dbbackup" with (src_dir / "VERSION").open() as f: version = f.read().strip() def get_requirements(): with (root_dir / "requirements.txt").open() as f: return f.read().splitlines() def get_test_requirements(): with (root_dir / "requirements" / "tests.txt").open() as f: return f.read().splitlines() setup( name="django-dbbackup", version=version, description="Management commands to help backup and restore a project database and media.", author="Archmonger", author_email="archiethemonger@gmail.com", long_description=(root_dir / "README.rst").read_text(encoding="utf-8"), long_description_content_type="text/x-rst", python_requires=">=3.7", install_requires=get_requirements(), tests_require=get_test_requirements(), include_package_data=True, zip_safe=False, license="BSD", url="https://github.com/jazzband/django-dbbackup", keywords=[ "django", "database", "media", "backup", "amazon", "s3", "dropbox", ], packages=find_packages(), classifiers=[ "Development Status :: 4 - Beta", "Environment :: Web Environment", "Environment :: Console", "Framework :: Django :: 3.2", "Framework :: Django :: 4.2", "Framework :: Django :: 5.0", "Intended Audience :: Developers", "Intended Audience :: System Administrators", "License :: OSI Approved :: BSD License", "Natural Language :: English", "Operating System :: OS Independent", "Programming Language :: Python", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3.11", "Programming Language :: Python :: 3.12", "Topic :: Database", "Topic :: System :: Archiving", "Topic :: System :: Archiving :: Backup", "Topic :: System :: Archiving :: Compression", ], )