django-dbbackup-3.3.0/

django-dbbackup-3.3.0/requirements-tests.txt

pep8
flake8
pylint
coverage
python-gnupg
django-storages
pytz
testfixtures
mock

django-dbbackup-3.3.0/PKG-INFO

Metadata-Version: 1.1
Name: django-dbbackup
Version: 3.3.0
Summary: Management commands to help backup and restore a project database and media
Home-page: https://github.com/django-dbbackup/django-dbbackup
Author: Michael Shepanski
Author-email: mjs7231@gmail.com
License: BSD
Description: UNKNOWN
Keywords: django,database,media,backup,amazon,s3dropbox
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Web Environment
Classifier: Environment :: Console
Classifier: Framework :: Django
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: BSD License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Topic :: Database
Classifier: Topic :: System :: Archiving
Classifier: Topic :: System :: Archiving :: Backup
Classifier: Topic :: System :: Archiving :: Compression

django-dbbackup-3.3.0/requirements.txt

pytz
six
Django>=1.5

django-dbbackup-3.3.0/README.rst

Django Database Backup
======================

.. image:: https://api.travis-ci.org/django-dbbackup/django-dbbackup.svg
   :target: https://travis-ci.org/django-dbbackup/django-dbbackup

.. image:: https://readthedocs.org/projects/django-dbbackup/badge/?version=latest
   :target: http://django-dbbackup.readthedocs.io/en/latest/
   :alt: Documentation Status

.. image:: https://coveralls.io/repos/django-dbbackup/django-dbbackup/badge.svg?branch=master&service=github
   :target: https://coveralls.io/github/django-dbbackup/django-dbbackup?branch=master

.. image:: https://landscape.io/github/django-dbbackup/django-dbbackup/master/landscape.svg?style=flat
   :target: https://landscape.io/github/django-dbbackup/django-dbbackup/master
   :alt: Code Health

This Django application provides management commands to help back up and
restore your project database and media files to various storages such as
Amazon S3, Dropbox, local file storage or any Django storage.

It is made for:

- Securing your backups with GPG signature and encryption
- Archiving with compression
- Dealing easily with remote archiving
- Keeping your development database up to date
- Using Crontab or Celery to set up automated backups

Docs
====

See our official documentation at `Read The Docs`_.

Why use DBBackup
================

This software doesn't reinvent the wheel; in a few words, it is a pipe between
your Django project and your backup storage. It uses the traditional dump &
restore mechanisms, applies compression and/or encryption, and writes the
result to the storage system you choose. It gives a simple interface to back
up and restore your database or media files.
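Configuration
=============

As a minimal, illustrative sketch (the storage class shown below is the
default filesystem backend and the ``location`` path is only an example, not a
required value), configuration lives in your Django settings: add ``dbbackup``
to ``INSTALLED_APPS`` and point ``DBBACKUP_STORAGE`` at the storage backend
you want to use, with its options in ``DBBACKUP_STORAGE_OPTIONS``.

::

    # settings.py -- illustrative values, adapt them to your project
    INSTALLED_APPS = (
        ...
        'dbbackup',
    )

    DBBACKUP_STORAGE = 'django.core.files.storage.FileSystemStorage'
    DBBACKUP_STORAGE_OPTIONS = {'location': '/var/backups'}

With this in place, ``./manage.py dbbackup`` writes a dump of each configured
database to the chosen storage.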
Management Commands
===================

dbbackup
--------

Back up your database to the specified storage. By default this will back up
all databases specified in your settings.py file and will not delete any old
backups. You can optionally specify a server name to be included in the backup
filename.

::

    Usage: ./manage.py dbbackup [options]

    Options:
      --noinput             Tells Django to NOT prompt the user for input of any kind.
      -q, --quiet           Tells Django to NOT output other text than errors.
      -c, --clean           Clean up old backup files
      -d DATABASE, --database=DATABASE
                            Database to backup (default: everything)
      -s SERVERNAME, --servername=SERVERNAME
                            Specify server name to include in backup filename
      -z, --compress        Compress the backup files
      -e, --encrypt         Encrypt the backup files
      -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME
                            Specify filename on storage
      -O OUTPUT_PATH, --output-path=OUTPUT_PATH
                            Specify where to store on local filesystem

dbrestore
---------

Restore your database from the specified storage. By default this will look up
the latest backup and restore from that. You may optionally specify a
servername if you want to restore a backup that was created on a different
server. You may also specify an explicit local file to restore from.

::

    Usage: ./manage.py dbrestore [options]

    Options:
      --noinput             Tells Django to NOT prompt the user for input of any kind.
      -d DATABASE, --database=DATABASE
                            Database to restore
      -i INPUT_FILENAME, --input-filename=INPUT_FILENAME
                            Specify filename to backup from
      -I INPUT_PATH, --input-path=INPUT_PATH
                            Specify path on local filesystem to backup from
      -s SERVERNAME, --servername=SERVERNAME
                            Use a different servername backup
      -c, --decrypt         Decrypt data before restoring
      -p PASSPHRASE, --passphrase=PASSPHRASE
                            Passphrase for decrypt file
      -z, --uncompress      Uncompress gzip data before restoring

mediabackup
-----------

Back up media files by fetching them one by one and bundling them into a TAR
archive.

::

    Usage: ./manage.py mediabackup [options]

    Options:
      --noinput             Tells Django to NOT prompt the user for input of any kind.
      -q, --quiet           Tells Django to NOT output other text than errors.
      -c, --clean           Clean up old backup files
      -s SERVERNAME, --servername=SERVERNAME
                            Specify server name to include in backup filename
      -z, --compress        Compress the archive
      -e, --encrypt         Encrypt the backup files
      -o OUTPUT_FILENAME, --output-filename=OUTPUT_FILENAME
                            Specify filename on storage
      -O OUTPUT_PATH, --output-path=OUTPUT_PATH
                            Specify where to store on local filesystem

mediarestore
------------

Restore media files from a storage backup to your media storage.

::

    Usage: ./manage.py mediarestore [options]

    Options:
      --noinput             Tells Django to NOT prompt the user for input of any kind.
      -q, --quiet           Tells Django to NOT output other text than errors.
      -i INPUT_FILENAME, --input-filename=INPUT_FILENAME
                            Specify filename to backup from
      -I INPUT_PATH, --input-path=INPUT_PATH
                            Specify path on local filesystem to backup from
      -e, --decrypt         Decrypt data before restoring
      -p PASSPHRASE, --passphrase=PASSPHRASE
                            Passphrase for decrypt file
      -z, --uncompress      Uncompress gzip data before restoring
      -r, --replace         Replace existing files
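Automating backups
==================

The commands above are plain Django management commands, so they can be run
from cron (for example ``./manage.py dbbackup --clean --compress`` in a
crontab entry) or from a task queue such as Celery. The snippet below is an
illustrative sketch only and is not shipped with the package; it mirrors the
way the test suite invokes the commands programmatically:

::

    # backup_job.py -- hypothetical helper, call it from your scheduler of choice
    from django.core.management import execute_from_command_line


    def run_nightly_backup():
        # Equivalent to running: ./manage.py dbbackup --clean --compress
        execute_from_command_line(['manage.py', 'dbbackup', '--clean', '--compress'])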
Contributing
============

All contributions are very welcome. Propositions, problems, bugs and
enhancements are tracked with the `GitHub issues`_ system, and patches are
submitted via `pull requests`_.

We use `Travis`_ coupled with `Coveralls`_ as continuous integration tools.

.. _`Read The Docs`: http://django-dbbackup.readthedocs.org/
.. _`GitHub issues`: https://github.com/django-dbbackup/django-dbbackup/issues
.. _`pull requests`: https://github.com/django-dbbackup/django-dbbackup/pulls
.. _Travis: https://travis-ci.org/django-dbbackup/django-dbbackup
.. _Coveralls: https://coveralls.io/github/django-dbbackup/django-dbbackup

.. image:: https://ga-beacon.appspot.com/UA-87461-7/django-dbbackup/home
   :target: https://github.com/igrigorik/ga-beacon

Tests
=====

Tests are stored in ``dbbackup.tests``; to run them, launch:

::

    python runtests.py

``runtests.py`` acts as a ``manage.py`` file, so all Django commands are
available. For example, you could launch:

::

    python runtests.py shell

to get a Python shell configured with the test project. All test command
options are also available, for instance to run only a chosen subset of tests.
See the `Django test command documentation`_ for more information.

.. _`Django test command documentation`: https://docs.djangoproject.com/en/stable/topics/testing/overview/#running-tests

There are also functional tests:

::

    ./functional.sh

See the documentation for details.

To run the tests across all supported versions of Django and Python, you can
use Tox. First, install Tox:

::

    pip install tox

To run the tests, just use the command ``tox`` on the command line. If you
want to run the tests against just one specific test environment, you can run
``tox -e <env>``. For example, to run the tests with Python 3.3 and Django 1.9
you would run:

::

    tox -e py3.3-django1.9

The available test environments can be found in ``tox.ini``.

django-dbbackup-3.3.0/setup.cfg

[flake8]
max-line-length = 99
exclude = tests,settings

[egg_info]
tag_build = 
tag_date = 0

django-dbbackup-3.3.0/setup.py

#!/usr/bin/env python
from setuptools import setup, find_packages

import dbbackup


def get_requirements():
    return open('requirements.txt').read().splitlines()


def get_test_requirements():
    return open('requirements-tests.txt').read().splitlines()


keywords = [
    'django', 'database', 'media', 'backup',
    'amazon', 's3', 'dropbox',
]

setup(
    name='django-dbbackup',
    version=dbbackup.__version__,
    description=dbbackup.__doc__,
    author=dbbackup.__author__,
    author_email=dbbackup.__email__,
    install_requires=get_requirements(),
    tests_require=get_test_requirements(),
    license='BSD',
    url=dbbackup.__url__,
    keywords=keywords,
    packages=find_packages(),
    classifiers=[
        'Development Status :: 4 - Beta',
        'Environment :: Web Environment',
        'Environment :: Console',
        'Framework :: Django',
        'Intended Audience :: Developers',
        'Intended Audience :: System Administrators',
        'License :: OSI Approved :: BSD License',
        'Natural Language :: English',
        'Operating System :: OS Independent',
        'Programming Language :: Python',
        'Programming Language :: Python :: 2',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Topic :: Database',
        'Topic :: System :: Archiving',
        'Topic :: System :: Archiving :: Backup',
        'Topic :: System :: Archiving :: Compression'
    ],
)

django-dbbackup-3.3.0/django_dbbackup.egg-info/

django-dbbackup-3.3.0/django_dbbackup.egg-info/dependency_links.txt

django-dbbackup-3.3.0/django_dbbackup.egg-info/PKG-INFO0000664000175000017500000000246113645400163022443 0ustar zuluzulu00000000000000Metadata-Version: 1.1 Name: django-dbbackup Version: 3.3.0 Summary: Management commands to help backup and restore a project database and media Home-page: https://github.com/django-dbbackup/django-dbbackup Author: Michael Shepanski Author-email: mjs7231@gmail.com License: BSD Description: UNKNOWN Keywords: django,database,media,backup,amazon,s3dropbox Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Environment :: Web Environment Classifier: Environment :: Console Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: Intended Audience :: System Administrators Classifier: License :: OSI Approved :: BSD License Classifier: Natural Language :: English Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Topic :: Database Classifier: Topic :: System :: Archiving Classifier: Topic :: System :: Archiving :: Backup Classifier: Topic :: System :: Archiving :: Compression django-dbbackup-3.3.0/django_dbbackup.egg-info/top_level.txt0000664000175000017500000000001113645400163024065 0ustar zuluzulu00000000000000dbbackup django-dbbackup-3.3.0/django_dbbackup.egg-info/requires.txt0000664000175000017500000000002513645400163023740 0ustar zuluzulu00000000000000pytz six Django>=1.5 django-dbbackup-3.3.0/django_dbbackup.egg-info/SOURCES.txt0000664000175000017500000000462513645400163023236 0ustar zuluzulu00000000000000LICENSE.txt MANIFEST.in README.rst requirements-docs.txt requirements-tests.txt requirements.txt setup.cfg setup.py dbbackup/__init__.py dbbackup/apps.py dbbackup/checks.py dbbackup/log.py dbbackup/settings.py dbbackup/storage.py dbbackup/utils.py dbbackup/db/__init__.py dbbackup/db/base.py dbbackup/db/exceptions.py dbbackup/db/mongodb.py dbbackup/db/mysql.py dbbackup/db/postgresql.py dbbackup/db/sqlite.py dbbackup/management/__init__.py dbbackup/management/commands/__init__.py dbbackup/management/commands/_base.py dbbackup/management/commands/dbbackup.py dbbackup/management/commands/dbrestore.py dbbackup/management/commands/listbackups.py dbbackup/management/commands/mediabackup.py dbbackup/management/commands/mediarestore.py dbbackup/tests/__init__.py dbbackup/tests/settings.py dbbackup/tests/test_checks.py dbbackup/tests/test_log.py dbbackup/tests/test_storage.py dbbackup/tests/test_utils.py dbbackup/tests/utils.py dbbackup/tests/commands/__init__.py dbbackup/tests/commands/test_base.py dbbackup/tests/commands/test_dbbackup.py dbbackup/tests/commands/test_dbrestore.py dbbackup/tests/commands/test_listbackups.py dbbackup/tests/commands/test_mediabackup.py dbbackup/tests/functional/__init__.py dbbackup/tests/functional/test_commands.py dbbackup/tests/test_connectors/__init__.py dbbackup/tests/test_connectors/test_base.py dbbackup/tests/test_connectors/test_mongodb.py dbbackup/tests/test_connectors/test_mysql.py dbbackup/tests/test_connectors/test_postgresql.py dbbackup/tests/test_connectors/test_sqlite.py dbbackup/tests/testapp/__init__.py dbbackup/tests/testapp/models.py dbbackup/tests/testapp/urls.py 
dbbackup/tests/testapp/views.py dbbackup/tests/testapp/blobs/test.gz dbbackup/tests/testapp/blobs/test.txt.gpg dbbackup/tests/testapp/blobs/test.txt.gz dbbackup/tests/testapp/blobs/test.txt.gz.gpg dbbackup/tests/testapp/blobs/test.txt.tar dbbackup/tests/testapp/blobs/gpg/pubring.gpg dbbackup/tests/testapp/blobs/gpg/secring.gpg dbbackup/tests/testapp/management/__init__.py dbbackup/tests/testapp/management/commands/__init__.py dbbackup/tests/testapp/management/commands/count.py dbbackup/tests/testapp/management/commands/feed.py dbbackup/tests/testapp/migrations/0001_initial.py dbbackup/tests/testapp/migrations/__init__.py django_dbbackup.egg-info/PKG-INFO django_dbbackup.egg-info/SOURCES.txt django_dbbackup.egg-info/dependency_links.txt django_dbbackup.egg-info/requires.txt django_dbbackup.egg-info/top_level.txtdjango-dbbackup-3.3.0/requirements-docs.txt0000664000175000017500000000004613645400106020763 0ustar zuluzulu00000000000000Sphinx docutils sphinx-django-command django-dbbackup-3.3.0/LICENSE.txt0000664000175000017500000000275413645400106016404 0ustar zuluzulu00000000000000Copyright (c) 2010, Michael Shepanski All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name django-dbbackup nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. django-dbbackup-3.3.0/dbbackup/0000775000175000017500000000000013645400163016327 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/apps.py0000664000175000017500000000053413645400106017643 0ustar zuluzulu00000000000000"""Apps for DBBackup""" from django.apps import AppConfig from django.utils.translation import ugettext_lazy as _ class DbbackupConfig(AppConfig): """ Config for DBBackup application. """ name = 'dbbackup' label = 'dbbackup' verbose_name = _('Backup and restore') def ready(self): from dbbackup import checks django-dbbackup-3.3.0/dbbackup/settings.py0000664000175000017500000000445113645400106020542 0ustar zuluzulu00000000000000# DO NOT IMPORT THIS BEFORE django.configure() has been run! 
import logging.config
import tempfile
import socket

from django.conf import settings

import dbbackup.log

DATABASES = getattr(settings, 'DBBACKUP_DATABASES', list(settings.DATABASES.keys()))

# Fake host
HOSTNAME = getattr(settings, 'DBBACKUP_HOSTNAME', socket.gethostname())

# Directory to use for temporary files
TMP_DIR = getattr(settings, 'DBBACKUP_TMP_DIR', tempfile.gettempdir())
TMP_FILE_MAX_SIZE = getattr(settings, 'DBBACKUP_TMP_FILE_MAX_SIZE', 10*1024*1024)
TMP_FILE_READ_SIZE = getattr(settings, 'DBBACKUP_TMP_FILE_READ_SIZE', 1024*1000)

# Days to keep
CLEANUP_KEEP = getattr(settings, 'DBBACKUP_CLEANUP_KEEP', 10)
CLEANUP_KEEP_MEDIA = getattr(settings, 'DBBACKUP_CLEANUP_KEEP_MEDIA', CLEANUP_KEEP)
CLEANUP_KEEP_FILTER = getattr(settings, 'DBBACKUP_CLEANUP_KEEP_FILTER', lambda x: False)

MEDIA_PATH = getattr(settings, 'DBBACKUP_MEDIA_PATH', settings.MEDIA_ROOT)

DATE_FORMAT = getattr(settings, 'DBBACKUP_DATE_FORMAT', '%Y-%m-%d-%H%M%S')
FILENAME_TEMPLATE = getattr(settings, 'DBBACKUP_FILENAME_TEMPLATE', '{databasename}-{servername}-{datetime}.{extension}')
MEDIA_FILENAME_TEMPLATE = getattr(settings, 'DBBACKUP_MEDIA_FILENAME_TEMPLATE', '{servername}-{datetime}.{extension}')

GPG_ALWAYS_TRUST = getattr(settings, 'DBBACKUP_GPG_ALWAYS_TRUST', False)
GPG_RECIPIENT = getattr(settings, 'DBBACKUP_GPG_RECIPIENT', None)

STORAGE = getattr(settings, 'DBBACKUP_STORAGE', 'django.core.files.storage.FileSystemStorage')
STORAGE_OPTIONS = getattr(settings, 'DBBACKUP_STORAGE_OPTIONS', {})

CONNECTORS = getattr(settings, 'DBBACKUP_CONNECTORS', {})

CUSTOM_CONNECTOR_MAPPING = getattr(settings, 'DBBACKUP_CONNECTOR_MAPPING', {})

# Logging
LOGGING = getattr(settings, 'DBBACKUP_LOGGING', dbbackup.log.DEFAULT_LOGGING)
LOG_CONFIGURATOR = logging.config.DictConfigurator(LOGGING)
LOG_CONFIGURATOR.configure()

# Mail
SEND_EMAIL = getattr(settings, 'DBBACKUP_SEND_EMAIL', True)
SERVER_EMAIL = getattr(settings, 'DBBACKUP_SERVER_EMAIL', settings.SERVER_EMAIL)
FAILURE_RECIPIENTS = getattr(settings, 'DBBACKUP_FAILURE_RECIPIENTS', None)
if FAILURE_RECIPIENTS is None:
    ADMINS = getattr(settings, 'DBBACKUP_ADMIN', settings.ADMINS)
else:
    ADMINS = FAILURE_RECIPIENTS

EMAIL_SUBJECT_PREFIX = getattr(settings, 'DBBACKUP_EMAIL_SUBJECT_PREFIX', '[dbbackup] ')

django-dbbackup-3.3.0/dbbackup/checks.py

import re

from django.core.checks import Warning, register, Tags
from six import string_types

from dbbackup import settings

W001 = Warning('Invalid HOSTNAME parameter',
               hint='Set a non empty string to this settings.DBBACKUP_HOSTNAME',
               id='dbbackup.W001')
W002 = Warning('Invalid STORAGE parameter',
               hint='Set a valid path to a storage in settings.DBBACKUP_STORAGE',
               id='dbbackup.W002')
W003 = Warning('Invalid FILENAME_TEMPLATE parameter',
               hint='Include {datetime} to settings.DBBACKUP_FILENAME_TEMPLATE',
               id='dbbackup.W003')
W004 = Warning('Invalid MEDIA_FILENAME_TEMPLATE parameter',
               hint='Include {datetime} to settings.DBBACKUP_MEDIA_FILENAME_TEMPLATE',
               id='dbbackup.W004')
W005 = Warning('Invalid DATE_FORMAT parameter',
               hint='settings.DBBACKUP_DATE_FORMAT can contain only [A-Za-z0-9%_-]',
               id='dbbackup.W005')
W006 = Warning('FAILURE_RECIPIENTS has been deprecated',
               hint='settings.DBBACKUP_FAILURE_RECIPIENTS is replaced by '
                    'settings.DBBACKUP_ADMINS',
               id='dbbackup.W006')


@register(Tags.compatibility)
def check_settings(app_configs, **kwargs):
    errors = []
    if not settings.HOSTNAME:
        errors.append(W001)

    if not settings.STORAGE or not isinstance(settings.STORAGE,
string_types): errors.append(W002) if not callable(settings.FILENAME_TEMPLATE): if '{datetime}' not in settings.FILENAME_TEMPLATE: errors.append(W003) if not callable(settings.MEDIA_FILENAME_TEMPLATE): if '{datetime}' not in settings.MEDIA_FILENAME_TEMPLATE: errors.append(W004) if re.search(r'[^A-Za-z0-9%_-]', settings.DATE_FORMAT): errors.append(W005) if getattr(settings, 'FAILURE_RECIPIENTS', None) is not None: errors.append(W006) return errors django-dbbackup-3.3.0/dbbackup/tests/0000775000175000017500000000000013645400163017471 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/functional/0000775000175000017500000000000013645400163021633 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/functional/test_commands.py0000664000175000017500000002137013645400106025045 0ustar zuluzulu00000000000000import os import tempfile from mock import patch from django.test import TransactionTestCase as TestCase from django.core.management import execute_from_command_line from django.conf import settings from dbbackup.tests.utils import (TEST_DATABASE, HANDLED_FILES, clean_gpg_keys, add_public_gpg, add_private_gpg, get_dump, get_dump_name) from dbbackup.tests.testapp import models class DbBackupCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() open(TEST_DATABASE['NAME'], 'a').close() self.instance = models.CharModel.objects.create(field='foo') def tearDown(self): clean_gpg_keys() def test_database(self): argv = ['', 'dbbackup', '--database=default'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] # Test file content outputfile.seek(0) self.assertTrue(outputfile.read()) def test_encrypt(self): argv = ['', 'dbbackup', '--encrypt'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] self.assertTrue(filename.endswith('.gpg')) # Test file content outputfile = HANDLED_FILES['written_files'][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b'-----BEGIN PGP MESSAGE-----')) def test_compress(self): argv = ['', 'dbbackup', '--compress'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] self.assertTrue(filename.endswith('.gz')) def test_compress_and_encrypt(self): argv = ['', 'dbbackup', '--compress', '--encrypt'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] self.assertTrue(filename.endswith('.gz.gpg')) # Test file content outputfile = HANDLED_FILES['written_files'][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b'-----BEGIN PGP MESSAGE-----')) @patch('dbbackup.management.commands._base.input', return_value='y') class DbRestoreCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() add_private_gpg() open(TEST_DATABASE['NAME'], 'a').close() self.instance = models.CharModel.objects.create(field='foo') def tearDown(self): clean_gpg_keys() def test_restore(self, *args): # Create backup execute_from_command_line(['', 'dbbackup']) self.instance.delete() # Restore execute_from_command_line(['', 'dbrestore']) restored = models.CharModel.objects.all().exists() self.assertTrue(restored) @patch('dbbackup.utils.getpass', return_value=None) def test_encrypted(self, *args): # Create backup execute_from_command_line(['', 
'dbbackup', '--encrypt']) self.instance.delete() # Restore execute_from_command_line(['', 'dbrestore', '--decrypt']) restored = models.CharModel.objects.all().exists() self.assertTrue(restored) def test_compressed(self, *args): # Create backup execute_from_command_line(['', 'dbbackup', '--compress']) self.instance.delete() # Restore execute_from_command_line(['', 'dbrestore', '--uncompress']) def test_no_backup_available(self, *args): with self.assertRaises(SystemExit): execute_from_command_line(['', 'dbrestore']) @patch('dbbackup.utils.getpass', return_value=None) def test_available_but_not_encrypted(self, *args): # Create backup execute_from_command_line(['', 'dbbackup']) # Restore with self.assertRaises(SystemExit): execute_from_command_line(['', 'dbrestore', '--decrypt']) def test_available_but_not_compressed(self, *args): # Create backup execute_from_command_line(['', 'dbbackup']) # Restore with self.assertRaises(SystemExit): execute_from_command_line(['', 'dbrestore', '--uncompress']) def test_specify_db(self, *args): # Create backup execute_from_command_line(['', 'dbbackup', '--database', 'default']) # Test wrong name with self.assertRaises(SystemExit): execute_from_command_line(['', 'dbrestore', '--database', 'foo']) # Restore execute_from_command_line(['', 'dbrestore', '--database', 'default']) class MediaBackupCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() def tearDown(self): clean_gpg_keys() def test_encrypt(self): argv = ['', 'mediabackup', '--encrypt'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] self.assertTrue('.gpg' in filename) # Test file content outputfile = HANDLED_FILES['written_files'][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b'-----BEGIN PGP MESSAGE-----')) def test_compress(self): argv = ['', 'mediabackup', '--compress'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] self.assertTrue('.gz' in filename) @patch('dbbackup.utils.getpass', return_value=None) def test_compress_and_encrypted(self, getpass_mock): argv = ['', 'mediabackup', '--compress', '--encrypt'] execute_from_command_line(argv) self.assertEqual(1, len(HANDLED_FILES['written_files'])) filename, outputfile = HANDLED_FILES['written_files'][0] self.assertTrue('.gpg' in filename) self.assertTrue('.gz' in filename) # Test file content outputfile = HANDLED_FILES['written_files'][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b'-----BEGIN PGP MESSAGE-----')) @patch('dbbackup.management.commands._base.input', return_value='y') class MediaRestoreCommandTest(TestCase): def setUp(self): HANDLED_FILES.clean() add_public_gpg() add_private_gpg() def tearDown(self): clean_gpg_keys() self._emtpy_media() def _create_file(self, name=None): name = name or tempfile._RandomNameSequence().next() path = os.path.join(settings.MEDIA_ROOT, name) with open(path, 'a+b') as fd: fd.write(b'foo') def _emtpy_media(self): for fi in os.listdir(settings.MEDIA_ROOT): os.remove(os.path.join(settings.MEDIA_ROOT, fi)) def _is_restored(self): return bool(os.listdir(settings.MEDIA_ROOT)) def test_restore(self, *args): # Create backup self._create_file('foo') execute_from_command_line(['', 'mediabackup']) self._emtpy_media() # Restore execute_from_command_line(['', 'mediarestore']) self.assertTrue(self._is_restored()) @patch('dbbackup.utils.getpass', return_value=None) 
def test_encrypted(self, *args): # Create backup self._create_file('foo') execute_from_command_line(['', 'mediabackup', '--encrypt']) self._emtpy_media() # Restore execute_from_command_line(['', 'mediarestore', '--decrypt']) self.assertTrue(self._is_restored()) def test_compressed(self, *args): # Create backup self._create_file('foo') execute_from_command_line(['', 'mediabackup', '--compress']) self._emtpy_media() # Restore execute_from_command_line(['', 'mediarestore', '--uncompress']) self.assertTrue(self._is_restored()) def test_no_backup_available(self, *args): with self.assertRaises(SystemExit): execute_from_command_line(['', 'mediarestore']) @patch('dbbackup.utils.getpass', return_value=None) def test_available_but_not_encrypted(self, *args): # Create backup execute_from_command_line(['', 'mediabackup']) # Restore with self.assertRaises(SystemExit): execute_from_command_line(['', 'mediarestore', '--decrypt']) def test_available_but_not_compressed(self, *args): # Create backup execute_from_command_line(['', 'mediabackup']) # Restore with self.assertRaises(SystemExit): execute_from_command_line(['', 'mediarestore', '--uncompress']) django-dbbackup-3.3.0/dbbackup/tests/functional/__init__.py0000664000175000017500000000000013645400106023727 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/test_checks.py0000664000175000017500000000530313645400106022340 0ustar zuluzulu00000000000000from mock import patch from django.test import TestCase try: from dbbackup import checks from dbbackup.apps import DbbackupConfig except ImportError: checks = None def test_func(*args, **kwargs): return 'foo' class ChecksTest(TestCase): def setUp(self): if checks is None: self.skipTest("Test framework has been released in Django 1.7") def test_check(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch('dbbackup.checks.settings.HOSTNAME', '') def test_hostname_invalid(self): expected_errors = [checks.W001] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch('dbbackup.checks.settings.STORAGE', '') def test_hostname_storage(self): expected_errors = [checks.W002] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch('dbbackup.checks.settings.FILENAME_TEMPLATE', test_func) def test_filename_template_is_callable(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch('dbbackup.checks.settings.FILENAME_TEMPLATE', '{datetime}.bak') def test_filename_template_is_string(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch('dbbackup.checks.settings.FILENAME_TEMPLATE', 'foo.bak') def test_filename_template_no_date(self): expected_errors = [checks.W003] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch('dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE', test_func) def test_media_filename_template_is_callable(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch('dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE', '{datetime}.bak') def test_media_filename_template_is_string(self): self.assertFalse(checks.check_settings(DbbackupConfig)) @patch('dbbackup.checks.settings.MEDIA_FILENAME_TEMPLATE', 'foo.bak') def test_media_filename_template_no_date(self): expected_errors = [checks.W004] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch('dbbackup.checks.settings.DATE_FORMAT', 'foo@net.pt') def test_date_format_warning(self): expected_errors = [checks.W005] errors = 
checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) @patch('dbbackup.checks.settings.FAILURE_RECIPIENTS', 'foo@net.pt') def test_Failure_recipients_warning(self): expected_errors = [checks.W006] errors = checks.check_settings(DbbackupConfig) self.assertEqual(expected_errors, errors) django-dbbackup-3.3.0/dbbackup/tests/test_log.py0000664000175000017500000001036613645400106021666 0ustar zuluzulu00000000000000import logging from mock import patch import django from django.test import TestCase from django.core import mail from dbbackup import log from testfixtures import log_capture class LoggerDefaultTestCase(TestCase): @log_capture() def test_root(self, captures): logger = logging.getLogger() logger.debug('a noise') logger.info('a message') logger.warning('a warning') logger.error('an error') logger.critical('a critical error') captures.check( ('root', 'DEBUG', 'a noise'), ('root', 'INFO', 'a message'), ('root', 'WARNING', 'a warning'), ('root', 'ERROR', 'an error'), ('root', 'CRITICAL', 'a critical error'), ) @log_capture() def test_django(self, captures): logger = logging.getLogger('django') logger.debug('a noise') logger.info('a message') logger.warning('a warning') logger.error('an error') logger.critical('a critical error') if django.VERSION < (1, 9): captures.check( ('django', 'DEBUG', 'a noise'), ('django', 'INFO', 'a message'), ('django', 'WARNING', 'a warning'), ('django', 'ERROR', 'an error'), ('django', 'CRITICAL', 'a critical error'), ) else: captures.check( ('django', 'INFO', 'a message'), ('django', 'WARNING', 'a warning'), ('django', 'ERROR', 'an error'), ('django', 'CRITICAL', 'a critical error'), ) @log_capture() def test_dbbackup(self, captures): logger = logging.getLogger('dbbackup') logger.debug('a noise') logger.info('a message') logger.warning('a warning') logger.error('an error') logger.critical('a critical error') captures.check( ('dbbackup', 'INFO', 'a message'), ('dbbackup', 'WARNING', 'a warning'), ('dbbackup', 'ERROR', 'an error'), ('dbbackup', 'CRITICAL', 'a critical error'), ) @log_capture() def test_dbbackup_storage(self, captures): logger = logging.getLogger('dbbackup.storage') logger.debug('a noise') logger.info('a message') logger.warning('a warning') logger.error('an error') logger.critical('a critical error') captures.check( ('dbbackup.storage', 'INFO', 'a message'), ('dbbackup.storage', 'WARNING', 'a warning'), ('dbbackup.storage', 'ERROR', 'an error'), ('dbbackup.storage', 'CRITICAL', 'a critical error'), ) @log_capture() def test_other_module(self, captures): logger = logging.getLogger('os.path') logger.debug('a noise') logger.info('a message') logger.warning('a warning') logger.error('an error') logger.critical('a critical error') captures.check( ('os.path', 'DEBUG', 'a noise'), ('os.path', 'INFO', 'a message'), ('os.path', 'WARNING', 'a warning'), ('os.path', 'ERROR', 'an error'), ('os.path', 'CRITICAL', 'a critical error'), ) class DbbackupAdminEmailHandlerTest(TestCase): def setUp(self): self.logger = logging.getLogger('dbbackup') @patch('dbbackup.settings.SEND_EMAIL', True) def test_send_mail(self): # Test mail error msg = "Super msg" self.logger.error(msg) self.assertEqual(mail.outbox[0].subject, '[dbbackup] ERROR: Super msg') # Test don't mail below self.logger.warning(msg) self.assertEqual(len(mail.outbox), 1) @patch('dbbackup.settings.SEND_EMAIL', False) def test_send_mail_is_false(self): msg = "Super msg" self.logger.error(msg) self.assertEqual(len(mail.outbox), 0) class MailEnabledFilterTest(TestCase): 
@patch('dbbackup.settings.SEND_EMAIL', True) def test_filter_is_true(self): filter_ = log.MailEnabledFilter() self.assertTrue(filter_.filter('foo')) @patch('dbbackup.settings.SEND_EMAIL', False) def test_filter_is_false(self): filter_ = log.MailEnabledFilter() self.assertFalse(filter_.filter('foo')) django-dbbackup-3.3.0/dbbackup/tests/testapp/0000775000175000017500000000000013645400163021151 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/models.py0000664000175000017500000000113313645400106023001 0ustar zuluzulu00000000000000from __future__ import unicode_literals from django.db import models ___all__ = ('CharModel', 'IntegerModel', 'TextModel', 'BooleanModel' 'DateModel', 'DateTimeModel', 'ForeignKeyModel', 'ManyToManyModel', 'FileModel', 'TestModel',) class CharModel(models.Model): field = models.CharField(max_length=10) class ForeignKeyModel(models.Model): field = models.ForeignKey(CharModel, on_delete=models.CASCADE) class ManyToManyModel(models.Model): field = models.ManyToManyField(CharModel) class FileModel(models.Model): field = models.FileField(upload_to='.') django-dbbackup-3.3.0/dbbackup/tests/testapp/migrations/0000775000175000017500000000000013645400163023325 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/migrations/0001_initial.py0000664000175000017500000000257113645400106025772 0ustar zuluzulu00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals from django.db import models, migrations class Migration(migrations.Migration): dependencies = [ ] operations = [ migrations.CreateModel( name='CharModel', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('field', models.CharField(max_length=10)), ], ), migrations.CreateModel( name='FileModel', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('field', models.FileField(upload_to='.')), ], ), migrations.CreateModel( name='ForeignKeyModel', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True,)), ('field', models.ForeignKey(to='testapp.CharModel', on_delete=models.CASCADE)), ], ), migrations.CreateModel( name='ManyToManyModel', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('field', models.ManyToManyField(to='testapp.CharModel')), ], ), ] django-dbbackup-3.3.0/dbbackup/tests/testapp/migrations/__init__.py0000664000175000017500000000000013645400106025421 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/urls.py0000664000175000017500000000046113645400106022506 0ustar zuluzulu00000000000000try: from django.conf.urls import patterns, include, url urlpatterns = patterns( '', # url(r'^admin/', include(admin.site.urls)), ) except ImportError: from django.conf.urls import include, url urlpatterns = ( # url(r'^admin/', include(admin.site.urls)), ) django-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/0000775000175000017500000000000013645400163022252 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/test.txt.gz.gpg0000664000175000017500000000035513645400106025165 0ustar zuluzulu00000000000000„Œ,ݬ·³²Qsý=«¨xi½?á—ÿ+ìEžs‚Ÿš?•¼FÉnüGw/ÂV§w*,ÊP˜Ü`CZ(VŠÁxc Þo7CoæZÞ’®Õ-H••å8H|s~!dè¼J™¬°è}Áu'Mô¨8wû"›¦§lù?Êv¶¡RÒv{=Ä{PŽÒ]"c‡ÊjG©÷§¦~D+/xá]†éi€ pSܼü~`Ü#ƒ\öœ«glc2Ü—ÚG<À€é$·O¨¨ÌZbrÖÊö͇¡äˆx 
YèÃ-CùqÁ‚.át·tAdjango-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/test.txt.tar0000664000175000017500000000004113645400106024547 0ustar zuluzulu00000000000000‹•U±Utest.txtKËÏç¨e2~django-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/test.txt.gpg0000664000175000017500000000032713645400106024545 0ustar zuluzulu00000000000000„Œ,ݬ·³²Qsþ2ÄþV!A(òÕÈ:ïÂK-IŒg¶•ýäXg¥7¥M¤“Á__Ž£ô/­ ùB3—‰e£ÊáãQcjESþv5 ÈHÒü}ú Êc:UiœÏéä õúd§Ü@Øêu¯°°‡Ù˜øwfðòÏ7 ¯RÙ_¶÷i¯dY ª;ÒGå›sX³Õ­Qµø#kx=<ñ±U§bÛ-<=×¼'¶«äá3ýÏ®‹Ußç:ìÌNCWÌ„©°•ôN ?ƒ¥¹¤ÌEdjango-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/test.txt.gz0000664000175000017500000000004113645400106024401 0ustar zuluzulu00000000000000‹•U±Utest.txtKËÏç¨e2~django-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/test.gz0000664000175000017500000000003013645400106023561 0ustar zuluzulu00000000000000‹ä2±U342æý‚Zdjango-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/gpg/0000775000175000017500000000000013645400163023027 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/gpg/secring.gpg0000775000175000017500000000245513645400106025166 0ustar zuluzulu00000000000000•ØU®¿†øÊ±ÌÁÛ(8ªÊ9 æÐC£78Ö4¡WKccœ?6{OJªiSCS+³æaÓÍi‚¢¿âà6÷LÑÝÅæHÈ€!?¡«wÿèö”Ï16Ž–k¦+ƒŽ¿¹?ï$œˆk”z½ẽ>âLý4Ð.Zl#‘ù_…|6Ce„2 ƒþ Qši½îÇdACQ^_<½ÀÎwZ¶0Ãì:%óbéM[¹lО@Âthâ>æ*·s½F}­YÙ„®dcÈ”!Zçúöò9š¸ÀGrËð¶Ñ5Ø„ïìÅå…_A y…7Þ° f}2V§æ×cFäúsikrÕe£Ü&}^©ùcºMá–¶·dÆ€ÁD´”ùªùÏ;Bi–"Ëà$ßcTTmOï?äwgêßû’)“¦2‹ÍöÆr_—´O.c_o[ÿb鄸Hq^KM„ˆúâpP«'“¶ƒRvGJ‚*ã&ÑëUØzZ‹ú¾ÑRD©‰s¸Ð‚íI8 …ÎCùý¼©C|+£ ÷ “±6»ù¬õC–…¶°ÙFwCŒî~#ì8á®o€Gž—u}µÛÒXš¬²ŽŽ#vâ ¡¤”´Tester ˆ¸"U®¿†  € ÷Ñ»ðcýé€hÿq,g©¡»ä·£ƒ‚èj~D[úœ×£: ’»ãËuÌT2? Œ_ü"p–óϵ–Õz‚¤ôcKD‡ƒÈdHaB9¥éÃÔZ@[|K˜±Y) ƒà _EËkûVw'µ_ü#yg˜OõÃó¯vp²¢w~ûT¬Ü·†p¢«L °ØU®¿†àñïÌYTe™›T¬ùOl#vIþñãŸý޳Ôâï¼½ÓËf£SsIJǷ÷ÅZQa²Øë„Fÿq)A‡©Ä¸’¨¤e ˜‡€5úÅÍO©ÛзÌûa·Pù5`¯]¿–ñP”ß" Ä„´æÖ€kóü…~¸èÁ ¡•üÄØFŸl:Hä¦Íô…\MåÞ&b]ÉÏ©3€×C¿æÏ.ü«ß‹¾Ó‚!‚/š—&uMÄCô†ùX¡p‹O.›Aclûß%ÅÄwÚ7Úòîf#<4cu˜'+¬§4‚4>“ õLY@Ú¬k´òûÀAí"C´%Ï¡o+Õâ‰õ’£¤:ñ¢«•˦£CD=ÖóeQ@»"Gtå®ÿ{-—3zØev•{¶ÿ–ÄAò×kúQŠM¹¾©Ä²­(‡à:©y¹÷•}•WõxêcáCæ]ãr­­.VعíPjÂâëh\ÀѾΖΓ3ü ($•eºÉ gÆè“X¼nüÖ2Ö©Ü»pxäiä“7Åe­Â»"Û¦ú…ŠöìÊðËö>^]wÕ„ó÷âú=ªˆŸ U®¿† ÷Ñ»ðcýé/èü µƒ†* ºsÿU¨Ü?~Õ²ÿu°¸zú¦ìã¢pìÝ{mÃõqý3]ÓÀêô÷I!Enõ‚j*€¼¦Íï3W×#å$r<‰þìך?!Þë\X/\ kÌŽ~wj4 “S“)1'Lc°G»W®‰à (úÆ9ÛúèˆPZ°django-dbbackup-3.3.0/dbbackup/tests/testapp/blobs/gpg/pubring.gpg0000775000175000017500000000122513645400106025174 0ustar zuluzulu00000000000000˜U®¿†øÊ±ÌÁÛ(8ªÊ9 æÐC£78Ö4¡WKccœ?6{OJªiSCS+³æaÓÍi‚¢¿âà6÷LÑÝÅæHÈ€!?¡«wÿèö”Ï16Ž–k¦+ƒŽ¿¹?ï$œˆk”z½ẽ>âLý4Ð.Zl#‘ù_…|6Ce„2 ƒ´Tester ˆ¸"U®¿†  € ÷Ñ»ðcýé€hÿq,g©¡»ä·£ƒ‚èj~D[úœ×£: ’»ãËuÌT2? Œ_ü"p–óϵ–Õz‚¤ôcKD‡ƒÈdHaB9¥éÃÔZ@[|K˜±Y) ƒà _EËkûVw'µ_ü#yg˜OõÃó¯vp²¢w~ûT¬Ü·†p¢«L °¸U®¿†àñïÌYTe™›T¬ùOl#vIþñãŸý޳Ôâï¼½ÓËf£SsIJǷ÷ÅZQa²Øë„Fÿq)A‡©Ä¸’¨¤e ˜‡€5úÅÍO©ÛзÌûa·Pù5`¯]¿–ñP”ß" Ä„´æÖ€kóˆŸ U®¿† ÷Ñ»ðcýé/èü µƒ†* ºsÿU¨Ü?~Õ²ÿu°¸zú¦ìã¢pìÝ{mÃõqý3]ÓÀêô÷I!Enõ‚j*€¼¦Íï3W×#å$r<‰þìך?!Þë\X/\ kÌŽ~wj4 “S“)1'Lc°G»W®‰à (úÆ9ÛúèˆPZ°django-dbbackup-3.3.0/dbbackup/tests/testapp/views.py0000664000175000017500000000007713645400106022661 0ustar zuluzulu00000000000000from django.shortcuts import render # Create your views here. 
django-dbbackup-3.3.0/dbbackup/tests/testapp/__init__.py0000664000175000017500000000000013645400106023245 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/management/0000775000175000017500000000000013645400163023265 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/management/commands/0000775000175000017500000000000013645400163025066 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/management/commands/count.py0000664000175000017500000000037413645400106026571 0ustar zuluzulu00000000000000from django.core.management.base import BaseCommand from dbbackup.tests.testapp.models import CharModel class Command(BaseCommand): help = "Count things" def handle(self, **options): self.stdout.write(str(CharModel.objects.count())) django-dbbackup-3.3.0/dbbackup/tests/testapp/management/commands/__init__.py0000664000175000017500000000000013645400106027162 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/testapp/management/commands/feed.py0000664000175000017500000000047113645400106026342 0ustar zuluzulu00000000000000from django.core.management.base import BaseCommand from dbbackup.tests.testapp.models import CharModel from dbbackup.tests.utils import FakeStorage class Command(BaseCommand): help = "Count things" def handle(self, **options): for st in 'abcde': CharModel.objects.create(field=st) django-dbbackup-3.3.0/dbbackup/tests/testapp/management/__init__.py0000664000175000017500000000000013645400106025361 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/settings.py0000664000175000017500000000267413645400106021711 0ustar zuluzulu00000000000000""" Configuration and launcher for dbbackup tests. """ import os import tempfile DEBUG = False BASE_DIR = os.path.dirname(os.path.abspath(__file__)) TESTAPP_DIR = os.path.join(BASE_DIR, 'testapp/') BLOB_DIR = os.path.join(TESTAPP_DIR, 'blobs/') ADMINS = ( ('ham', 'foo@bar'), ) ALLOWED_HOSTS = ['*'] MIDDLEWARE_CLASSES = () ROOT_URLCONF = 'dbbackup.tests.testapp.urls' SECRET_KEY = "it's a secret to everyone" SITE_ID = 1 MEDIA_ROOT = os.environ.get('MEDIA_ROOT') or tempfile.mkdtemp() INSTALLED_APPS = ( 'dbbackup', 'dbbackup.tests.testapp', ) DATABASES = {'default': { "ENGINE": os.environ.get('DB_ENGINE', "django.db.backends.sqlite3"), "NAME": os.environ.get('DB_NAME', ":memory:"), "USER": os.environ.get('DB_USER'), "PASSWORD": os.environ.get('DB_PASSWORD'), "HOST": os.environ.get('DB_HOST'), }} if os.environ.get('CONNECTOR'): CONNECTOR = {'CONNECTOR': os.environ['CONNECTOR']} DBBACKUP_CONNECTORS = {'default': CONNECTOR} CACHES = { 'default': { 'BACKEND': 'django.core.cache.backends.locmem.LocMemCache', } } SERVER_EMAIL = 'dbbackup@test.org' DBBACKUP_GPG_RECIPIENT = "test@test" DBBACKUP_GPG_ALWAYS_TRUST = True, DBBACKUP_STORAGE = os.environ.get('STORAGE', 'dbbackup.tests.utils.FakeStorage') DBBACKUP_STORAGE_OPTIONS = dict([keyvalue.split('=') for keyvalue in os.environ.get('STORAGE_OPTIONS', '').split(',') if keyvalue]) django-dbbackup-3.3.0/dbbackup/tests/test_utils.py0000664000175000017500000002173313645400106022245 0ustar zuluzulu00000000000000import os import pytz import tempfile from mock import patch from datetime import datetime import django from django.test import TestCase from django.core import mail from six import StringIO from dbbackup import utils, settings from dbbackup.tests.utils import (ENCRYPTED_FILE, clean_gpg_keys, add_private_gpg, COMPRESSED_FILE, callable_for_filename_template, DEV_NULL, add_public_gpg) class Bytes_To_StrTest(TestCase): def 
test_get_gb(self): value = utils.bytes_to_str(byteVal=2**31) self.assertEqual(value, "2.0 GiB") def test_0_decimal(self): value = utils.bytes_to_str(byteVal=1.01, decimals=0) self.assertEqual(value, "1 B") def test_2_decimal(self): value = utils.bytes_to_str(byteVal=1.01, decimals=2) self.assertEqual(value, "1.01 B") class Handle_SizeTest(TestCase): def test_func(self): filehandle = StringIO('Test string') value = utils.handle_size(filehandle=filehandle) self.assertEqual(value, '11.0 B') class MailAdminsTest(TestCase): def test_func(self): subject = 'foo subject' msg = 'bar message' utils.mail_admins(subject, msg) self.assertEqual(len(mail.outbox), 1) sent_mail = mail.outbox[0] expected_subject = '%s%s' % (settings.EMAIL_SUBJECT_PREFIX, subject) expected_to = settings.ADMINS[0][1] expected_from = settings.SERVER_EMAIL self.assertEqual(sent_mail.subject, expected_subject) self.assertEqual(sent_mail.body, msg) self.assertEqual(sent_mail.to[0], expected_to) self.assertEqual(sent_mail.from_email, expected_from) @patch('dbbackup.settings.ADMINS', None) def test_no_admin(self): subject = 'foo subject' msg = 'bar message' self.assertIsNone(utils.mail_admins(subject, msg)) self.assertEqual(len(mail.outbox), 0) class Email_Uncaught_ExceptionTest(TestCase): def test_success(self): def func(): pass utils.email_uncaught_exception(func) self.assertEqual(len(mail.outbox), 0) @patch('dbbackup.settings.SEND_EMAIL', False) def test_raise_error_without_mail(self): def func(): raise Exception('Foo') with self.assertRaises(Exception): utils.email_uncaught_exception(func)() self.assertEqual(len(mail.outbox), 0) @patch('dbbackup.settings.SEND_EMAIL', True) @patch('dbbackup.settings.FAILURE_RECIPIENTS', ['foo@bar']) def test_raise_with_mail(self): def func(): raise Exception('Foo') with self.assertRaises(Exception): utils.email_uncaught_exception(func)() self.assertEqual(len(mail.outbox), 1) error_mail = mail.outbox[0] self.assertEqual(['foo@bar'], error_mail.to) self.assertIn("Exception('Foo')", error_mail.subject) if django.VERSION >= (1, 7): self.assertIn("Exception('Foo')", error_mail.body) class Encrypt_FileTest(TestCase): def setUp(self): self.path = tempfile.mktemp() with open(self.path, 'a') as fd: fd.write('foo') add_public_gpg() def tearDown(self): os.remove(self.path) clean_gpg_keys() def test_func(self, *args): with open(self.path) as fd: encrypted_file, filename = utils.encrypt_file(inputfile=fd, filename='foo.txt') encrypted_file.seek(0) self.assertTrue(encrypted_file.read()) class Unencrypt_FileTest(TestCase): def setUp(self): add_private_gpg() def tearDown(self): clean_gpg_keys() @patch('dbbackup.utils.input', return_value=None) @patch('dbbackup.utils.getpass', return_value=None) def test_unencrypt(self, *args): inputfile = open(ENCRYPTED_FILE, 'r+b') uncryptfile, filename = utils.unencrypt_file(inputfile, 'foofile.gpg') uncryptfile.seek(0) self.assertEqual(b'foo\n', uncryptfile.read()) class Compress_FileTest(TestCase): def setUp(self): self.path = tempfile.mktemp() with open(self.path, 'a+b') as fd: fd.write(b'foo') def tearDown(self): os.remove(self.path) def test_func(self, *args): with open(self.path) as fd: compressed_file, filename = utils.encrypt_file(inputfile=fd, filename='foo.txt') class Uncompress_FileTest(TestCase): def test_func(self): inputfile = open(COMPRESSED_FILE, 'rb') fd, filename = utils.uncompress_file(inputfile, 'foo.gz') fd.seek(0) self.assertEqual(fd.read(), b'foo\n') class Create_Spooled_Temporary_FileTest(TestCase): def setUp(self): self.path = tempfile.mktemp() with 
open(self.path, 'a') as fd: fd.write('foo') def tearDown(self): os.remove(self.path) def test_func(self, *args): utils.create_spooled_temporary_file(filepath=self.path) class TimestampTest(TestCase): def test_naive_value(self): with self.settings(USE_TZ=False): timestamp = utils.timestamp(datetime(2015, 8, 15, 8, 15, 12, 0)) self.assertEqual(timestamp, '2015-08-15-081512') def test_aware_value(self): with self.settings(USE_TZ=True) and self.settings(TIME_ZONE='Europe/Rome'): timestamp = utils.timestamp(datetime(2015, 8, 15, 8, 15, 12, 0, tzinfo=pytz.utc)) self.assertEqual(timestamp, '2015-08-15-101512') class Datefmt_To_Regex(TestCase): def test_patterns(self): now = datetime.now() for datefmt, regex in utils.PATTERN_MATCHNG: date_string = datetime.strftime(now, datefmt) regex = utils.datefmt_to_regex(datefmt) match = regex.match(date_string) self.assertTrue(match) self.assertEqual(match.groups()[0], date_string) def test_complex_pattern(self): now = datetime.now() datefmt = 'Foo%a_%A-%w-%d-%b-%B_%m_%y_%Y-%H-%I-%M_%S_%f_%j-%U-%W-Bar' date_string = datetime.strftime(now, datefmt) regex = utils.datefmt_to_regex(datefmt) self.assertTrue(regex.pattern.startswith('(Foo')) self.assertTrue(regex.pattern.endswith('Bar)')) match = regex.match(date_string) self.assertTrue(match) self.assertEqual(match.groups()[0], date_string) class Filename_To_DatestringTest(TestCase): def test_func(self): now = datetime.now() datefmt = settings.DATE_FORMAT filename = '%s-foo.gz.gpg' % datetime.strftime(now, datefmt) datestring = utils.filename_to_datestring(filename, datefmt) self.assertIn(datestring, filename) def test_generated_filename(self): filename = utils.filename_generate('bak', 'default') datestring = utils.filename_to_datestring(filename) self.assertIn(datestring, filename) class Filename_To_DateTest(TestCase): def test_func(self): now = datetime.now() datefmt = settings.DATE_FORMAT filename = '%s-foo.gz.gpg' % datetime.strftime(now, datefmt) date = utils.filename_to_date(filename, datefmt) self.assertEqual(date.timetuple()[:5], now.timetuple()[:5]) def test_generated_filename(self): filename = utils.filename_generate('bak', 'default') datestring = utils.filename_to_date(filename) @patch('dbbackup.settings.HOSTNAME', 'test') class Filename_GenerateTest(TestCase): @patch('dbbackup.settings.FILENAME_TEMPLATE', '---{databasename}--{servername}-{datetime}.{extension}') def test_func(self, *args): extension = 'foo' generated_name = utils.filename_generate(extension) self.assertTrue('--' not in generated_name) self.assertFalse(generated_name.startswith('-')) def test_db(self, *args): extension = 'foo' generated_name = utils.filename_generate(extension) self.assertTrue(generated_name.startswith(settings.HOSTNAME)) self.assertTrue(generated_name.endswith(extension)) def test_media(self, *args): extension = 'foo' generated_name = utils.filename_generate(extension, content_type='media') self.assertTrue(generated_name.startswith(settings.HOSTNAME)) self.assertTrue(generated_name.endswith(extension)) @patch('django.utils.timezone.settings.USE_TZ', True) def test_tz_true(self): filename = utils.filename_generate('bak', 'default') datestring = utils.filename_to_datestring(filename) self.assertIn(datestring, filename) @patch('dbbackup.settings.FILENAME_TEMPLATE', callable_for_filename_template) def test_template_is_callable(self, *args): extension = 'foo' generated_name = utils.filename_generate(extension) self.assertTrue(generated_name.endswith('foo')) class QuoteCommandArg(TestCase): def test_arg_with_space(self): 
assert utils.get_escaped_command_arg('foo bar') == '\'foo bar\'' django-dbbackup-3.3.0/dbbackup/tests/test_storage.py0000664000175000017500000001360713645400106022552 0ustar zuluzulu00000000000000from mock import patch from django.test import TestCase from dbbackup.storage import get_storage, Storage from dbbackup.tests.utils import HANDLED_FILES, FakeStorage from dbbackup import utils DEFAULT_STORAGE_PATH = 'django.core.files.storage.FileSystemStorage' STORAGE_OPTIONS = {'location': '/tmp'} class Get_StorageTest(TestCase): @patch('dbbackup.settings.STORAGE', DEFAULT_STORAGE_PATH) @patch('dbbackup.settings.STORAGE_OPTIONS', STORAGE_OPTIONS) def test_func(self, *args): self.assertIsInstance(get_storage(), Storage) def test_set_path(self): fake_storage_path = 'dbbackup.tests.utils.FakeStorage' storage = get_storage(fake_storage_path) self.assertIsInstance(storage.storage, FakeStorage) @patch('dbbackup.settings.STORAGE', DEFAULT_STORAGE_PATH) def test_set_options(self, *args): storage = get_storage(options=STORAGE_OPTIONS) self.assertEqual(storage.storage.__module__, 'django.core.files.storage') class StorageTest(TestCase): def setUp(self): self.storageCls = Storage self.storageCls.name = 'foo' self.storage = Storage() class StorageListBackupsTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.storage = get_storage() # foodb files HANDLED_FILES['written_files'] += [ (utils.filename_generate(ext, 'foodb'), None) for ext in ('db', 'db.gz', 'db.gpg', 'db.gz.gpg') ] HANDLED_FILES['written_files'] += [ (utils.filename_generate(ext, 'hamdb', 'fooserver'), None) for ext in ('db', 'db.gz', 'db.gpg', 'db.gz.gpg') ] # Media file HANDLED_FILES['written_files'] += [ (utils.filename_generate(ext, None, None, 'media'), None) for ext in ('tar', 'tar.gz', 'tar.gpg', 'tar.gz.gpg') ] HANDLED_FILES['written_files'] += [ (utils.filename_generate(ext, 'bardb', 'barserver'), None) for ext in ('db', 'db.gz', 'db.gpg', 'db.gz.gpg') ] # barserver files HANDLED_FILES['written_files'] += [ ('file_without_date', None) ] def test_nofilter(self): files = self.storage.list_backups() self.assertEqual(len(HANDLED_FILES['written_files'])-1, len(files)) for file in files: self.assertNotEqual('file_without_date', file) def test_encrypted(self): files = self.storage.list_backups(encrypted=True) for file in files: self.assertIn('.gpg', file) def test_compressed(self): files = self.storage.list_backups(compressed=True) for file in files: self.assertIn('.gz', file) def test_not_encrypted(self): files = self.storage.list_backups(encrypted=False) for file in files: self.assertNotIn('.gpg', file) def test_not_compressed(self): files = self.storage.list_backups(compressed=False) for file in files: self.assertNotIn('.gz', file) def test_content_type_db(self): files = self.storage.list_backups(content_type='db') for file in files: self.assertIn('.db', file) def test_database(self): files = self.storage.list_backups(database='foodb') for file in files: self.assertIn('foodb', file) self.assertNotIn('bardb', file) self.assertNotIn('hamdb', file) def test_servername(self): files = self.storage.list_backups(servername='fooserver') for file in files: self.assertIn('fooserver', file) self.assertNotIn('barserver', file) files = self.storage.list_backups(servername='barserver') for file in files: self.assertIn('barserver', file) self.assertNotIn('fooserver', file) def test_content_type_media(self): files = self.storage.list_backups(content_type='media') for file in files: self.assertIn('.tar', file) # def test_servername(self): # files = 
self.storage.list_backups(servername='barserver') # for file in files: # self.assertIn('barserver', file) class StorageGetLatestTest(TestCase): def setUp(self): self.storage = get_storage() HANDLED_FILES['written_files'] = [(f, None) for f in [ '2015-02-06-042810.bak', '2015-02-07-042810.bak', '2015-02-08-042810.bak', ]] def tearDown(self): HANDLED_FILES.clean() def test_func(self): filename = self.storage.get_latest_backup() self.assertEqual(filename, '2015-02-08-042810.bak') class StorageGetMostRecentTest(TestCase): def setUp(self): self.storage = get_storage() HANDLED_FILES['written_files'] = [(f, None) for f in [ '2015-02-06-042810.bak', '2015-02-07-042810.bak', '2015-02-08-042810.bak', ]] def tearDown(self): HANDLED_FILES.clean() def test_func(self): filename = self.storage.get_older_backup() self.assertEqual(filename, '2015-02-06-042810.bak') def keep_only_even_files(filename): from dbbackup.utils import filename_to_date return filename_to_date(filename).day % 2 == 0 class StorageCleanOldBackupsTest(TestCase): def setUp(self): self.storage = get_storage() HANDLED_FILES.clean() HANDLED_FILES['written_files'] = [(f, None) for f in [ '2015-02-06-042810.bak', '2015-02-07-042810.bak', '2015-02-08-042810.bak', ]] def test_func(self): self.storage.clean_old_backups(keep_number=1) self.assertEqual(2, len(HANDLED_FILES['deleted_files'])) @patch('dbbackup.settings.CLEANUP_KEEP_FILTER', keep_only_even_files) def test_keep_filter(self): self.storage.clean_old_backups(keep_number=1) self.assertListEqual(['2015-02-07-042810.bak'], HANDLED_FILES['deleted_files'])django-dbbackup-3.3.0/dbbackup/tests/commands/0000775000175000017500000000000013645400163021272 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/commands/test_listbackups.py0000664000175000017500000000703313645400106025227 0ustar zuluzulu00000000000000from mock import patch from django.test import TestCase from django.core.management import execute_from_command_line from six import StringIO from dbbackup.management.commands.listbackups import Command as ListbackupsCommand from dbbackup.storage import get_storage from dbbackup.tests.utils import HANDLED_FILES class ListbackupsCommandTest(TestCase): def setUp(self): self.command = ListbackupsCommand() self.command.storage = get_storage() HANDLED_FILES['written_files'] = [(f, None) for f in [ '2015-02-06-042810.bak', '2015-02-07-042810.bak', '2015-02-08-042810.bak', ]] def test_get_backup_attrs(self): options = {} attrs = self.command.get_backup_attrs(options) self.assertEqual(len(HANDLED_FILES['written_files']), len(attrs)) class ListbackupsCommandArgComputingTest(TestCase): def setUp(self): HANDLED_FILES['written_files'] = [(f, None) for f in [ '2015-02-06-042810_foo.db', '2015-02-06-042810_foo.db.gz', '2015-02-06-042810_foo.db.gpg', '2015-02-06-042810_foo.db.gz.gpg', '2015-02-06-042810_foo.tar', '2015-02-06-042810_foo.tar.gz', '2015-02-06-042810_foo.tar.gpg', '2015-02-06-042810_foo.tar.gz.gpg', '2015-02-06-042810_bar.db', '2015-02-06-042810_bar.db.gz', '2015-02-06-042810_bar.db.gpg', '2015-02-06-042810_bar.db.gz.gpg', '2015-02-06-042810_bar.tar', '2015-02-06-042810_bar.tar.gz', '2015-02-06-042810_bar.tar.gpg', '2015-02-06-042810_bar.tar.gz.gpg', ]] def test_list(self): execute_from_command_line(['', 'listbackups']) def test_filter_encrypted(self): stdout = StringIO() with patch('sys.stdout', stdout): execute_from_command_line(['', 'listbackups', '--encrypted', '-q']) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn('.gpg', line) def 
test_filter_not_encrypted(self): stdout = StringIO() with patch('sys.stdout', stdout): execute_from_command_line(['', 'listbackups', '--not-encrypted', '-q']) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertNotIn('.gpg', line) def test_filter_compressed(self): stdout = StringIO() with patch('sys.stdout', stdout): execute_from_command_line(['', 'listbackups', '--compressed', '-q']) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn('.gz', line) def test_filter_not_compressed(self): stdout = StringIO() with patch('sys.stdout', stdout): execute_from_command_line(['', 'listbackups', '--not-compressed', '-q']) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertNotIn('.gz', line) def test_filter_db(self): stdout = StringIO() with patch('sys.stdout', stdout): execute_from_command_line(['', 'listbackups', '--content-type', 'db', '-q']) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn('.db', line) def test_filter_media(self): stdout = StringIO() with patch('sys.stdout', stdout): execute_from_command_line(['', 'listbackups', '--content-type', 'media', '-q']) stdout.seek(0) stdout.readline() for line in stdout.readlines(): self.assertIn('.tar', line) django-dbbackup-3.3.0/dbbackup/tests/commands/test_base.py0000664000175000017500000001374213645400106023621 0ustar zuluzulu00000000000000""" Tests for base command class. """ import os import logging from mock import patch from django.test import TestCase import six from django.core.files import File from dbbackup.management.commands._base import BaseDbBackupCommand from dbbackup.storage import get_storage from dbbackup.tests.utils import DEV_NULL, HANDLED_FILES class BaseDbBackupCommandSetLoggerLevelTest(TestCase): def setUp(self): self.command = BaseDbBackupCommand() def test_0_level(self): self.command.verbosity = 0 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.WARNING) def test_1_level(self): self.command.verbosity = 1 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.INFO) def test_2_level(self): self.command.verbosity = 2 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.DEBUG) def test_3_level(self): self.command.verbosity = 3 self.command._set_logger_level() self.assertEqual(self.command.logger.level, logging.DEBUG) def test_quiet(self): self.command.quiet = True self.command._set_logger_level() self.assertGreater(self.command.logger.level, logging.ERROR) class BaseDbBackupCommandMethodsTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.command = BaseDbBackupCommand() self.command.storage = get_storage() def test_read_from_storage(self): HANDLED_FILES['written_files'].append(['foo', File(six.BytesIO(b'bar'))]) file_ = self.command.read_from_storage('foo') self.assertEqual(file_.read(), b'bar') def test_write_to_storage(self): self.command.write_to_storage(six.BytesIO(b'foo'), 'bar') self.assertEqual(HANDLED_FILES['written_files'][0][0], 'bar') def test_read_local_file(self): # setUp self.command.path = '/tmp/foo.bak' open(self.command.path, 'w').close() # Test output_file = self.command.read_local_file(self.command.path) # tearDown os.remove(self.command.path) def test_write_local_file(self): fd, path = File(six.BytesIO(b"foo")), '/tmp/foo.bak' self.command.write_local_file(fd, path) self.assertTrue(os.path.exists(path)) # tearDown os.remove(path) def test_ask_confirmation(self): # Yes with 
patch('dbbackup.management.commands._base.input', return_value='y'): self.command._ask_confirmation() with patch('dbbackup.management.commands._base.input', return_value='Y'): self.command._ask_confirmation() with patch('dbbackup.management.commands._base.input', return_value=''): self.command._ask_confirmation() with patch('dbbackup.management.commands._base.input', return_value='foo'): self.command._ask_confirmation() # No with patch('dbbackup.management.commands._base.input', return_value='n'): with self.assertRaises(SystemExit): self.command._ask_confirmation() with patch('dbbackup.management.commands._base.input', return_value='N'): with self.assertRaises(SystemExit): self.command._ask_confirmation() with patch('dbbackup.management.commands._base.input', return_value='No'): with self.assertRaises(SystemExit): self.command._ask_confirmation() class BaseDbBackupCommandCleanupOldBackupsTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.command = BaseDbBackupCommand() self.command.stdout = DEV_NULL self.command.encrypt = False self.command.compress = False self.command.servername = 'foo-server' self.command.storage = get_storage() HANDLED_FILES['written_files'] = [(f, None) for f in [ 'fooserver-2015-02-06-042810.tar', 'fooserver-2015-02-07-042810.tar', 'fooserver-2015-02-08-042810.tar', 'foodb-fooserver-2015-02-06-042810.dump', 'foodb-fooserver-2015-02-07-042810.dump', 'foodb-fooserver-2015-02-08-042810.dump', 'bardb-fooserver-2015-02-06-042810.dump', 'bardb-fooserver-2015-02-07-042810.dump', 'bardb-fooserver-2015-02-08-042810.dump', 'hamdb-hamserver-2015-02-06-042810.dump', 'hamdb-hamserver-2015-02-07-042810.dump', 'hamdb-hamserver-2015-02-08-042810.dump', ]] @patch('dbbackup.settings.CLEANUP_KEEP', 1) def test_clean_db(self): self.command.content_type = 'db' self.command.database = 'foodb' self.command._cleanup_old_backups(database='foodb') self.assertEqual(2, len(HANDLED_FILES['deleted_files'])) self.assertNotIn('foodb-fooserver-2015-02-08-042810.dump', HANDLED_FILES['deleted_files']) @patch('dbbackup.settings.CLEANUP_KEEP', 1) def test_clean_other_db(self): self.command.content_type = 'db' self.command._cleanup_old_backups(database='bardb') self.assertEqual(2, len(HANDLED_FILES['deleted_files'])) self.assertNotIn('bardb-fooserver-2015-02-08-042810.dump', HANDLED_FILES['deleted_files']) @patch('dbbackup.settings.CLEANUP_KEEP', 1) def test_clean_other_server_db(self): self.command.content_type = 'db' self.command._cleanup_old_backups(database='bardb') self.assertEqual(2, len(HANDLED_FILES['deleted_files'])) self.assertNotIn('bardb-fooserver-2015-02-08-042810.dump', HANDLED_FILES['deleted_files']) @patch('dbbackup.settings.CLEANUP_KEEP_MEDIA', 1) def test_clean_media(self): self.command.content_type = 'media' self.command._cleanup_old_backups() self.assertEqual(2, len(HANDLED_FILES['deleted_files'])) self.assertNotIn('foo-server-2015-02-08-042810.tar', HANDLED_FILES['deleted_files']) django-dbbackup-3.3.0/dbbackup/tests/commands/__init__.py0000664000175000017500000000000013645400106023366 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/commands/test_dbrestore.py0000664000175000017500000001233113645400106024671 0ustar zuluzulu00000000000000""" Tests for dbrestore command. 
""" from mock import patch from tempfile import mktemp from shutil import copyfileobj from django.test import TestCase from django.core.management.base import CommandError from django.core.files import File from django.conf import settings from dbbackup import utils from dbbackup.db.base import get_connector from dbbackup.db.mongodb import MongoDumpConnector from dbbackup.management.commands.dbrestore import Command as DbrestoreCommand from dbbackup.storage import get_storage from dbbackup.settings import HOSTNAME from dbbackup.tests.utils import (TEST_DATABASE, add_private_gpg, DEV_NULL, clean_gpg_keys, HANDLED_FILES, TEST_MONGODB, TARED_FILE, get_dump, get_dump_name) @patch('dbbackup.management.commands._base.input', return_value='y') class DbrestoreCommandRestoreBackupTest(TestCase): def setUp(self): self.command = DbrestoreCommand() self.command.stdout = DEV_NULL self.command.uncompress = False self.command.decrypt = False self.command.backup_extension = 'bak' self.command.filename = 'foofile' self.command.database = TEST_DATABASE self.command.passphrase = None self.command.interactive = True self.command.storage = get_storage() self.command.servername = HOSTNAME self.command.database_name = 'default' self.command.connector = get_connector('default') HANDLED_FILES.clean() def tearDown(self): clean_gpg_keys() def test_no_filename(self, *args): # Prepare backup HANDLED_FILES['written_files'].append( (utils.filename_generate('default'), File(get_dump()))) # Check self.command.path = None self.command.filename = None self.command._restore_backup() def test_no_backup_found(self, *args): self.command.path = None self.command.filename = None with self.assertRaises(CommandError): self.command._restore_backup() def test_uncompress(self, *args): self.command.path = None compressed_file, self.command.filename = utils.compress_file(get_dump(), get_dump_name()) HANDLED_FILES['written_files'].append( (self.command.filename, File(compressed_file)) ) self.command.uncompress = True self.command._restore_backup() @patch('dbbackup.utils.getpass', return_value=None) def test_decrypt(self, *args): self.command.path = None self.command.decrypt = True encrypted_file, self.command.filename = utils.encrypt_file(get_dump(), get_dump_name()) HANDLED_FILES['written_files'].append( (self.command.filename, File(encrypted_file)) ) self.command._restore_backup() def test_path(self, *args): temp_dump = get_dump() dump_path = mktemp() with open(dump_path, 'wb') as dump: copyfileobj(temp_dump, dump) self.command.path = dump.name self.command._restore_backup() self.command.decrypt = False self.command.filepath = get_dump_name() HANDLED_FILES['written_files'].append( (self.command.filepath, get_dump()) ) self.command._restore_backup() class DbrestoreCommandGetDatabaseTest(TestCase): def setUp(self): self.command = DbrestoreCommand() def test_give_db_name(self): name, db = self.command._get_database({'database': 'default'}) self.assertEqual(name, 'default') self.assertEqual(db, settings.DATABASES['default']) def test_no_given_db(self): name, db = self.command._get_database({}) self.assertEqual(name, 'default') self.assertEqual(db, settings.DATABASES['default']) @patch('django.conf.settings.DATABASES', {'db1': {}, 'db2': {}}) def test_no_given_db_multidb(self): with self.assertRaises(CommandError): self.command._get_database({}) @patch('dbbackup.management.commands._base.input', return_value='y') @patch('dbbackup.management.commands.dbrestore.get_connector', return_value=MongoDumpConnector()) 
@patch('dbbackup.db.mongodb.MongoDumpConnector.restore_dump') class DbMongoRestoreCommandRestoreBackupTest(TestCase): def setUp(self): self.command = DbrestoreCommand() self.command.stdout = DEV_NULL self.command.uncompress = False self.command.decrypt = False self.command.backup_extension = 'bak' self.command.path = None self.command.filename = 'foofile' self.command.database = TEST_MONGODB self.command.passphrase = None self.command.interactive = True self.command.storage = get_storage() self.command.connector = MongoDumpConnector() self.command.database_name = 'mongo' self.command.servername = HOSTNAME HANDLED_FILES.clean() add_private_gpg() def test_mongo_settings_backup_command(self, mock_runcommands, *args): self.command.storage.file_read = TARED_FILE self.command.filename = TARED_FILE HANDLED_FILES['written_files'].append((TARED_FILE, open(TARED_FILE, 'rb'))) self.command._restore_backup() self.assertTrue(mock_runcommands.called) django-dbbackup-3.3.0/dbbackup/tests/commands/test_dbbackup.py0000664000175000017500000000464313645400106024462 0ustar zuluzulu00000000000000""" Tests for dbbackup command. """ import os from mock import patch from django.test import TestCase from dbbackup.management.commands.dbbackup import Command as DbbackupCommand from dbbackup.db.base import get_connector from dbbackup.storage import get_storage from dbbackup.tests.utils import (TEST_DATABASE, add_public_gpg, clean_gpg_keys, DEV_NULL) @patch('dbbackup.settings.GPG_RECIPIENT', 'test@test') @patch('sys.stdout', DEV_NULL) class DbbackupCommandSaveNewBackupTest(TestCase): def setUp(self): self.command = DbbackupCommand() self.command.servername = 'foo-server' self.command.encrypt = False self.command.compress = False self.command.database = TEST_DATABASE['NAME'] self.command.storage = get_storage() self.command.connector = get_connector() self.command.stdout = DEV_NULL self.command.filename = None self.command.path = None def tearDown(self): clean_gpg_keys() def test_func(self): self.command._save_new_backup(TEST_DATABASE) def test_compress(self): self.command.compress = True self.command._save_new_backup(TEST_DATABASE) def test_encrypt(self): add_public_gpg() self.command.encrypt = True self.command._save_new_backup(TEST_DATABASE) def test_path(self): self.command.path = '/tmp/foo.bak' self.command._save_new_backup(TEST_DATABASE) self.assertTrue(os.path.exists(self.command.path)) # tearDown os.remove(self.command.path) @patch('dbbackup.settings.GPG_RECIPIENT', 'test@test') @patch('sys.stdout', DEV_NULL) @patch('dbbackup.db.sqlite.SqliteConnector.create_dump') @patch('dbbackup.utils.handle_size', returned_value=4.2) class DbbackupCommandSaveNewMongoBackupTest(TestCase): def setUp(self): self.command = DbbackupCommand() self.command.servername = 'foo-server' self.command.encrypt = False self.command.compress = False self.command.storage = get_storage() self.command.stdout = DEV_NULL self.command.filename = None self.command.path = None self.command.connector = get_connector('default') def tearDown(self): clean_gpg_keys() def test_func(self, mock_run_commands, mock_handle_size): self.command._save_new_backup(TEST_DATABASE) self.assertTrue(mock_run_commands.called) django-dbbackup-3.3.0/dbbackup/tests/commands/test_mediabackup.py0000664000175000017500000000511613645400106025150 0ustar zuluzulu00000000000000""" Tests for mediabackup command. 
""" import os import tempfile from django.test import TestCase from django.core.files.storage import get_storage_class from dbbackup.management.commands.mediabackup import Command as DbbackupCommand from dbbackup.storage import get_storage from dbbackup.tests.utils import DEV_NULL, HANDLED_FILES, add_public_gpg class MediabackupBackupMediafilesTest(TestCase): def setUp(self): HANDLED_FILES.clean() self.command = DbbackupCommand() self.command.servername = 'foo-server' self.command.storage = get_storage() self.command.stdout = DEV_NULL self.command.compress = False self.command.encrypt = False self.command.path = None self.command.media_storage = get_storage_class()() self.command.filename = None def tearDown(self): if self.command.path is not None: try: os.remove(self.command.path) except OSError: pass def test_func(self): self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES['written_files'])) def test_compress(self): self.command.compress = True self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES['written_files'])) self.assertTrue(HANDLED_FILES['written_files'][0][0].endswith('.gz')) def test_encrypt(self): self.command.encrypt = True add_public_gpg() self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES['written_files'])) outputfile = HANDLED_FILES['written_files'][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b'-----BEGIN PGP MESSAGE-----')) def test_compress_and_encrypt(self): self.command.compress = True self.command.encrypt = True add_public_gpg() self.command.backup_mediafiles() self.assertEqual(1, len(HANDLED_FILES['written_files'])) outputfile = HANDLED_FILES['written_files'][0][1] outputfile.seek(0) self.assertTrue(outputfile.read().startswith(b'-----BEGIN PGP MESSAGE-----')) def test_write_local_file(self): self.command.path = tempfile.mktemp() self.command.backup_mediafiles() self.assertTrue(os.path.exists(self.command.path)) self.assertEqual(0, len(HANDLED_FILES['written_files'])) def test_output_filename(self): self.command.filename = "my_new_name.tar" self.command.backup_mediafiles() self.assertEqual(HANDLED_FILES['written_files'][0][0], self.command.filename) django-dbbackup-3.3.0/dbbackup/tests/test_connectors/0000775000175000017500000000000013645400163022705 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/test_connectors/test_postgresql.py0000664000175000017500000003230113645400106026515 0ustar zuluzulu00000000000000from __future__ import unicode_literals from mock import patch, mock_open from django.test import TestCase from six import BytesIO from dbbackup.db.postgresql import (PgDumpConnector, PgDumpGisConnector, PgDumpBinaryConnector) @patch('dbbackup.db.postgresql.PgDumpConnector.run_command', return_value=(BytesIO(b'foo'), BytesIO())) class PgDumpConnectorTest(TestCase): def test_create_dump(self, mock_dump_cmd): connector = PgDumpConnector() dump = connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b'foo') # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_host(self, mock_dump_cmd): connector = PgDumpConnector() # Without connector.settings.pop('HOST', None) connector.create_dump() self.assertNotIn(' --host=', mock_dump_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector.create_dump() self.assertIn(' --host=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_port(self, mock_dump_cmd): connector = PgDumpConnector() # Without connector.settings.pop('PORT', 
None) connector.create_dump() self.assertNotIn(' --port=', mock_dump_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector.create_dump() self.assertIn(' --port=42', mock_dump_cmd.call_args[0][0]) def test_create_dump_user(self, mock_dump_cmd): connector = PgDumpConnector() # Without connector.settings.pop('USER', None) connector.create_dump() self.assertNotIn(' --username=', mock_dump_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.create_dump() self.assertIn(' --username=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_exclude(self, mock_dump_cmd): connector = PgDumpConnector() # Without connector.create_dump() self.assertNotIn(' --exclude-table=', mock_dump_cmd.call_args[0][0]) # With connector.exclude = ('foo',) connector.create_dump() self.assertIn(' --exclude-table=foo', mock_dump_cmd.call_args[0][0]) # With serveral connector.exclude = ('foo', 'bar') connector.create_dump() self.assertIn(' --exclude-table=foo', mock_dump_cmd.call_args[0][0]) self.assertIn(' --exclude-table=bar', mock_dump_cmd.call_args[0][0]) def test_create_dump_drop(self, mock_dump_cmd): connector = PgDumpConnector() # Without connector.drop = False connector.create_dump() self.assertNotIn(' --clean', mock_dump_cmd.call_args[0][0]) # With connector.drop = True connector.create_dump() self.assertIn(' --clean', mock_dump_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpConnector() dump = connector.create_dump() connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch('dbbackup.db.postgresql.PgDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_host(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('HOST', None) connector.restore_dump(dump) self.assertNotIn(' --host=foo', mock_restore_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector.restore_dump(dump) self.assertIn(' --host=foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_port(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('PORT', None) connector.restore_dump(dump) self.assertNotIn(' --port=', mock_restore_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector.restore_dump(dump) self.assertIn(' --port=42', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('USER', None) connector.restore_dump(dump) self.assertNotIn(' --username=', mock_restore_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.restore_dump(dump) self.assertIn(' --username=foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpBinaryConnector.run_command', return_value=(BytesIO(b'foo'), BytesIO())) class PgDumpBinaryConnectorTest(TestCase): def test_create_dump(self, mock_dump_cmd): connector = PgDumpBinaryConnector() dump = connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, 
b'foo') # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_host(self, mock_dump_cmd): connector = PgDumpBinaryConnector() # Without connector.settings.pop('HOST', None) connector.create_dump() self.assertNotIn(' --host=', mock_dump_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector.create_dump() self.assertIn(' --host=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_port(self, mock_dump_cmd): connector = PgDumpBinaryConnector() # Without connector.settings.pop('PORT', None) connector.create_dump() self.assertNotIn(' --port=', mock_dump_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector.create_dump() self.assertIn(' --port=42', mock_dump_cmd.call_args[0][0]) def test_create_dump_user(self, mock_dump_cmd): connector = PgDumpBinaryConnector() # Without connector.settings.pop('USER', None) connector.create_dump() self.assertNotIn(' --user=', mock_dump_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.create_dump() self.assertIn(' --user=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_exclude(self, mock_dump_cmd): connector = PgDumpBinaryConnector() # Without connector.create_dump() self.assertNotIn(' --exclude-table=', mock_dump_cmd.call_args[0][0]) # With connector.exclude = ('foo',) connector.create_dump() self.assertIn(' --exclude-table=foo', mock_dump_cmd.call_args[0][0]) # With serveral connector.exclude = ('foo', 'bar') connector.create_dump() self.assertIn(' --exclude-table=foo', mock_dump_cmd.call_args[0][0]) self.assertIn(' --exclude-table=bar', mock_dump_cmd.call_args[0][0]) def test_create_dump_drop(self, mock_dump_cmd): connector = PgDumpBinaryConnector() # Without connector.drop = False connector.create_dump() self.assertNotIn(' --clean', mock_dump_cmd.call_args[0][0]) # Binary drop at restore level connector.drop = True connector.create_dump() self.assertNotIn(' --clean', mock_dump_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpBinaryConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpBinaryConnector() dump = connector.create_dump() connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch('dbbackup.db.postgresql.PgDumpBinaryConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_host(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpBinaryConnector() dump = connector.create_dump() # Without connector.settings.pop('HOST', None) connector.restore_dump(dump) self.assertNotIn(' --host=foo', mock_restore_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector.restore_dump(dump) self.assertIn(' --host=foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpBinaryConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_port(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpBinaryConnector() dump = connector.create_dump() # Without connector.settings.pop('PORT', None) connector.restore_dump(dump) self.assertNotIn(' --port=', mock_restore_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector.restore_dump(dump) self.assertIn(' --port=42', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpBinaryConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpBinaryConnector() dump = connector.create_dump() # Without 
connector.settings.pop('USER', None) connector.restore_dump(dump) self.assertNotIn(' --user=', mock_restore_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.restore_dump(dump) self.assertIn(' --user=foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.postgresql.PgDumpGisConnector.run_command', return_value=(BytesIO(b'foo'), BytesIO())) class PgDumpGisConnectorTest(TestCase): @patch('dbbackup.db.postgresql.PgDumpGisConnector.run_command', return_value=(BytesIO(b'foo'), BytesIO())) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = PgDumpGisConnector() dump = connector.create_dump() # Without ADMINUSER connector.settings.pop('ADMIN_USER', None) connector.restore_dump(dump) self.assertTrue(mock_restore_cmd.called) # With connector.settings['ADMIN_USER'] = 'foo' connector.restore_dump(dump) self.assertTrue(mock_restore_cmd.called) def test_enable_postgis(self, mock_dump_cmd): connector = PgDumpGisConnector() connector.settings['ADMIN_USER'] = 'foo' connector._enable_postgis() self.assertIn('"CREATE EXTENSION IF NOT EXISTS postgis;"', mock_dump_cmd.call_args[0][0]) self.assertIn('--username=foo', mock_dump_cmd.call_args[0][0]) def test_enable_postgis_host(self, mock_dump_cmd): connector = PgDumpGisConnector() connector.settings['ADMIN_USER'] = 'foo' # Without connector.settings.pop('HOST', None) connector._enable_postgis() self.assertNotIn(' --host=', mock_dump_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector._enable_postgis() self.assertIn(' --host=foo', mock_dump_cmd.call_args[0][0]) def test_enable_postgis_port(self, mock_dump_cmd): connector = PgDumpGisConnector() connector.settings['ADMIN_USER'] = 'foo' # Without connector.settings.pop('PORT', None) connector._enable_postgis() self.assertNotIn(' --port=', mock_dump_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector._enable_postgis() self.assertIn(' --port=42', mock_dump_cmd.call_args[0][0]) @patch('dbbackup.db.base.Popen', **{ 'return_value.wait.return_value': True, 'return_value.poll.return_value': False, }) class PgDumpConnectorRunCommandTest(TestCase): def test_run_command(self, mock_popen): connector = PgDumpConnector() connector.create_dump() self.assertEqual(mock_popen.call_args[0][0][0], 'pg_dump') def test_run_command_with_password(self, mock_popen): connector = PgDumpConnector() connector.settings['PASSWORD'] = 'foo' connector.create_dump() self.assertEqual(mock_popen.call_args[0][0][0], 'pg_dump') self.assertIn('PGPASSWORD', mock_popen.call_args[1]['env']) self.assertEqual('foo', mock_popen.call_args[1]['env']['PGPASSWORD']) def test_run_command_with_password_and_other(self, mock_popen): connector = PgDumpConnector(env={'foo': 'bar'}) connector.settings['PASSWORD'] = 'foo' connector.create_dump() self.assertEqual(mock_popen.call_args[0][0][0], 'pg_dump') self.assertIn('foo', mock_popen.call_args[1]['env']) self.assertEqual('bar', mock_popen.call_args[1]['env']['foo']) self.assertIn('PGPASSWORD', mock_popen.call_args[1]['env']) self.assertEqual('foo', mock_popen.call_args[1]['env']['PGPASSWORD']) django-dbbackup-3.3.0/dbbackup/tests/test_connectors/test_mysql.py0000664000175000017500000001324613645400106025466 0ustar zuluzulu00000000000000from __future__ import unicode_literals from mock import patch from django.test import TestCase from six import BytesIO from dbbackup.db.mysql import MysqlDumpConnector @patch('dbbackup.db.mysql.MysqlDumpConnector.run_command', return_value=(BytesIO(b'foo'), BytesIO())) class 
MysqlDumpConnectorTest(TestCase): def test_create_dump(self, mock_dump_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b'foo') # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_host(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop('HOST', None) connector.create_dump() self.assertNotIn(' --host=', mock_dump_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector.create_dump() self.assertIn(' --host=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_port(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop('PORT', None) connector.create_dump() self.assertNotIn(' --port=', mock_dump_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector.create_dump() self.assertIn(' --port=42', mock_dump_cmd.call_args[0][0]) def test_create_dump_user(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop('USER', None) connector.create_dump() self.assertNotIn(' --user=', mock_dump_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.create_dump() self.assertIn(' --user=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_password(self, mock_dump_cmd): connector = MysqlDumpConnector() # Without connector.settings.pop('PASSWORD', None) connector.create_dump() self.assertNotIn(' --password=', mock_dump_cmd.call_args[0][0]) # With connector.settings['PASSWORD'] = 'foo' connector.create_dump() self.assertIn(' --password=foo', mock_dump_cmd.call_args[0][0]) def test_create_dump_exclude(self, mock_dump_cmd): connector = MysqlDumpConnector() connector.settings['NAME'] = 'db' # Without connector.create_dump() self.assertNotIn(' --ignore-table=', mock_dump_cmd.call_args[0][0]) # With connector.exclude = ('foo',) connector.create_dump() self.assertIn(' --ignore-table=db.foo', mock_dump_cmd.call_args[0][0]) # With serveral connector.exclude = ('foo', 'bar') connector.create_dump() self.assertIn(' --ignore-table=db.foo', mock_dump_cmd.call_args[0][0]) self.assertIn(' --ignore-table=db.bar', mock_dump_cmd.call_args[0][0]) @patch('dbbackup.db.mysql.MysqlDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch('dbbackup.db.mysql.MysqlDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_host(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('HOST', None) connector.restore_dump(dump) self.assertNotIn(' --host=foo', mock_restore_cmd.call_args[0][0]) # With connector.settings['HOST'] = 'foo' connector.restore_dump(dump) self.assertIn(' --host=foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.mysql.MysqlDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_port(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('PORT', None) connector.restore_dump(dump) self.assertNotIn(' --port=', mock_restore_cmd.call_args[0][0]) # With connector.settings['PORT'] = 42 connector.restore_dump(dump) self.assertIn(' --port=42', mock_restore_cmd.call_args[0][0]) 
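    # Illustrative sketch of the flag-toggle pattern exercised by the
    # surrounding tests; the setting values below are examples only:
    #
    #     connector = MysqlDumpConnector()
    #     connector.settings['USER'] = 'foo'
    #     connector.create_dump()
    #     # the generated command should now contain ' --user=foo'
    #     assert ' --user=foo' in mock_dump_cmd.call_args[0][0]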
@patch('dbbackup.db.mysql.MysqlDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('USER', None) connector.restore_dump(dump) self.assertNotIn(' --user=', mock_restore_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.restore_dump(dump) self.assertIn(' --user=foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.mysql.MysqlDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_password(self, mock_dump_cmd, mock_restore_cmd): connector = MysqlDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('PASSWORD', None) connector.restore_dump(dump) self.assertNotIn(' --password=', mock_restore_cmd.call_args[0][0]) # With connector.settings['PASSWORD'] = 'foo' connector.restore_dump(dump) self.assertIn(' --password=foo', mock_restore_cmd.call_args[0][0]) django-dbbackup-3.3.0/dbbackup/tests/test_connectors/test_sqlite.py0000664000175000017500000000301113645400106025607 0ustar zuluzulu00000000000000from __future__ import unicode_literals from mock import patch, mock_open from django.test import TestCase from six import BytesIO from dbbackup.db.sqlite import SqliteConnector, SqliteCPConnector from dbbackup.tests.testapp.models import CharModel class SqliteConnectorTest(TestCase): def test_write_dump(self): dump_file = BytesIO() connector = SqliteConnector() connector._write_dump(dump_file) dump_file.seek(0) for line in dump_file: self.assertTrue(line.strip().endswith(b';')) def test_create_dump(self): connector = SqliteConnector() dump = connector.create_dump() self.assertTrue(dump.read()) def test_create_dump_with_unicode(self): CharModel.objects.create(field='\xe9') connector = SqliteConnector() dump = connector.create_dump() self.assertTrue(dump.read()) def test_restore_dump(self): connector = SqliteConnector() dump = connector.create_dump() connector.restore_dump(dump) @patch('dbbackup.db.sqlite.open', mock_open(read_data=b'foo'), create=True) class SqliteCPConnectorTest(TestCase): def test_create_dump(self): connector = SqliteCPConnector() dump = connector.create_dump() dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b'foo') def test_restore_dump(self): connector = SqliteCPConnector() dump = connector.create_dump() connector.restore_dump(dump) django-dbbackup-3.3.0/dbbackup/tests/test_connectors/test_mongodb.py0000664000175000017500000001027013645400106025740 0ustar zuluzulu00000000000000from __future__ import unicode_literals from mock import patch from django.test import TestCase from six import BytesIO from dbbackup.db.mongodb import MongoDumpConnector @patch('dbbackup.db.mongodb.MongoDumpConnector.run_command', return_value=(BytesIO(b'foo'), BytesIO())) class MongoDumpConnectorTest(TestCase): def test_create_dump(self, mock_dump_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Test dump dump_content = dump.read() self.assertTrue(dump_content) self.assertEqual(dump_content, b'foo') # Test cmd self.assertTrue(mock_dump_cmd.called) def test_create_dump_user(self, mock_dump_cmd): connector = MongoDumpConnector() # Without connector.settings.pop('USER', None) connector.create_dump() self.assertNotIn(' --user ', mock_dump_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.create_dump() self.assertIn(' --username foo', mock_dump_cmd.call_args[0][0]) def 
test_create_dump_password(self, mock_dump_cmd): connector = MongoDumpConnector() # Without connector.settings.pop('PASSWORD', None) connector.create_dump() self.assertNotIn(' --password ', mock_dump_cmd.call_args[0][0]) # With connector.settings['PASSWORD'] = 'foo' connector.create_dump() self.assertIn(' --password foo', mock_dump_cmd.call_args[0][0]) @patch('dbbackup.db.mongodb.MongoDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() connector.restore_dump(dump) # Test cmd self.assertTrue(mock_restore_cmd.called) @patch('dbbackup.db.mongodb.MongoDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_user(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('USER', None) connector.restore_dump(dump) self.assertNotIn(' --username ', mock_restore_cmd.call_args[0][0]) # With connector.settings['USER'] = 'foo' connector.restore_dump(dump) self.assertIn(' --username foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.mongodb.MongoDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_password(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.settings.pop('PASSWORD', None) connector.restore_dump(dump) self.assertNotIn(' --password ', mock_restore_cmd.call_args[0][0]) # With connector.settings['PASSWORD'] = 'foo' connector.restore_dump(dump) self.assertIn(' --password foo', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.mongodb.MongoDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_object_check(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.object_check = False connector.restore_dump(dump) self.assertNotIn('--objcheck', mock_restore_cmd.call_args[0][0]) # With connector.object_check = True connector.restore_dump(dump) self.assertIn(' --objcheck', mock_restore_cmd.call_args[0][0]) @patch('dbbackup.db.mongodb.MongoDumpConnector.run_command', return_value=(BytesIO(), BytesIO())) def test_restore_dump_drop(self, mock_dump_cmd, mock_restore_cmd): connector = MongoDumpConnector() dump = connector.create_dump() # Without connector.drop = False connector.restore_dump(dump) self.assertNotIn('--drop', mock_restore_cmd.call_args[0][0]) # With connector.drop = True connector.restore_dump(dump) self.assertIn(' --drop', mock_restore_cmd.call_args[0][0]) django-dbbackup-3.3.0/dbbackup/tests/test_connectors/test_base.py0000664000175000017500000000641013645400106025226 0ustar zuluzulu00000000000000from __future__ import unicode_literals import os from tempfile import SpooledTemporaryFile from django.test import TestCase from dbbackup.db.base import get_connector, BaseDBConnector, BaseCommandDBConnector from dbbackup.db import exceptions class GetConnectorTest(TestCase): def test_get_connector(self): connector = get_connector() self.assertIsInstance(connector, BaseDBConnector) class BaseDBConnectorTest(TestCase): def test_init(self): connector = BaseDBConnector() def test_settings(self): connector = BaseDBConnector() connector.settings def test_generate_filename(self): connector = BaseDBConnector() filename = connector.generate_filename() class BaseCommandDBConnectorTest(TestCase): def test_run_command(self): connector = 
BaseCommandDBConnector() stdout, stderr = connector.run_command('echo 123') self.assertEqual(stdout.read(), b'123\n') self.assertEqual(stderr.read(), b'') def test_run_command_error(self): connector = BaseCommandDBConnector() with self.assertRaises(exceptions.CommandConnectorError): connector.run_command('echa 123') def test_run_command_stdin(self): connector = BaseCommandDBConnector() stdin = SpooledTemporaryFile() stdin.write(b'foo') stdin.seek(0) # Run stdout, stderr = connector.run_command('cat', stdin=stdin) self.assertEqual(stdout.read(), b'foo') self.assertFalse(stderr.read()) def test_run_command_with_env(self): connector = BaseCommandDBConnector() # Empty env stdout, stderr = connector.run_command('env') self.assertTrue(stdout.read()) # env from self.env connector.env = {'foo': 'bar'} stdout, stderr = connector.run_command('env') self.assertIn(b'foo=bar\n', stdout.read()) # method overide gloabal env stdout, stderr = connector.run_command('env', env={'foo': 'ham'}) self.assertIn(b'foo=ham\n', stdout.read()) # get a var from parent env os.environ['bar'] = 'foo' stdout, stderr = connector.run_command('env') self.assertIn(b'bar=foo\n', stdout.read()) # Conf overides parendt env connector.env = {'bar': 'bar'} stdout, stderr = connector.run_command('env') self.assertIn(b'bar=bar\n', stdout.read()) # method overides all stdout, stderr = connector.run_command('env', env={'bar': 'ham'}) self.assertIn(b'bar=ham\n', stdout.read()) def test_run_command_with_parent_env(self): connector = BaseCommandDBConnector(use_parent_env=False) # Empty env stdout, stderr = connector.run_command('env') self.assertFalse(stdout.read()) # env from self.env connector.env = {'foo': 'bar'} stdout, stderr = connector.run_command('env') self.assertEqual(stdout.read(), b'foo=bar\n') # method overide gloabal env stdout, stderr = connector.run_command('env', env={'foo': 'ham'}) self.assertEqual(stdout.read(), b'foo=ham\n') # no var from parent env os.environ['bar'] = 'foo' stdout, stderr = connector.run_command('env') self.assertNotIn(b'bar=foo\n', stdout.read()) django-dbbackup-3.3.0/dbbackup/tests/test_connectors/__init__.py0000664000175000017500000000000013645400106025001 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/tests/utils.py0000664000175000017500000000700613645400106021203 0ustar zuluzulu00000000000000import os import subprocess import six from django.conf import settings from django.utils import timezone from django.core.files import File from django.core.files.storage import Storage from dbbackup.db.base import get_connector BASE_FILE = os.path.join(settings.BLOB_DIR, 'test.txt') ENCRYPTED_FILE = os.path.join(settings.BLOB_DIR, 'test.txt.gpg') COMPRESSED_FILE = os.path.join(settings.BLOB_DIR, 'test.txt.gz') TARED_FILE = os.path.join(settings.BLOB_DIR, 'test.txt.tar') ENCRYPTED_COMPRESSED_FILE = os.path.join(settings.BLOB_DIR, 'test.txt.gz.gpg') TEST_DATABASE = {'ENGINE': 'django.db.backends.sqlite3', 'NAME': '/tmp/foo.db', 'USER': 'foo', 'PASSWORD': 'bar', 'HOST': 'foo', 'PORT': 122} TEST_MONGODB = {'ENGINE': 'django_mongodb_engine', 'NAME': 'mongo_test', 'USER': 'foo', 'PASSWORD': 'bar', 'HOST': 'foo', 'PORT': 122} TEST_DATABASE = settings.DATABASES['default'] GPG_PRIVATE_PATH = os.path.join(settings.BLOB_DIR, 'gpg/secring.gpg') GPG_PUBLIC_PATH = os.path.join(settings.BLOB_DIR, 'gpg/pubring.gpg') GPG_FINGERPRINT = '7438 8D4E 02AF C011 4E2F 1E79 F7D1 BBF0 1F63 FDE9' DEV_NULL = open(os.devnull, 'w') class handled_files(dict): """ Dict for gather information about fake storage and clean 
between tests. You should use the constant instance ``HANDLED_FILES`` and clean it before tests. """ def __init__(self): super(handled_files, self).__init__() self.clean() def clean(self): self['written_files'] = [] self['deleted_files'] = [] HANDLED_FILES = handled_files() class FakeStorage(Storage): name = 'FakeStorage' def exists(self, name): return name in HANDLED_FILES['written_files'] def get_available_name(self, name, max_length=None): return name[:max_length] def get_valid_name(self, name): return name def listdir(self, path): return ([], [f[0] for f in HANDLED_FILES['written_files']]) def accessed_time(self, name): return timezone.now() created_time = modified_time = accessed_time def _open(self, name, mode='rb'): file_ = [f[1] for f in HANDLED_FILES['written_files'] if f[0] == name][0] file_.seek(0) return file_ def _save(self, name, content): HANDLED_FILES['written_files'].append((name, File(content))) return name def delete(self, name): HANDLED_FILES['deleted_files'].append(name) def clean_gpg_keys(): try: cmd = ("gpg --batch --yes --delete-key '%s'" % GPG_FINGERPRINT) subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) except: pass try: cmd = ("gpg --batch --yes --delete-secrect-key '%s'" % GPG_FINGERPRINT) subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) except: pass def add_private_gpg(): cmd = ('gpg --import %s' % GPG_PRIVATE_PATH).split() subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) def add_public_gpg(): cmd = ('gpg --import %s' % GPG_PUBLIC_PATH).split() subprocess.call(cmd, stdout=DEV_NULL, stderr=DEV_NULL) def skip_py3(testcase, reason="Not in Python 3"): """Decorator for skip Python 3 tests.""" if six.PY3: setup = lambda s: s.skipTest(reason) testcase.setUp = setup return testcase def callable_for_filename_template(datetime, **kwargs): return '%s_foo' % datetime def get_dump(database=TEST_DATABASE): return get_connector().create_dump() def get_dump_name(database=None): database = database or TEST_DATABASE return get_connector().generate_filename() django-dbbackup-3.3.0/dbbackup/tests/__init__.py0000664000175000017500000000000013645400106021565 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/storage.py0000664000175000017500000002143513645400106020347 0ustar zuluzulu00000000000000""" Utils for handle files. """ import logging from django.core.exceptions import ImproperlyConfigured from django.core.files.storage import get_storage_class from . import settings, utils def get_storage(path=None, options=None): """ Get the specified storage configured with options. :param path: Path in Python dot style to module containing the storage class. If empty settings.DBBACKUP_STORAGE will be used. :type path: ``str`` :param options: Parameters for configure the storage, if empty settings.DBBACKUP_STORAGE_OPTIONS will be used. :type options: ``dict`` :return: Storage configured :rtype: :class:`.Storage` """ path = path or settings.STORAGE options = options or settings.STORAGE_OPTIONS if not path: raise ImproperlyConfigured('You must specify a storage class using ' 'DBBACKUP_STORAGE settings.') return Storage(path, **options) class StorageError(Exception): pass class FileNotFound(StorageError): pass class Storage(object): """ This object make high-level storage operations for upload/download or list and filter files. It uses a Django storage object for low-level operations. 
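    Minimal illustrative usage (the file name below is only an example, and
    ``DBBACKUP_STORAGE``/``DBBACKUP_STORAGE_OPTIONS`` are assumed to be
    configured)::

        storage = get_storage()
        with open('/tmp/foo.dump', 'rb') as fd:    # example path
            storage.write_file(fd, 'foo.dump')
        print(storage.list_backups(content_type='db'))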
""" @property def logger(self): if not hasattr(self, '_logger'): self._logger = logging.getLogger('dbbackup.storage') return self._logger def __init__(self, storage_path=None, **options): """ Initialize a Django Storage instance with given options. :param storage_path: Path to a Django Storage class with dot style If ``None``, ``settings.DBBACKUP_STORAGE`` will be used. :type storage_path: str """ self._storage_path = storage_path or settings.STORAGE options = options.copy() options.update(settings.STORAGE_OPTIONS) options = dict([(key.lower(), value) for key, value in options.items()]) self.storageCls = get_storage_class(self._storage_path) self.storage = self.storageCls(**options) self.name = self.storageCls.__name__ def __str__(self): return 'dbbackup-%s' % self.storage.__str__() def delete_file(self, filepath): self.logger.debug('Deleting file %s', filepath) self.storage.delete(name=filepath) def list_directory(self, path=''): return self.storage.listdir(path)[1] def write_file(self, filehandle, filename): self.logger.debug('Writing file %s', filename) self.storage.save(name=filename, content=filehandle) def read_file(self, filepath): self.logger.debug('Reading file %s', filepath) file_ = self.storage.open(name=filepath, mode='rb') if not getattr(file_, 'name', None): file_.name = filepath return file_ def list_backups(self, encrypted=None, compressed=None, content_type=None, database=None, servername=None): """ List stored files except given filter. If filter is None, it won't be used. ``content_type`` must be ``'db'`` for database backups or ``'media'`` for media backups. :param encrypted: Filter by encrypted or not :type encrypted: ``bool`` or ``None`` :param compressed: Filter by compressed or not :type compressed: ``bool`` or ``None`` :param content_type: Filter by media or database backup, must be ``'db'`` or ``'media'`` :type content_type: ``str`` or ``None`` :param database: Filter by source database's name :type: ``str`` or ``None`` :param servername: Filter by source server's name :type: ``str`` or ``None`` :returns: List of files :rtype: ``list`` of ``str`` """ if content_type not in ('db', 'media', None): msg = "Bad content_type %s, must be 'db', 'media', or None" % ( content_type) raise TypeError(msg) # TODO: Make better filter for include only backups files = [f for f in self.list_directory() if utils.filename_to_datestring(f)] if encrypted is not None: files = [f for f in files if ('.gpg' in f) == encrypted] if compressed is not None: files = [f for f in files if ('.gz' in f) == compressed] if content_type == 'media': files = [f for f in files if '.tar' in f] elif content_type == 'db': files = [f for f in files if '.tar' not in f] if database: files = [f for f in files if database in f] if servername: files = [f for f in files if servername in f] return files def get_latest_backup(self, encrypted=None, compressed=None, content_type=None, database=None, servername=None): """ Return the latest backup file name. 
        :param encrypted: Filter by encrypted or not
        :type encrypted: ``bool`` or ``None``
        :param compressed: Filter by compressed or not
        :type compressed: ``bool`` or ``None``
        :param content_type: Filter by media or database backup, must be
                             ``'db'`` or ``'media'``
        :type content_type: ``str`` or ``None``
        :param database: Filter by source database's name
        :type database: ``str`` or ``None``
        :param servername: Filter by source server's name
        :type servername: ``str`` or ``None``

        :returns: Most recent file
        :rtype: ``str``

        :raises FileNotFound: If no backup file is found
        """
        files = self.list_backups(encrypted=encrypted, compressed=compressed,
                                  content_type=content_type, database=database,
                                  servername=servername)
        if not files:
            raise FileNotFound("There's no backup file available.")
        return max(files, key=utils.filename_to_date)

    def get_older_backup(self, encrypted=None, compressed=None,
                         content_type=None, database=None, servername=None):
        """
        Return the oldest backup's file name.

        :param encrypted: Filter by encrypted or not
        :type encrypted: ``bool`` or ``None``
        :param compressed: Filter by compressed or not
        :type compressed: ``bool`` or ``None``
        :param content_type: Filter by media or database backup, must be
                             ``'db'`` or ``'media'``
        :type content_type: ``str`` or ``None``
        :param database: Filter by source database's name
        :type database: ``str`` or ``None``
        :param servername: Filter by source server's name
        :type servername: ``str`` or ``None``

        :returns: Oldest file
        :rtype: ``str``

        :raises FileNotFound: If no backup file is found
        """
        files = self.list_backups(encrypted=encrypted, compressed=compressed,
                                  content_type=content_type, database=database,
                                  servername=servername)
        if not files:
            raise FileNotFound("There's no backup file available.")
        return min(files, key=utils.filename_to_date)

    def clean_old_backups(self, encrypted=None, compressed=None,
                          content_type=None, database=None, servername=None,
                          keep_number=None):
        """
        Delete older backups, keeping only the defined number of the most
        recent files.
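        For instance, keeping a single backup (file names are examples; files
        matched by ``settings.CLEANUP_KEEP_FILTER`` are never deleted)::

            storage.clean_old_backups(content_type='db', keep_number=1)
            # only the most recent dated backup remains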
:param encrypted: Filter by encrypted or not :type encrypted: ``bool`` or ``None`` :param compressed: Filter by compressed or not :type compressed: ``bool`` or ``None`` :param content_type: Filter by media or database backup, must be ``'db'`` or ``'media'`` :type content_type: ``str`` or ``None`` :param database: Filter by source database's name :type: ``str`` or ``None`` :param servername: Filter by source server's name :type: ``str`` or ``None`` :param keep_number: Number of files to keep, other will be deleted :type keep_number: ``int`` or ``None`` """ if keep_number is None: keep_number = settings.CLEANUP_KEEP if content_type == 'db' \ else settings.CLEANUP_KEEP_MEDIA keep_filter = settings.CLEANUP_KEEP_FILTER files = self.list_backups(encrypted=encrypted, compressed=compressed, content_type=content_type, database=database, servername=servername) files = sorted(files, key=utils.filename_to_date, reverse=True) files_to_delete = [fi for i, fi in enumerate(files) if i >= keep_number] for filename in files_to_delete: if keep_filter(filename): continue self.delete_file(filename) django-dbbackup-3.3.0/dbbackup/log.py0000664000175000017500000000321713645400106017462 0ustar zuluzulu00000000000000import logging import django from django.utils.log import AdminEmailHandler DEFAULT_LOGGING = { 'version': 1, 'disable_existing_loggers': False, 'handlers': { 'dbbackup.console': { 'formatter': 'base', 'level': 'DEBUG', 'class': 'logging.StreamHandler', }, 'dbbackup.mail_admins': { 'level': 'ERROR', 'class': 'dbbackup.log.DbbackupAdminEmailHandler', 'filters': ['require_dbbackup_mail_enabled'], 'include_html': True, } }, 'filters': { 'require_dbbackup_mail_enabled': { '()': 'dbbackup.log.MailEnabledFilter' } }, 'formatters': { 'base': {'format': '%(message)s'}, 'simple': {'format': '%(levelname)s %(message)s'} }, 'loggers': { 'dbbackup': { 'handlers': [ 'dbbackup.mail_admins', 'dbbackup.console' ], 'level': 'INFO' }, } } class DbbackupAdminEmailHandler(AdminEmailHandler): def emit(self, record): # Monkey patch for old Django versions without send_mail method if django.VERSION < (1, 8): from . import utils django.core.mail.mail_admins = utils.mail_admins super(DbbackupAdminEmailHandler, self).emit(record) def send_mail(self, subject, message, *args, **kwargs): from . import utils utils.mail_admins(subject, message, *args, connection=self.connection(), **kwargs) class MailEnabledFilter(logging.Filter): def filter(self, record): from .settings import SEND_EMAIL return SEND_EMAIL django-dbbackup-3.3.0/dbbackup/utils.py0000664000175000017500000003046613645400106020047 0ustar zuluzulu00000000000000""" Utility functions for dbbackup. """ from __future__ import absolute_import, division, print_function, unicode_literals import sys import os import traceback import tempfile import gzip import re import logging from getpass import getpass from shutil import copyfileobj from functools import wraps from datetime import datetime import six from django.core.mail import EmailMultiAlternatives from django.db import connection from django.http import HttpRequest from django.utils import timezone try: from pipes import quote except ImportError: from shlex import quote from . 
import settings input = raw_input if six.PY2 else input # noqa FAKE_HTTP_REQUEST = HttpRequest() FAKE_HTTP_REQUEST.META['SERVER_NAME'] = '' FAKE_HTTP_REQUEST.META['SERVER_PORT'] = '' FAKE_HTTP_REQUEST.META['HTTP_HOST'] = settings.HOSTNAME FAKE_HTTP_REQUEST.path = '/DJANGO-DBBACKUP-EXCEPTION' BYTES = ( ('PiB', 1125899906842624.0), ('TiB', 1099511627776.0), ('GiB', 1073741824.0), ('MiB', 1048576.0), ('KiB', 1024.0), ('B', 1.0) ) REG_FILENAME_CLEAN = re.compile(r'-+') class EncryptionError(Exception): pass class DecryptionError(Exception): pass def bytes_to_str(byteVal, decimals=1): """ Convert bytes to a human readable string. :param byteVal: Value to convert in bytes :type byteVal: int or float :param decimal: Number of decimal to display :type decimal: int :returns: Number of byte with the best unit of measure :rtype: str """ for unit, byte in BYTES: if (byteVal >= byte): if decimals == 0: return '%s %s' % (int(round(byteVal / byte, 0)), unit) return '%s %s' % (round(byteVal / byte, decimals), unit) return '%s B' % byteVal def handle_size(filehandle): """ Get file's size to a human readable string. :param filehandle: File to handle :type filehandle: file :returns: File's size with the best unit of measure :rtype: str """ filehandle.seek(0, 2) return bytes_to_str(filehandle.tell()) def mail_admins(subject, message, fail_silently=False, connection=None, html_message=None): """Sends a message to the admins, as defined by the DBBACKUP_ADMINS setting.""" if not settings.ADMINS: return mail = EmailMultiAlternatives('%s%s' % (settings.EMAIL_SUBJECT_PREFIX, subject), message, settings.SERVER_EMAIL, [a[1] for a in settings.ADMINS], connection=connection) if html_message: mail.attach_alternative(html_message, 'text/html') mail.send(fail_silently=fail_silently) def email_uncaught_exception(func): """ Function decorator for send email with uncaught exceptions to admins. Email is sent to ``settings.DBBACKUP_FAILURE_RECIPIENTS`` (``settings.ADMINS`` if not defined). The message contains a traceback of error. """ @wraps(func) def wrapper(*args, **kwargs): try: func(*args, **kwargs) except Exception: logger = logging.getLogger('dbbackup') exc_type, exc_value, tb = sys.exc_info() tb_str = ''.join(traceback.format_tb(tb)) msg = '%s: %s\n%s' % (exc_type.__name__, exc_value, tb_str) logger.error(msg) raise finally: connection.close() return wrapper def create_spooled_temporary_file(filepath=None, fileobj=None): """ Create a spooled temporary file. if ``filepath`` or ``fileobj`` is defined its content will be copied into temporary file. :param filepath: Path of input file :type filepath: str :param fileobj: Input file object :type fileobj: file :returns: Spooled temporary file :rtype: :class:`tempfile.SpooledTemporaryFile` """ spooled_file = tempfile.SpooledTemporaryFile( max_size=settings.TMP_FILE_MAX_SIZE, dir=settings.TMP_DIR) if filepath: fileobj = open(filepath, 'r+b') if fileobj is not None: fileobj.seek(0) copyfileobj(fileobj, spooled_file, settings.TMP_FILE_READ_SIZE) return spooled_file def encrypt_file(inputfile, filename): """ Encrypt input file using GPG and remove .gpg extension to its name. 
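    For example (``settings.GPG_RECIPIENT`` must point at an imported public
    key; the file name is only an example)::

        encrypted, name = encrypt_file(open('/tmp/foo.dump', 'rb'), 'foo.dump')
        # name == 'foo.dump.gpg'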
:param inputfile: File to encrypt :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :returns: Tuple with file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ import gnupg tempdir = tempfile.mkdtemp(dir=settings.TMP_DIR) try: filename = '%s.gpg' % filename filepath = os.path.join(tempdir, filename) try: inputfile.seek(0) always_trust = settings.GPG_ALWAYS_TRUST g = gnupg.GPG() result = g.encrypt_file(inputfile, output=filepath, recipients=settings.GPG_RECIPIENT, always_trust=always_trust) inputfile.close() if not result: msg = 'Encryption failed; status: %s' % result.status raise EncryptionError(msg) return create_spooled_temporary_file(filepath), filename finally: if os.path.exists(filepath): os.remove(filepath) finally: os.rmdir(tempdir) def unencrypt_file(inputfile, filename, passphrase=None): """ Unencrypt input file using GPG and remove .gpg extension to its name. :param inputfile: File to encrypt :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :param passphrase: Passphrase of GPG key, if equivalent to False, it will be asked to user. If user answer an empty pass, no passphrase will be used. :type passphrase: ``str`` or ``None`` :returns: Tuple with file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ import gnupg def get_passphrase(passphrase=passphrase): return passphrase or getpass('Input Passphrase: ') or None temp_dir = tempfile.mkdtemp(dir=settings.TMP_DIR) try: new_basename = os.path.basename(filename).replace('.gpg', '') temp_filename = os.path.join(temp_dir, new_basename) try: inputfile.seek(0) g = gnupg.GPG() result = g.decrypt_file(file=inputfile, passphrase=get_passphrase(), output=temp_filename) if not result: raise DecryptionError('Decryption failed; status: %s' % result.status) outputfile = create_spooled_temporary_file(temp_filename) finally: if os.path.exists(temp_filename): os.remove(temp_filename) finally: os.rmdir(temp_dir) return outputfile, new_basename def compress_file(inputfile, filename): """ Compress input file using gzip and change its name. :param inputfile: File to compress :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :returns: Tuple with compressed file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ outputfile = create_spooled_temporary_file() new_filename = filename + '.gz' zipfile = gzip.GzipFile(filename=filename, fileobj=outputfile, mode="wb") try: inputfile.seek(0) copyfileobj(inputfile, zipfile, settings.TMP_FILE_READ_SIZE) finally: zipfile.close() return outputfile, new_filename def uncompress_file(inputfile, filename): """ Uncompress this file using gzip and change its name. :param inputfile: File to compress :type inputfile: ``file`` like object :param filename: File's name :type filename: ``str`` :returns: Tuple with file and new file's name :rtype: :class:`tempfile.SpooledTemporaryFile`, ``str`` """ zipfile = gzip.GzipFile(fileobj=inputfile, mode="rb") try: outputfile = create_spooled_temporary_file(fileobj=zipfile) finally: zipfile.close() new_basename = os.path.basename(filename).replace('.gz', '') return outputfile, new_basename def timestamp(value): """ Return the timestamp of a datetime.datetime object. 
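    For example, with a ``DATE_FORMAT`` such as ``'%Y-%m-%d-%H%M%S'`` (the
    pattern used by the backup file names in this project)::

        timestamp(timezone.now())   # e.g. '2015-02-08-042810'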
:param value: a datetime object :type value: datetime.datetime :return: the timestamp :rtype: str """ value = value if timezone.is_naive(value) else timezone.localtime(value) return value.strftime(settings.DATE_FORMAT) def filename_details(filepath): # TODO: What was this function made for ? return '' PATTERN_MATCHNG = ( ('%a', r'[A-Z][a-z]+'), ('%A', r'[A-Z][a-z]+'), ('%w', r'\d'), ('%d', r'\d{2}'), ('%b', r'[A-Z][a-z]+'), ('%B', r'[A-Z][a-z]+'), ('%m', r'\d{2}'), ('%y', r'\d{2}'), ('%Y', r'\d{4}'), ('%H', r'\d{2}'), ('%I', r'\d{2}'), # ('%p', r'(?AM|PM|am|pm)'), ('%M', r'\d{2}'), ('%S', r'\d{2}'), ('%f', r'\d{6}'), # ('%z', r'\+\d{4}'), # ('%Z', r'(?|UTC|EST|CST)'), ('%j', r'\d{3}'), ('%U', r'\d{2}'), ('%W', r'\d{2}'), # ('%c', r'[A-Z][a-z]+ [A-Z][a-z]{2} \d{2} \d{2}:\d{2}:\d{2} \d{4}'), # ('%x', r'd{2}/d{2}/d{4}'), # ('%X', r'd{2}:d{2}:d{2}'), # ('%%', r'%'), ) def datefmt_to_regex(datefmt): """ Convert a strftime format string to a regex. :param datefmt: strftime format string :type datefmt: ``str`` :returns: Equivalent regex :rtype: ``re.compite`` """ new_string = datefmt for pat, reg in PATTERN_MATCHNG: new_string = new_string.replace(pat, reg) return re.compile(r'(%s)' % new_string) def filename_to_datestring(filename, datefmt=None): """ Return the date part of a file name. :param datefmt: strftime format string, ``settings.DATE_FORMAT`` is used if is ``None`` :type datefmt: ``str`` or ``None`` :returns: Date part or nothing if not found :rtype: ``str`` or ``NoneType`` """ datefmt = datefmt or settings.DATE_FORMAT regex = datefmt_to_regex(datefmt) search = regex.search(filename) if search: return search.groups()[0] def filename_to_date(filename, datefmt=None): """ Return a datetime from a file name. :param datefmt: strftime format string, ``settings.DATE_FORMAT`` is used if is ``None`` :type datefmt: ``str`` or ``NoneType`` :returns: Date guessed or nothing if no date found :rtype: ``datetime.datetime`` or ``NoneType`` """ datefmt = datefmt or settings.DATE_FORMAT datestring = filename_to_datestring(filename, datefmt) if datestring is not None: return datetime.strptime(datestring, datefmt) def filename_generate(extension, database_name='', servername=None, content_type='db', wildcard=None): """ Create a new backup filename. :param extension: Extension of backup file :type extension: ``str`` :param database_name: If it is database backup specify its name :type database_name: ``str`` :param servername: Specify server name or by default ``settings.DBBACKUP_HOSTNAME`` :type servername: ``str`` :param content_type: Content type to backup, ``'media'`` or ``'db'`` :type content_type: ``str`` :param wildcard: Replace datetime with this wilecard regex :type content_type: ``str`` :returns: Computed file name :rtype: ``str` """ if content_type == 'db': if '/' in database_name: database_name = os.path.basename(database_name) if '.' 
in database_name: database_name = database_name.split('.')[0] template = settings.FILENAME_TEMPLATE elif content_type == 'media': template = settings.MEDIA_FILENAME_TEMPLATE else: template = settings.FILENAME_TEMPLATE params = { 'servername': servername or settings.HOSTNAME, 'datetime': wildcard or datetime.now().strftime(settings.DATE_FORMAT), 'databasename': database_name, 'extension': extension, 'content_type': content_type } if callable(template): filename = template(**params) else: filename = template.format(**params) filename = REG_FILENAME_CLEAN.sub('-', filename) filename = filename[1:] if filename.startswith('-') else filename return filename def get_escaped_command_arg(arg): return quote(arg) django-dbbackup-3.3.0/dbbackup/__init__.py0000664000175000017500000000051013645400150020430 0ustar zuluzulu00000000000000"Management commands to help backup and restore a project database and media" VERSION = (3, 3, 0) __version__ = '.'.join([str(i) for i in VERSION]) __author__ = 'Michael Shepanski' __email__ = 'mjs7231@gmail.com' __url__ = 'https://github.com/django-dbbackup/django-dbbackup' default_app_config = 'dbbackup.apps.DbbackupConfig' django-dbbackup-3.3.0/dbbackup/management/0000775000175000017500000000000013645400163020443 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/management/commands/0000775000175000017500000000000013645400163022244 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/management/commands/dbrestore.py0000664000175000017500000001011413645400106024601 0ustar zuluzulu00000000000000""" Restore database. """ from __future__ import (absolute_import, division, print_function, unicode_literals) from django.conf import settings from django.core.management.base import CommandError from django.db import connection from ._base import BaseDbBackupCommand, make_option from ... 
import utils from ...db.base import get_connector from ...storage import get_storage, StorageError class Command(BaseDbBackupCommand): help = """Restore a database backup from storage, encrypted and/or compressed.""" content_type = 'db' option_list = BaseDbBackupCommand.option_list + ( make_option("-d", "--database", help="Database to restore"), make_option("-i", "--input-filename", help="Specify filename to backup from"), make_option("-I", "--input-path", help="Specify path on local filesystem to backup from"), make_option("-s", "--servername", help="If backup file is not specified, filter the " "existing ones with the given servername"), make_option("-c", "--decrypt", default=False, action='store_true', help="Decrypt data before restoring"), make_option("-p", "--passphrase", help="Passphrase for decrypt file", default=None), make_option("-z", "--uncompress", action='store_true', default=False, help="Uncompress gzip data before restoring") ) def handle(self, *args, **options): """Django command handler.""" self.verbosity = int(options.get('verbosity')) self.quiet = options.get('quiet') self._set_logger_level() try: connection.close() self.filename = options.get('input_filename') self.path = options.get('input_path') self.servername = options.get('servername') self.decrypt = options.get('decrypt') self.uncompress = options.get('uncompress') self.passphrase = options.get('passphrase') self.interactive = options.get('interactive') self.database_name, self.database = self._get_database(options) self.storage = get_storage() self._restore_backup() except StorageError as err: raise CommandError(err) def _get_database(self, options): """Get the database to restore.""" database_name = options.get('database') if not database_name: if len(settings.DATABASES) > 1: errmsg = "Because this project contains more than one database, you"\ " must specify the --database option." raise CommandError(errmsg) database_name = list(settings.DATABASES.keys())[0] if database_name not in settings.DATABASES: raise CommandError("Database %s does not exist." % database_name) return database_name, settings.DATABASES[database_name] def _restore_backup(self): """Restore the specified database.""" input_filename, input_file = self._get_backup_file(database=self.database_name, servername=self.servername) self.logger.info("Restoring backup for database '%s' and server '%s'", self.database_name, self.servername) self.logger.info("Restoring: %s" % input_filename) if self.decrypt: unencrypted_file, input_filename = utils.unencrypt_file(input_file, input_filename, self.passphrase) input_file.close() input_file = unencrypted_file if self.uncompress: uncompressed_file, input_filename = utils.uncompress_file(input_file, input_filename) input_file.close() input_file = uncompressed_file self.logger.info("Restore tempfile created: %s", utils.handle_size(input_file)) if self.interactive: self._ask_confirmation() input_file.seek(0) self.connector = get_connector(self.database_name) self.connector.restore_dump(input_file) django-dbbackup-3.3.0/dbbackup/management/commands/mediarestore.py0000664000175000017500000000716313645400106025305 0ustar zuluzulu00000000000000""" Restore media files. """ import tarfile from django.core.management.base import CommandError from django.core.files.storage import get_storage_class from ._base import BaseDbBackupCommand, make_option from ...storage import get_storage, StorageError from ... 
import utils class Command(BaseDbBackupCommand): help = """Restore a media backup from storage, encrypted and/or compressed.""" content_type = 'media' option_list = ( make_option("-i", "--input-filename", action='store', help="Specify filename to backup from"), make_option("-I", "--input-path", help="Specify path on local filesystem to backup from"), make_option("-s", "--servername", help="If backup file is not specified, filter the existing ones with the " "given servername"), make_option("-e", "--decrypt", default=False, action='store_true', help="Decrypt data before restoring"), make_option("-p", "--passphrase", default=None, help="Passphrase for decrypt file"), make_option("-z", "--uncompress", action='store_true', help="Uncompress gzip data before restoring"), make_option("-r", "--replace", help="Replace existing files", action='store_true'), ) def handle(self, *args, **options): """Django command handler.""" self.verbosity = int(options.get('verbosity')) self.quiet = options.get('quiet') self._set_logger_level() self.servername = options.get('servername') self.decrypt = options.get('decrypt') self.uncompress = options.get('uncompress') self.filename = options.get('input_filename') self.path = options.get('input_path') self.replace = options.get('replace') self.passphrase = options.get('passphrase') self.interactive = options.get('interactive') self.storage = get_storage() self.media_storage = get_storage_class()() self._restore_backup() def _upload_file(self, name, media_file): if self.media_storage.exists(name): if self.replace: self.media_storage.delete(name) self.logger.info("%s deleted", name) else: return self.media_storage.save(name, media_file) self.logger.info("%s uploaded", name) def _restore_backup(self): self.logger.info("Restoring backup for media files") input_filename, input_file = self._get_backup_file(servername=self.servername) self.logger.info("Restoring: %s", input_filename) if self.decrypt: unencrypted_file, input_filename = utils.unencrypt_file(input_file, input_filename, self.passphrase) input_file.close() input_file = unencrypted_file self.logger.debug("Backup size: %s", utils.handle_size(input_file)) if self.interactive: self._ask_confirmation() input_file.seek(0) tar_file = tarfile.open(fileobj=input_file, mode='r:gz') \ if self.uncompress \ else tarfile.open(fileobj=input_file, mode='r:') # Restore file 1 by 1 for media_file_info in tar_file: if media_file_info.path == 'media': continue # Don't copy root directory media_file = tar_file.extractfile(media_file_info) if media_file is None: continue # Skip directories name = media_file_info.path.replace('media/', '') self._upload_file(name, media_file) django-dbbackup-3.3.0/dbbackup/management/commands/mediabackup.py0000664000175000017500000001006613645400106025063 0ustar zuluzulu00000000000000""" Save media files. """ from __future__ import (absolute_import, division, print_function, unicode_literals) import os import tarfile from django.core.management.base import CommandError from django.core.files.storage import get_storage_class from ._base import BaseDbBackupCommand, make_option from ... 
import utils from ...storage import get_storage, StorageError class Command(BaseDbBackupCommand): help = """Backup media files, gather all in a tarball and encrypt or compress.""" content_type = "media" option_list = BaseDbBackupCommand.option_list + ( make_option("-c", "--clean", help="Clean up old backup files", action="store_true", default=False), make_option("-s", "--servername", help="Specify server name to include in backup filename"), make_option("-z", "--compress", help="Compress the archive", action="store_true", default=False), make_option("-e", "--encrypt", help="Encrypt the backup files", action="store_true", default=False), make_option("-o", "--output-filename", default=None, help="Specify filename on storage"), make_option("-O", "--output-path", default=None, help="Specify where to store on local filesystem",) ) @utils.email_uncaught_exception def handle(self, **options): self.verbosity = options.get('verbosity') self.quiet = options.get('quiet') self._set_logger_level() self.encrypt = options.get('encrypt', False) self.compress = options.get('compress', False) self.servername = options.get('servername') self.filename = options.get('output_filename') self.path = options.get('output_path') try: self.media_storage = get_storage_class()() self.storage = get_storage() self.backup_mediafiles() if options.get('clean'): self._cleanup_old_backups(servername=self.servername) except StorageError as err: raise CommandError(err) def _explore_storage(self): """Generator of all files contained in media storage.""" path = '' dirs = [path] while dirs: path = dirs.pop() subdirs, files = self.media_storage.listdir(path) for media_filename in files: yield os.path.join(path, media_filename) dirs.extend([os.path.join(path, subdir) for subdir in subdirs]) def _create_tar(self, name): """Create TAR file.""" fileobj = utils.create_spooled_temporary_file() mode = 'w:gz' if self.compress else 'w' tar_file = tarfile.open(name=name, fileobj=fileobj, mode=mode) for media_filename in self._explore_storage(): tarinfo = tarfile.TarInfo(media_filename) media_file = self.media_storage.open(media_filename) tarinfo.size = len(media_file) tar_file.addfile(tarinfo, media_file) # Close the TAR for writing tar_file.close() return fileobj def backup_mediafiles(self): """ Create backup file and write it to storage. """ # Check for filename option if self.filename: filename = self.filename else: extension = "tar%s" % ('.gz' if self.compress else '') filename = utils.filename_generate(extension, servername=self.servername, content_type=self.content_type) tarball = self._create_tar(filename) # Apply trans if self.encrypt: encrypted_file = utils.encrypt_file(tarball, filename) tarball, filename = encrypted_file self.logger.debug("Backup size: %s", utils.handle_size(tarball)) # Store backup tarball.seek(0) if self.path is None: self.write_to_storage(tarball, filename) else: self.write_local_file(tarball, self.path) django-dbbackup-3.3.0/dbbackup/management/commands/__init__.py0000664000175000017500000000000013645400106024340 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/management/commands/dbbackup.py0000664000175000017500000000677213645400106024402 0ustar zuluzulu00000000000000""" Command for backup database. """ from __future__ import (absolute_import, division, print_function, unicode_literals) from django.core.management.base import CommandError from ._base import BaseDbBackupCommand, make_option from ...db.base import get_connector from ...storage import get_storage, StorageError from ... 
import utils, settings class Command(BaseDbBackupCommand): help = "Backup a database, encrypt and/or compress and write to " \ "storage.""" content_type = 'db' option_list = BaseDbBackupCommand.option_list + ( make_option("-c", "--clean", dest='clean', action="store_true", default=False, help="Clean up old backup files"), make_option("-d", "--database", help="Database(s) to backup specified by key separated by" " commas(default: all)"), make_option("-s", "--servername", help="Specify server name to include in backup filename"), make_option("-z", "--compress", action="store_true", default=False, help="Compress the backup files"), make_option("-e", "--encrypt", action="store_true", default=False, help="Encrypt the backup files"), make_option("-o", "--output-filename", default=None, help="Specify filename on storage"), make_option("-O", "--output-path", default=None, help="Specify where to store on local filesystem") ) @utils.email_uncaught_exception def handle(self, **options): self.verbosity = options.get('verbosity') self.quiet = options.get('quiet') self._set_logger_level() self.clean = options.get('clean') self.servername = options.get('servername') self.compress = options.get('compress') self.encrypt = options.get('encrypt') self.filename = options.get('output_filename') self.path = options.get('output_path') self.storage = get_storage() self.database = options.get('database') or '' database_keys = self.database.split(',') or settings.DATABASES for database_key in database_keys: self.connector = get_connector(database_key) database = self.connector.settings try: self._save_new_backup(database) if self.clean: self._cleanup_old_backups(database=database_key) except StorageError as err: raise CommandError(err) def _save_new_backup(self, database): """ Save a new backup file. """ self.logger.info("Backing Up Database: %s", database['NAME']) # Get backup and name filename = self.connector.generate_filename(self.servername) outputfile = self.connector.create_dump() # Apply trans if self.compress: compressed_file, filename = utils.compress_file(outputfile, filename) outputfile = compressed_file if self.encrypt: encrypted_file, filename = utils.encrypt_file(outputfile, filename) outputfile = encrypted_file # Set file name filename = self.filename if self.filename else filename self.logger.debug("Backup size: %s", utils.handle_size(outputfile)) # Store backup outputfile.seek(0) if self.path is None: self.write_to_storage(outputfile, filename) else: self.write_local_file(outputfile, self.path) django-dbbackup-3.3.0/dbbackup/management/commands/listbackups.py0000664000175000017500000000371313645400106025143 0ustar zuluzulu00000000000000""" List backups. """ from __future__ import (absolute_import, division, print_function, unicode_literals) from ... 
import utils from ._base import BaseDbBackupCommand, make_option from ...storage import get_storage ROW_TEMPLATE = '{name:40} {datetime:20}' FILTER_KEYS = ('encrypted', 'compressed', 'content_type', 'database') class Command(BaseDbBackupCommand): option_list = ( make_option("-d", "--database", help="Filter by database name"), make_option("-z", "--compressed", help="Exclude non-compressed", action="store_true", default=None, dest="compressed"), make_option("-Z", "--not-compressed", help="Exclude compressed", action="store_false", default=None, dest="compressed"), make_option("-e", "--encrypted", help="Exclude non-encrypted", action="store_true", default=None, dest="encrypted"), make_option("-E", "--not-encrypted", help="Exclude encrypted", action="store_false", default=None, dest="encrypted"), make_option("-c", "--content-type", help="Filter by content type 'db' or 'media'"), ) def handle(self, **options): self.quiet = options.get('quiet') self.storage = get_storage() files_attr = self.get_backup_attrs(options) if not self.quiet: title = ROW_TEMPLATE.format(name='Name', datetime='Datetime') self.stdout.write(title) for file_attr in files_attr: row = ROW_TEMPLATE.format(**file_attr) self.stdout.write(row) def get_backup_attrs(self, options): filters = dict([(k, v) for k, v in options.items() if k in FILTER_KEYS]) filenames = self.storage.list_backups(**filters) files_attr = [ {'datetime': utils.filename_to_date(filename).strftime('%x %X'), 'name': filename} for filename in filenames] return files_attr django-dbbackup-3.3.0/dbbackup/management/commands/_base.py0000664000175000017500000001060213645400106023663 0ustar zuluzulu00000000000000""" Abstract Command. """ import sys import logging from optparse import make_option as optparse_make_option from shutil import copyfileobj import django from django.core.management.base import BaseCommand, CommandError import six from ...storage import StorageError if six.PY2: input = raw_input else: long = int USELESS_ARGS = ('callback', 'callback_args', 'callback_kwargs', 'metavar') TYPES = { 'string': str, 'int': int, 'long': long, 'float': float, 'complex': complex, 'choice': list } LOGGING_VERBOSITY = { 0: logging.WARN, 1: logging.INFO, 2: logging.DEBUG, 3: logging.DEBUG, } def make_option(*args, **kwargs): return args, kwargs class BaseDbBackupCommand(BaseCommand): """ Base command class used for create all dbbackup command. """ base_option_list = ( make_option("--noinput", action='store_false', dest='interactive', default=True, help='Tells Django to NOT prompt the user for input of any kind.'), make_option('-q', "--quiet", action='store_true', default=False, help='Tells Django to NOT output other text than errors.') ) option_list = () verbosity = 1 quiet = False logger = logging.getLogger('dbbackup.command') def __init__(self, *args, **kwargs): self.option_list = self.base_option_list + self.option_list if django.VERSION < (1, 10): options = tuple([optparse_make_option(*_args, **_kwargs) for _args, _kwargs in self.option_list]) self.option_list = options + BaseCommand.option_list super(BaseDbBackupCommand, self).__init__(*args, **kwargs) def add_arguments(self, parser): for args, kwargs in self.option_list: kwargs = dict([ (k, v) for k, v in kwargs.items() if not k.startswith('_') and k not in USELESS_ARGS]) parser.add_argument(*args, **kwargs) def _set_logger_level(self): level = 60 if self.quiet else LOGGING_VERBOSITY[int(self.verbosity)] self.logger.setLevel(level) def _ask_confirmation(self): answer = input("Are you sure you want to continue? 
[Y/n] ") if answer.lower().startswith('n'): self.logger.info("Quitting") sys.exit(0) def read_from_storage(self, path): return self.storage.read_file(path) def write_to_storage(self, file, path): self.logger.info("Writing file to %s", path) self.storage.write_file(file, path) def read_local_file(self, path): """Open file in read mode on local filesystem.""" return open(path, 'rb') def write_local_file(self, outputfile, path): """Write file to the desired path.""" self.logger.info("Writing file to %s", path) outputfile.seek(0) with open(path, 'wb') as fd: copyfileobj(outputfile, fd) def _get_backup_file(self, database=None, servername=None): if self.path: input_filename = self.path input_file = self.read_local_file(self.path) else: if self.filename: input_filename = self.filename # Fetch the latest backup if filepath not specified else: self.logger.info("Finding latest backup") try: input_filename = self.storage.get_latest_backup( encrypted=self.decrypt, compressed=self.uncompress, content_type=self.content_type, database=database, servername=servername) except StorageError as err: raise CommandError(err.args[0]) input_file = self.read_from_storage(input_filename) return input_filename, input_file def _cleanup_old_backups(self, database=None, servername=None): """ Cleanup old backups, keeping the number of backups specified by DBBACKUP_CLEANUP_KEEP and any backups that occur on first of the month. """ self.storage.clean_old_backups(encrypted=self.encrypt, compressed=self.compress, content_type=self.content_type, database=database, servername=servername) django-dbbackup-3.3.0/dbbackup/management/__init__.py0000664000175000017500000000000013645400106022537 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/db/0000775000175000017500000000000013645400163016714 5ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/db/postgresql.py0000664000175000017500000001146613645400106021476 0ustar zuluzulu00000000000000from dbbackup import utils from .base import BaseCommandDBConnector class PgDumpConnector(BaseCommandDBConnector): """ PostgreSQL connector, it uses pg_dump`` to create an SQL text file and ``psql`` for restore it. 
""" extension = 'psql' dump_cmd = 'pg_dump' restore_cmd = 'psql' single_transaction = True drop = True def run_command(self, *args, **kwargs): if self.settings.get('PASSWORD'): env = kwargs.get('env', {}) env['PGPASSWORD'] = utils.get_escaped_command_arg(self.settings['PASSWORD']) kwargs['env'] = env return super(PgDumpConnector, self).run_command(*args, **kwargs) def _create_dump(self): cmd = '{} '.format(self.dump_cmd) if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) if self.settings.get('USER'): cmd += ' --username={}'.format(self.settings['USER']) cmd += ' --no-password' for table in self.exclude: cmd += ' --exclude-table={}'.format(table) if self.drop: cmd += ' --clean' cmd += ' {}'.format(self.settings['NAME']) cmd = '{} {} {}'.format(self.dump_prefix, cmd, self.dump_suffix) stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = '{} '.format(self.restore_cmd) if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) if self.settings.get('USER'): cmd += ' --username={}'.format(self.settings['USER']) cmd += ' --no-password' # without this, psql terminates with an exit value of 0 regardless of errors cmd += ' --set ON_ERROR_STOP=on' if self.single_transaction: cmd += ' --single-transaction' cmd += ' {}'.format(self.settings['NAME']) cmd = '{} {} {}'.format(self.restore_prefix, cmd, self.restore_suffix) stdout, stderr = self.run_command(cmd, stdin=dump, env=self.restore_env) return stdout, stderr class PgDumpGisConnector(PgDumpConnector): """ PostgreGIS connector, same than :class:`PgDumpGisConnector` but enable postgis if not made. """ psql_cmd = 'psql' def _enable_postgis(self): cmd = '{} -c "CREATE EXTENSION IF NOT EXISTS postgis;"'.format( self.psql_cmd) cmd += ' --username={}'.format(self.settings['ADMIN_USER']) cmd += ' --no-password' if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) return self.run_command(cmd) def _restore_dump(self, dump): if self.settings.get('ADMIN_USER'): self._enable_postgis() return super(PgDumpGisConnector, self)._restore_dump(dump) class PgDumpBinaryConnector(PgDumpConnector): """ PostgreSQL connector, it uses pg_dump`` to create an SQL text file and ``pg_restore`` for restore it. 
""" extension = 'psql.bin' dump_cmd = 'pg_dump' restore_cmd = 'pg_restore' single_transaction = True drop = True def _create_dump(self): cmd = '{} {}'.format(self.dump_cmd, self.settings['NAME']) if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) if self.settings.get('USER'): cmd += ' --user={}'.format(self.settings['USER']) cmd += ' --no-password' cmd += ' --format=custom' for table in self.exclude: cmd += ' --exclude-table={}'.format(table) cmd = '{} {} {}'.format(self.dump_prefix, cmd, self.dump_suffix) stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = '{} --dbname={}'.format(self.restore_cmd, self.settings['NAME']) if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) if self.settings.get('USER'): cmd += ' --user={}'.format(self.settings['USER']) cmd += ' --no-password' if self.single_transaction: cmd += ' --single-transaction' if self.drop: cmd += ' --clean' cmd = '{} {} {}'.format(self.restore_prefix, cmd, self.restore_suffix) stdout, stderr = self.run_command(cmd, stdin=dump, env=self.restore_env) return stdout, stderr django-dbbackup-3.3.0/dbbackup/db/mysql.py0000664000175000017500000000345413645400106020436 0ustar zuluzulu00000000000000from dbbackup import utils from .base import BaseCommandDBConnector class MysqlDumpConnector(BaseCommandDBConnector): """ MySQL connector, creates dump with ``mysqldump`` and restore with ``mysql``. """ dump_cmd = 'mysqldump' restore_cmd = 'mysql' def _create_dump(self): cmd = '{} {} --quick'.format(self.dump_cmd, self.settings['NAME']) if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) if self.settings.get('USER'): cmd += ' --user={}'.format(self.settings['USER']) if self.settings.get('PASSWORD'): cmd += ' --password={}'.format(utils.get_escaped_command_arg(self.settings['PASSWORD'])) for table in self.exclude: cmd += ' --ignore-table={}.{}'.format(self.settings['NAME'], table) cmd = '{} {} {}'.format(self.dump_prefix, cmd, self.dump_suffix) stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = '{} {}'.format(self.restore_cmd, self.settings['NAME']) if self.settings.get('HOST'): cmd += ' --host={}'.format(self.settings['HOST']) if self.settings.get('PORT'): cmd += ' --port={}'.format(self.settings['PORT']) if self.settings.get('USER'): cmd += ' --user={}'.format(self.settings['USER']) if self.settings.get('PASSWORD'): cmd += ' --password={}'.format(utils.get_escaped_command_arg(self.settings['PASSWORD'])) cmd = '{} {} {}'.format(self.restore_prefix, cmd, self.restore_suffix) stdout, stderr = self.run_command(cmd, stdin=dump, env=self.restore_env) return stdout, stderr django-dbbackup-3.3.0/dbbackup/db/exceptions.py0000664000175000017500000000045713645400106021452 0ustar zuluzulu00000000000000"""Exceptions for database connectors.""" class ConnectorError(Exception): """Base connector error""" class DumpError(ConnectorError): """Error on dump""" class RestoreError(ConnectorError): """Error on restore""" class CommandConnectorError(ConnectorError): """Failing command""" django-dbbackup-3.3.0/dbbackup/db/base.py0000664000175000017500000001301713645400106020177 0ustar zuluzulu00000000000000""" Base database connectors """ 
import os import shlex from django.core.files.base import File from tempfile import SpooledTemporaryFile from subprocess import Popen from importlib import import_module from dbbackup import settings, utils from . import exceptions CONNECTOR_MAPPING = { 'django.db.backends.sqlite3': 'dbbackup.db.sqlite.SqliteConnector', 'django.db.backends.mysql': 'dbbackup.db.mysql.MysqlDumpConnector', 'django.db.backends.postgresql': 'dbbackup.db.postgresql.PgDumpConnector', 'django.db.backends.postgresql_psycopg2': 'dbbackup.db.postgresql.PgDumpConnector', 'django.db.backends.oracle': None, 'django_mongodb_engine': 'dbbackup.db.mongodb.MongoDumpConnector', 'djongo': 'dbbackup.db.mongodb.MongoDumpConnector', 'django.contrib.gis.db.backends.postgis': 'dbbackup.db.postgresql.PgDumpGisConnector', 'django.contrib.gis.db.backends.mysql': 'dbbackup.db.mysql.MysqlDumpConnector', 'django.contrib.gis.db.backends.oracle': None, 'django.contrib.gis.db.backends.spatialite': 'dbbackup.db.sqlite.SqliteConnector', } if settings.CUSTOM_CONNECTOR_MAPPING: CONNECTOR_MAPPING.update(settings.CUSTOM_CONNECTOR_MAPPING) def get_connector(database_name=None): """ Get a connector from its database key in setttings. """ from django.db import connections, DEFAULT_DB_ALIAS # Get DB database_name = database_name or DEFAULT_DB_ALIAS connection = connections[database_name] engine = connection.settings_dict['ENGINE'] connector_settings = settings.CONNECTORS.get(database_name, {}) connector_path = connector_settings.get('CONNECTOR', CONNECTOR_MAPPING[engine]) connector_module_path = '.'.join(connector_path.split('.')[:-1]) module = import_module(connector_module_path) connector_name = connector_path.split('.')[-1] connector = getattr(module, connector_name) return connector(database_name, **connector_settings) class BaseDBConnector(object): """ Base class for create database connector. This kind of object creates interaction with database and allow backup and restore operations. """ extension = 'dump' exclude = [] def __init__(self, database_name=None, **kwargs): from django.db import connections, DEFAULT_DB_ALIAS self.database_name = database_name or DEFAULT_DB_ALIAS self.connection = connections[self.database_name] for attr, value in kwargs.items(): setattr(self, attr.lower(), value) @property def settings(self): """Mix of database and connector settings.""" if not hasattr(self, '_settings'): sett = self.connection.settings_dict.copy() sett.update(settings.CONNECTORS.get(self.database_name, {})) self._settings = sett return self._settings def generate_filename(self, server_name=None): return utils.filename_generate(self.extension, self.database_name, server_name) def create_dump(self): dump = self._create_dump() return dump def _create_dump(self): """ Override this method to define dump creation. """ raise NotImplementedError("_create_dump not implemented") def restore_dump(self, dump): """ :param dump: Dump file :type dump: file """ result = self._restore_dump(dump) return result def _restore_dump(self, dump): """ Override this method to define dump creation. :param dump: Dump file :type dump: file """ raise NotImplementedError("_restore_dump not implemented") class BaseCommandDBConnector(BaseDBConnector): """ Base class for create database connector based on command line tools. """ dump_prefix = '' dump_suffix = '' restore_prefix = '' restore_suffix = '' use_parent_env = True env = {} dump_env = {} restore_env = {} def run_command(self, command, stdin=None, env=None): """ Launch a shell command line. 
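    A hedged illustration (the command string and the PGPASSWORD value are
    placeholders); both returned streams are spooled temporary files already
    rewound to position 0::

        stdout, stderr = self.run_command('pg_dump mydb', env={'PGPASSWORD': 'secret'})
        print(stdout.read().decode('utf-8'))
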
:param command: Command line to launch :type command: str :param stdin: Standard input of command :type stdin: file :param env: Environment variable used in command :type env: dict :return: Standard output of command :rtype: file """ cmd = shlex.split(command) stdout = SpooledTemporaryFile(max_size=settings.TMP_FILE_MAX_SIZE, dir=settings.TMP_DIR) stderr = SpooledTemporaryFile(max_size=settings.TMP_FILE_MAX_SIZE, dir=settings.TMP_DIR) full_env = os.environ.copy() if self.use_parent_env else {} full_env.update(self.env) full_env.update(env or {}) try: if isinstance(stdin, File): process = Popen( cmd, stdin=stdin.open("rb"), stdout=stdout, stderr=stderr, env=full_env ) else: process = Popen(cmd, stdin=stdin, stdout=stdout, stderr=stderr, env=full_env) process.wait() if process.poll(): stderr.seek(0) raise exceptions.CommandConnectorError( "Error running: {}\n{}".format(command, stderr.read().decode('utf-8'))) stdout.seek(0) stderr.seek(0) return stdout, stderr except OSError as err: raise exceptions.CommandConnectorError( "Error running: {}\n{}".format(command, str(err))) django-dbbackup-3.3.0/dbbackup/db/__init__.py0000664000175000017500000000000013645400106021010 0ustar zuluzulu00000000000000django-dbbackup-3.3.0/dbbackup/db/mongodb.py0000664000175000017500000000354113645400106020713 0ustar zuluzulu00000000000000from dbbackup import utils from .base import BaseCommandDBConnector class MongoDumpConnector(BaseCommandDBConnector): """ MongoDB connector, creates dump with ``mongodump`` and restore with ``mongorestore``. """ dump_cmd = 'mongodump' restore_cmd = 'mongorestore' object_check = True drop = True def _create_dump(self): cmd = '{} --db {}'.format(self.dump_cmd, self.settings['NAME']) host = self.settings.get('HOST') or 'localhost' port = self.settings.get('PORT') or 27017 cmd += ' --host {}:{}'.format(host, port) if self.settings.get('USER'): cmd += ' --username {}'.format(self.settings['USER']) if self.settings.get('PASSWORD'): cmd += ' --password {}'.format(utils.get_escaped_command_arg(self.settings['PASSWORD'])) for collection in self.exclude: cmd += ' --excludeCollection {}'.format(collection) cmd += ' --archive' cmd = '{} {} {}'.format(self.dump_prefix, cmd, self.dump_suffix) stdout, stderr = self.run_command(cmd, env=self.dump_env) return stdout def _restore_dump(self, dump): cmd = self.restore_cmd host = self.settings.get('HOST') or 'localhost' port = self.settings.get('PORT') or 27017 cmd += ' --host {}:{}'.format(host, port) if self.settings.get('USER'): cmd += ' --username {}'.format(self.settings['USER']) if self.settings.get('PASSWORD'): cmd += ' --password {}'.format(utils.get_escaped_command_arg(self.settings['PASSWORD'])) if self.object_check: cmd += ' --objcheck' if self.drop: cmd += ' --drop' cmd += ' --archive' cmd = '{} {} {}'.format(self.restore_prefix, cmd, self.restore_suffix) return self.run_command(cmd, stdin=dump, env=self.restore_env) django-dbbackup-3.3.0/dbbackup/db/sqlite.py0000664000175000017500000000704513645400106020572 0ustar zuluzulu00000000000000from __future__ import unicode_literals import warnings from tempfile import SpooledTemporaryFile from shutil import copyfileobj from django.db import IntegrityError, OperationalError from six import BytesIO from .base import BaseDBConnector DUMP_TABLES = """ SELECT "name", "type", "sql" FROM "sqlite_master" WHERE "sql" NOT NULL AND "type" == 'table' ORDER BY "name" """ DUMP_ETC = """ SELECT "name", "type", "sql" FROM "sqlite_master" WHERE "sql" NOT NULL AND "type" IN ('index', 'trigger', 'view') """ class 
SqliteConnector(BaseDBConnector): """ Create a dump at SQL layer like could make ``.dumps`` in sqlite3. Restore by evaluate the created SQL. """ def _write_dump(self, fileobj): cursor = self.connection.cursor() cursor.execute(DUMP_TABLES) for table_name, type, sql in cursor.fetchall(): if table_name.startswith('sqlite_') or table_name in self.exclude: continue elif sql.startswith('CREATE TABLE'): sql = sql.replace('CREATE TABLE', 'CREATE TABLE IF NOT EXISTS') # Make SQL commands in 1 line sql = sql.replace('\n ', '') sql = sql.replace('\n)', ')') fileobj.write("{};\n".format(sql).encode('UTF-8')) else: fileobj.write("{};\n".format(sql)) table_name_ident = table_name.replace('"', '""') res = cursor.execute('PRAGMA table_info("{0}")'.format(table_name_ident)) column_names = [str(table_info[1]) for table_info in res.fetchall()] q = """SELECT 'INSERT INTO "{0}" VALUES({1})' FROM "{0}";\n""".format( table_name_ident, ",".join("""'||quote("{0}")||'""".format(col.replace('"', '""')) for col in column_names)) query_res = cursor.execute(q) for row in query_res: fileobj.write("{};\n".format(row[0]).encode('UTF-8')) schema_res = cursor.execute(DUMP_ETC) for name, type, sql in schema_res.fetchall(): if sql.startswith("CREATE INDEX"): sql = sql.replace('CREATE INDEX', 'CREATE INDEX IF NOT EXISTS') fileobj.write('{};\n'.format(sql).encode('UTF-8')) cursor.close() def create_dump(self): if not self.connection.is_usable(): self.connection.connect() dump_file = SpooledTemporaryFile(max_size=10 * 1024 * 1024) self._write_dump(dump_file) dump_file.seek(0) return dump_file def restore_dump(self, dump): if not self.connection.is_usable(): self.connection.connect() cursor = self.connection.cursor() for line in dump.readlines(): try: cursor.execute(line.decode('UTF-8')) except OperationalError as err: warnings.warn("Error in db restore: {}".format(err)) except IntegrityError as err: warnings.warn("Error in db restore: {}".format(err)) class SqliteCPConnector(BaseDBConnector): """ Create a dump by copy the binary data file. Restore by simply copy to the good location. """ def create_dump(self): path = self.connection.settings_dict['NAME'] dump = BytesIO() with open(path, 'rb') as db_file: copyfileobj(db_file, dump) dump.seek(0) return dump def restore_dump(self, dump): path = self.connection.settings_dict['NAME'] with open(path, 'wb') as db_file: copyfileobj(dump, db_file) django-dbbackup-3.3.0/MANIFEST.in0000664000175000017500000000021013645400106016300 0ustar zuluzulu00000000000000include requirements*.txt include README.rst include LICENSE.txt recursive-include dbbackup/tests/testapp/blobs/ *.gpg *.txt *.gz *.tar