django-background-tasks-1.1.11/0000777000000000000000000000000013142344323016236 5ustar rootroot00000000000000django-background-tasks-1.1.11/README.md0000666000000000000000000000272213142344323017520 0ustar rootroot00000000000000
# Django Background Tasks

[![Build Status](https://travis-ci.org/arteria/django-background-tasks.svg?branch=master)](https://travis-ci.org/arteria/django-background-tasks) [![Coverage Status](https://coveralls.io/repos/arteria/django-background-tasks/badge.svg?branch=master&service=github)](https://coveralls.io/github/arteria/django-background-tasks?branch=master) [![Documentation Status](https://readthedocs.org/projects/django-background-tasks/badge/?version=latest)](http://django-background-tasks.readthedocs.io/en/latest/?badge=latest) [![PyPI](https://img.shields.io/pypi/v/django-background-tasks.svg)](https://pypi.python.org/pypi/django-background-tasks)

Django Background Task is a database-backed work queue for Django, loosely based around [Ruby's DelayedJob](https://github.com/tobi/delayed_job) library. This project was adopted and adapted from [lilspikey](https://github.com/lilspikey/)'s django-background-task. To avoid conflicts on PyPI we renamed it to django-background-tasks (plural). For an easy upgrade from django-background-task to django-background-tasks, the internal module structure was left untouched.

In Django Background Task, all tasks are implemented as functions (or any other callable).

There are two parts to using background tasks:

* creating the task functions and registering them with the scheduler
* setting up a cron task (or long-running process) to execute the tasks

## Docs

See http://django-background-tasks.readthedocs.io/en/latest/.

django-background-tasks-1.1.11/setup.py0000666000000000000000000000135613142344323017755 0ustar rootroot00000000000000
from setuptools import setup, find_packages
import codecs

version = __import__('background_task').__version__
classifiers = [c for c in open('classifiers').read().splitlines() if '#' not in c]

setup(
    name='django-background-tasks',
    version=version,
    description='Database backed asynchronous task queue',
    long_description=codecs.open('README.md', encoding='utf-8').read(),
    author='arteria GmbH, John Montgomery',
    author_email='admin@arteria.ch',
    url='http://github.com/arteria/django-background-tasks',
    license='BSD',
    packages=find_packages(exclude=['ez_setup']),
    include_package_data=True,
    install_requires=open('requirements.txt').read().splitlines(),
    zip_safe=True,
    classifiers=classifiers,
)

django-background-tasks-1.1.11/requirements.txt0000666000000000000000000000003213142344323021515 0ustar rootroot00000000000000
django-compat>=1.0.13
six

django-background-tasks-1.1.11/LICENSE0000666000000000000000000000305313142344323017244 0ustar rootroot00000000000000
Copyright (c) 2015, arteria GmbH.
Copyright (c) 2010, John Montgomery.
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

3. Neither the name of Django Background Task nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.django-background-tasks-1.1.11/AUTHORS.txt0000666000000000000000000000064513142344323020131 0ustar rootroot00000000000000Contributors * John Montgomery (lilspikey & johnsensible, initiator) * Yannik Ammann (yannik-ammann) * Luthaf (luthaf) * Philippe O. Wagner (philippeowagner) * weijia (weijia) * tdruez (tdruez) * Chad G. Hansen (chadgh) * Grant McConnaughey (grantmcconnaughey) * James Mason (bear454) * Pavel Zagrebelin (Zagrebelin) * Stephen Brown (december1981) * Adam Johnson (adamchainz) * (kherrett) Your name could stand here :) django-background-tasks-1.1.11/PKG-INFO0000666000000000000000000000511613142344323017336 0ustar rootroot00000000000000Metadata-Version: 1.1 Name: django-background-tasks Version: 1.1.11 Summary: Database backed asynchronous task queue Home-page: http://github.com/arteria/django-background-tasks Author: arteria GmbH, John Montgomery Author-email: admin@arteria.ch License: BSD Description: # Django Background Tasks [![Build Status](https://travis-ci.org/arteria/django-background-tasks.svg?branch=master)](https://travis-ci.org/arteria/django-background-tasks) [![Coverage Status](https://coveralls.io/repos/arteria/django-background-tasks/badge.svg?branch=master&service=github)](https://coveralls.io/github/arteria/django-background-tasks?branch=master) [![Documentation Status](https://readthedocs.org/projects/django-background-tasks/badge/?version=latest)](http://django-background-tasks.readthedocs.io/en/latest/?badge=latest) [![PyPI](https://img.shields.io/pypi/v/django-background-tasks.svg)](https://pypi.python.org/pypi/django-background-tasks) Django Background Task is a databased-backed work queue for Django, loosely based around [Ruby's DelayedJob](https://github.com/tobi/delayed_job) library. This project was adopted and adapted from [lilspikey](https://github.com/lilspikey/) django-background-task. To avoid conflicts on PyPI we renamed it to django-background-tasks (plural). For an easy upgrade from django-background-task to django-background-tasks, the internal module structure were left untouched. In Django Background Task, all tasks are implemented as functions (or any other callable). There are two parts to using background tasks: * creating the task functions and registering them with the scheduler * setup a cron task (or long running process) to execute the tasks ## Docs See http://django-background-tasks.readthedocs.io/en/latest/. 
Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Web Environment Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Framework :: Django Classifier: Framework :: Django :: 1.4 Classifier: Framework :: Django :: 1.5 Classifier: Framework :: Django :: 1.6 Classifier: Framework :: Django :: 1.7 Classifier: Framework :: Django :: 1.8 Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 3 django-background-tasks-1.1.11/background_task/0000777000000000000000000000000013142344323021377 5ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/models_completed.py0000666000000000000000000001013013142344323025263 0ustar rootroot00000000000000# -*- coding: utf-8 -*- import os from compat.models import GenericForeignKey from django.contrib.contenttypes.models import ContentType from django.db import models from django.utils import timezone from django.utils.six import python_2_unicode_compatible from background_task.models import Task class CompletedTaskQuerySet(models.QuerySet): def created_by(self, creator): """ :return: A CompletedTask queryset filtered by creator """ content_type = ContentType.objects.get_for_model(creator) return self.filter( creator_content_type=content_type, creator_object_id=creator.id, ) def failed(self, within=None): """ :param within: A timedelta object :return: A queryset of CompletedTasks that failed within the given timeframe (e.g. less than 1h ago) """ qs = self.filter( failed_at__isnull=False, ) if within: time_limit = timezone.now() - within qs = qs.filter(failed_at__gt=time_limit) return qs def succeeded(self, within=None): """ :param within: A timedelta object :return: A queryset of CompletedTasks that completed successfully within the given timeframe (e.g. 
less than 1h ago) """ qs = self.filter( failed_at__isnull=True, ) if within: time_limit = timezone.now() - within qs = qs.filter(run_at__gt=time_limit) return qs @python_2_unicode_compatible class CompletedTask(models.Model): # the "name" of the task/function to be run task_name = models.CharField(max_length=255, db_index=True) # the json encoded parameters to pass to the task task_params = models.TextField() # a sha1 hash of the name and params, to lookup already scheduled tasks task_hash = models.CharField(max_length=40, db_index=True) verbose_name = models.CharField(max_length=255, null=True, blank=True) # what priority the task has priority = models.IntegerField(default=0, db_index=True) # when the task should be run run_at = models.DateTimeField(db_index=True) repeat = models.BigIntegerField(choices=Task.REPEAT_CHOICES, default=Task.NEVER) repeat_until = models.DateTimeField(null=True, blank=True) # the "name" of the queue this is to be run on queue = models.CharField(max_length=255, db_index=True, null=True, blank=True) # how many times the task has been tried attempts = models.IntegerField(default=0, db_index=True) # when the task last failed failed_at = models.DateTimeField(db_index=True, null=True, blank=True) # details of the error that occurred last_error = models.TextField(blank=True) # details of who's trying to run the task at the moment locked_by = models.CharField(max_length=64, db_index=True, null=True, blank=True) locked_at = models.DateTimeField(db_index=True, null=True, blank=True) creator_content_type = models.ForeignKey( ContentType, null=True, blank=True, related_name='completed_background_task', on_delete=models.CASCADE ) creator_object_id = models.PositiveIntegerField(null=True, blank=True) creator = GenericForeignKey('creator_content_type', 'creator_object_id') objects = CompletedTaskQuerySet.as_manager() def locked_by_pid_running(self): """ Check if the locked_by process is still running. """ if self.locked_by: try: # won't kill the process. kill is a bad named system call os.kill(int(self.locked_by), 0) return True except: return False else: return None locked_by_pid_running.boolean = True def has_error(self): """ Check if the last_error field is empty. 
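Returns True if an error message has been recorded, False otherwise.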
""" return bool(self.last_error) has_error.boolean = True def __str__(self): return u'{} - {}'.format( self.verbose_name or self.task_name, self.run_at, ) django-background-tasks-1.1.11/background_task/exceptions.py0000666000000000000000000000027513142344323024136 0ustar rootroot00000000000000# -*- coding: utf-8 -*- class BackgroundTaskError(Exception): def __init__(self, message, errors=None): super(Exception, self).__init__(message) self.errors = errors django-background-tasks-1.1.11/background_task/apps.py0000666000000000000000000000045513142344323022720 0ustar rootroot00000000000000from django.apps import AppConfig class BackgroundTasksAppConfig(AppConfig): name = 'background_task' from background_task import __version__ as version_info verbose_name = 'Background Tasks ({})'.format(version_info) def ready(self): import background_task.signals # noqa django-background-tasks-1.1.11/background_task/tasks.py0000666000000000000000000002442213142344323023102 0ustar rootroot00000000000000# -*- coding: utf-8 -*- from __future__ import unicode_literals from datetime import datetime, timedelta from importlib import import_module from multiprocessing.pool import ThreadPool import logging import os import sys from compat import atomic from django.utils import timezone from django.utils.six import python_2_unicode_compatible from background_task.exceptions import BackgroundTaskError from background_task.models import Task from background_task.settings import app_settings from background_task import signals logger = logging.getLogger(__name__) def bg_runner(proxy_task, task=None, *args, **kwargs): """ Executes the function attached to task. Used to enable threads. If a Task instance is provided, args and kwargs are ignored and retrieved from the Task itself. 
""" signals.task_started.send(Task) try: func = getattr(proxy_task, 'task_function', None) if isinstance(task, Task): args, kwargs = task.params() else: task_name = getattr(proxy_task, 'name', None) task_queue = getattr(proxy_task, 'queue', None) task_qs = Task.objects.get_task(task_name=task_name, args=args, kwargs=kwargs) if task_queue: task_qs = task_qs.filter(queue=task_queue) if task_qs: task = task_qs[0] if func is None: raise BackgroundTaskError("Function is None, can't execute!") func(*args, **kwargs) if task: # task done, so can delete it task.increment_attempts() completed = task.create_completed_task() signals.task_successful.send(sender=task.__class__, task_id=task.id, completed_task=completed) task.create_repetition() task.delete() logger.info('Ran task and deleting %s', task) except Exception as ex: t, e, traceback = sys.exc_info() if task: logger.error('Rescheduling %s', task, exc_info=(t, e, traceback)) signals.task_error.send(sender=ex.__class__, task=task) task.reschedule(t, e, traceback) del traceback signals.task_finished.send(Task) class PoolRunner: def __init__(self, bg_runner, num_processes): self._bg_runner = bg_runner self._num_processes = num_processes _pool_instance = None @property def _pool(self): if not self._pool_instance: self._pool_instance = ThreadPool(processes=self._num_processes) return self._pool_instance def run(self, proxy_task, task=None, *args, **kwargs): self._pool.apply_async(func=self._bg_runner, args=(proxy_task, task) + tuple(args), kwds=kwargs) __call__ = run class Tasks(object): def __init__(self): self._tasks = {} self._runner = DBTaskRunner() self._task_proxy_class = TaskProxy self._bg_runner = bg_runner self._pool_runner = PoolRunner(bg_runner, app_settings.BACKGROUND_TASK_ASYNC_THREADS) def background(self, name=None, schedule=None, queue=None): ''' decorator to turn a regular function into something that gets run asynchronously in the background, at a later time ''' # see if used as simple decorator # where first arg is the function to be decorated fn = None if name and callable(name): fn = name name = None def _decorator(fn): _name = name if not _name: _name = '%s.%s' % (fn.__module__, fn.__name__) proxy = self._task_proxy_class(_name, fn, schedule, queue, self._runner) self._tasks[_name] = proxy return proxy if fn: return _decorator(fn) return _decorator def run_task(self, task_name, args=None, kwargs=None): # task_name can be either the name of a task or a Task instance. 
if isinstance(task_name, Task): task = task_name task_name = task.task_name # When we have a Task instance we do not need args and kwargs, but they are kept for backward compatibility args = [] kwargs = {} else: task = None proxy_task = self._tasks[task_name] if app_settings.BACKGROUND_TASK_RUN_ASYNC: self._pool_runner(proxy_task, task, *args, **kwargs) else: self._bg_runner(proxy_task, task, *args, **kwargs) def run_next_task(self, queue=None): return self._runner.run_next_task(self, queue) class TaskSchedule(object): SCHEDULE = 0 RESCHEDULE_EXISTING = 1 CHECK_EXISTING = 2 def __init__(self, run_at=None, priority=None, action=None): self._run_at = run_at self._priority = priority self._action = action @classmethod def create(self, schedule): if isinstance(schedule, TaskSchedule): return schedule priority = None run_at = None action = None if schedule: if isinstance(schedule, (int, timedelta, datetime)): run_at = schedule else: run_at = schedule.get('run_at', None) priority = schedule.get('priority', None) action = schedule.get('action', None) return TaskSchedule(run_at=run_at, priority=priority, action=action) def merge(self, schedule): params = {} for name in ['run_at', 'priority', 'action']: attr_name = '_%s' % name value = getattr(self, attr_name, None) if value is None: params[name] = getattr(schedule, attr_name, None) else: params[name] = value return TaskSchedule(**params) @property def run_at(self): run_at = self._run_at or timezone.now() if isinstance(run_at, int): run_at = timezone.now() + timedelta(seconds=run_at) if isinstance(run_at, timedelta): run_at = timezone.now() + run_at return run_at @property def priority(self): return self._priority or 0 @property def action(self): return self._action or TaskSchedule.SCHEDULE def __repr__(self): return 'TaskSchedule(run_at=%s, priority=%s)' % (self._run_at, self._priority) def __eq__(self, other): return self._run_at == other._run_at \ and self._priority == other._priority \ and self._action == other._action class DBTaskRunner(object): ''' Encapsulate the model related logic in here, in case we want to support different queues in the future ''' def __init__(self): self.worker_name = str(os.getpid()) def schedule(self, task_name, args, kwargs, run_at=None, priority=0, action=TaskSchedule.SCHEDULE, queue=None, verbose_name=None, creator=None, repeat=None, repeat_until=None): '''Simply create a task object in the database''' task = Task.objects.new_task(task_name, args, kwargs, run_at, priority, queue, verbose_name, creator, repeat, repeat_until) if action != TaskSchedule.SCHEDULE: task_hash = task.task_hash now = timezone.now() unlocked = Task.objects.unlocked(now) existing = unlocked.filter(task_hash=task_hash) if queue: existing = existing.filter(queue=queue) if action == TaskSchedule.RESCHEDULE_EXISTING: updated = existing.update(run_at=run_at, priority=priority) if updated: return elif action == TaskSchedule.CHECK_EXISTING: if existing.count(): return task.save() signals.task_created.send(sender=self.__class__, task=task) return task @atomic def get_task_to_run(self, tasks, queue=None): available_tasks = [task for task in Task.objects.find_available(queue) if task.task_name in tasks._tasks][:5] for task in available_tasks: # try to lock task locked_task = task.lock(self.worker_name) if locked_task: return locked_task return None @atomic def run_task(self, tasks, task): logger.info('Running %s', task) tasks.run_task(task) @atomic def run_next_task(self, tasks, queue=None): # we need to commit to make sure # we can see new tasks as 
they arrive task = self.get_task_to_run(tasks, queue) # transaction.commit() if task: self.run_task(tasks, task) # transaction.commit() return True else: return False @python_2_unicode_compatible class TaskProxy(object): def __init__(self, name, task_function, schedule, queue, runner): self.name = name self.now = self.task_function = task_function self.runner = runner self.schedule = TaskSchedule.create(schedule) self.queue = queue def __call__(self, *args, **kwargs): schedule = kwargs.pop('schedule', None) schedule = TaskSchedule.create(schedule).merge(self.schedule) run_at = schedule.run_at priority = kwargs.pop('priority', schedule.priority) action = schedule.action queue = kwargs.pop('queue', self.queue) verbose_name = kwargs.pop('verbose_name', None) creator = kwargs.pop('creator', None) repeat = kwargs.pop('repeat', None) repeat_until = kwargs.pop('repeat_until', None) return self.runner.schedule(self.name, args, kwargs, run_at, priority, action, queue, verbose_name, creator, repeat, repeat_until) def __str__(self): return 'TaskProxy(%s)' % self.name tasks = Tasks() def autodiscover(): """ Autodiscover tasks.py files in much the same way as admin app """ import imp from django.conf import settings for app in settings.INSTALLED_APPS: try: app_path = import_module(app).__path__ except (AttributeError, ImportError): continue try: imp.find_module('tasks', app_path) except ImportError: continue import_module("%s.tasks" % app) django-background-tasks-1.1.11/background_task/signals.py0000666000000000000000000000224613142344323023415 0ustar rootroot00000000000000# -*- coding: utf-8 -*- import django.dispatch from django.db import connections from background_task.settings import app_settings task_created = django.dispatch.Signal(providing_args=['task']) task_error = django.dispatch.Signal(providing_args=['task']) task_rescheduled = django.dispatch.Signal(providing_args=['task']) task_failed = django.dispatch.Signal(providing_args=['task_id', 'completed_task']) task_successful = django.dispatch.Signal(providing_args=['task_id', 'completed_task']) task_started = django.dispatch.Signal() task_finished = django.dispatch.Signal() # Register an event to reset saved queries when a Task is started. def reset_queries(**kwargs): if app_settings.BACKGROUND_TASK_RUN_ASYNC: for conn in connections.all(): conn.queries_log.clear() task_started.connect(reset_queries) # Register an event to reset transaction state and close connections past # their lifetime. 
def close_old_connections(**kwargs): if app_settings.BACKGROUND_TASK_RUN_ASYNC: for conn in connections.all(): conn.close_if_unusable_or_obsolete() task_started.connect(close_old_connections) task_finished.connect(close_old_connections) django-background-tasks-1.1.11/background_task/utils.py0000666000000000000000000000150513142344323023112 0ustar rootroot00000000000000# -*- coding: utf-8 -*- import signal import platform TTW_SLOW = [0.5, 1.5] TTW_FAST = [0.0, 0.1] class SignalManager(object): """Manages POSIX signals.""" kill_now = False time_to_wait = TTW_SLOW def __init__(self): # Temporary workaround for signals not available on Windows if platform.system() == 'Windows': signal.signal(signal.SIGTERM, self.exit_gracefully) else: signal.signal(signal.SIGTSTP, self.exit_gracefully) signal.signal(signal.SIGUSR1, self.speed_up) signal.signal(signal.SIGUSR2, self.slow_down) def exit_gracefully(self, signum, frame): self.kill_now = True def speed_up(self, signum, frame): self.time_to_wait = TTW_FAST def slow_down(self, signum, frame): self.time_to_wait = TTW_SLOW django-background-tasks-1.1.11/background_task/models.py0000666000000000000000000002525613142344323023246 0ustar rootroot00000000000000# -*- coding: utf-8 -*- from datetime import timedelta from hashlib import sha1 import json import logging import os import traceback from compat import StringIO from compat.models import GenericForeignKey from django.contrib.contenttypes.models import ContentType from django.db import models from django.db.models import Q from django.utils import timezone from django.utils.six import python_2_unicode_compatible from background_task.settings import app_settings from background_task.signals import task_failed, task_rescheduled logger = logging.getLogger(__name__) # inspired by http://github.com/tobi/delayed_job # class TaskQuerySet(models.QuerySet): def created_by(self, creator): """ :return: A Task queryset filtered by creator """ content_type = ContentType.objects.get_for_model(creator) return self.filter( creator_content_type=content_type, creator_object_id=creator.id, ) class TaskManager(models.Manager): def get_queryset(self): return TaskQuerySet(self.model, using=self._db) def created_by(self, creator): return self.get_queryset().created_by(creator) def find_available(self, queue=None): now = timezone.now() qs = self.unlocked(now) if queue: qs = qs.filter(queue=queue) ready = qs.filter(run_at__lte=now, failed_at=None) _priority_ordering = '{}priority'.format(app_settings.BACKGROUND_TASK_PRIORITY_ORDERING) ready = ready.order_by(_priority_ordering, 'run_at') if app_settings.BACKGROUND_TASK_RUN_ASYNC: currently_locked = self.locked(now).count() count = app_settings.BACKGROUND_TASK_ASYNC_THREADS - currently_locked if count > 0: ready = ready[:count] else: ready = self.none() return ready def unlocked(self, now): max_run_time = app_settings.BACKGROUND_TASK_MAX_RUN_TIME qs = self.get_queryset() expires_at = now - timedelta(seconds=max_run_time) unlocked = Q(locked_by=None) | Q(locked_at__lt=expires_at) return qs.filter(unlocked) def locked(self, now): max_run_time = app_settings.BACKGROUND_TASK_MAX_RUN_TIME qs = self.get_queryset() expires_at = now - timedelta(seconds=max_run_time) locked = Q(locked_by__isnull=False) | Q(locked_at__gt=expires_at) return qs.filter(locked) def new_task(self, task_name, args=None, kwargs=None, run_at=None, priority=0, queue=None, verbose_name=None, creator=None, repeat=None, repeat_until=None): args = args or () kwargs = kwargs or {} if run_at is None: run_at = 
timezone.now() task_params = json.dumps((args, kwargs), sort_keys=True) s = "%s%s" % (task_name, task_params) task_hash = sha1(s.encode('utf-8')).hexdigest() return Task(task_name=task_name, task_params=task_params, task_hash=task_hash, priority=priority, run_at=run_at, queue=queue, verbose_name=verbose_name, creator=creator, repeat=repeat or Task.NEVER, repeat_until=repeat_until, ) def get_task(self, task_name, args=None, kwargs=None): args = args or () kwargs = kwargs or {} task_params = json.dumps((args, kwargs), sort_keys=True) s = "%s%s" % (task_name, task_params) task_hash = sha1(s.encode('utf-8')).hexdigest() qs = self.get_queryset() return qs.filter(task_hash=task_hash) def drop_task(self, task_name, args=None, kwargs=None): return self.get_task(task_name, args, kwargs).delete() @python_2_unicode_compatible class Task(models.Model): # the "name" of the task/function to be run task_name = models.CharField(max_length=255, db_index=True) # the json encoded parameters to pass to the task task_params = models.TextField() # a sha1 hash of the name and params, to lookup already scheduled tasks task_hash = models.CharField(max_length=40, db_index=True) verbose_name = models.CharField(max_length=255, null=True, blank=True) # what priority the task has priority = models.IntegerField(default=0, db_index=True) # when the task should be run run_at = models.DateTimeField(db_index=True) # Repeat choices are encoded as number of seconds # The repeat implementation is based on this encoding HOURLY = 3600 DAILY = 24 * HOURLY WEEKLY = 7 * DAILY EVERY_2_WEEKS = 2 * WEEKLY EVERY_4_WEEKS = 4 * WEEKLY NEVER = 0 REPEAT_CHOICES = ( (HOURLY, 'hourly'), (DAILY, 'daily'), (WEEKLY, 'weekly'), (EVERY_2_WEEKS, 'every 2 weeks'), (EVERY_4_WEEKS, 'every 4 weeks'), (NEVER, 'never'), ) repeat = models.BigIntegerField(choices=REPEAT_CHOICES, default=NEVER) repeat_until = models.DateTimeField(null=True, blank=True) # the "name" of the queue this is to be run on queue = models.CharField(max_length=255, db_index=True, null=True, blank=True) # how many times the task has been tried attempts = models.IntegerField(default=0, db_index=True) # when the task last failed failed_at = models.DateTimeField(db_index=True, null=True, blank=True) # details of the error that occurred last_error = models.TextField(blank=True) # details of who's trying to run the task at the moment locked_by = models.CharField(max_length=64, db_index=True, null=True, blank=True) locked_at = models.DateTimeField(db_index=True, null=True, blank=True) creator_content_type = models.ForeignKey( ContentType, null=True, blank=True, related_name='background_task', on_delete=models.CASCADE ) creator_object_id = models.PositiveIntegerField(null=True, blank=True) creator = GenericForeignKey('creator_content_type', 'creator_object_id') objects = TaskManager() def locked_by_pid_running(self): """ Check if the locked_by process is still running. """ if self.locked_by: try: # won't kill the process. kill is a bad named system call os.kill(int(self.locked_by), 0) return True except: return False else: return None locked_by_pid_running.boolean = True def has_error(self): """ Check if the last_error field is empty. 
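Returns True when last_error contains a recorded error message.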
""" return bool(self.last_error) has_error.boolean = True def params(self): args, kwargs = json.loads(self.task_params) # need to coerce kwargs keys to str kwargs = dict((str(k), v) for k, v in kwargs.items()) return args, kwargs def lock(self, locked_by): now = timezone.now() unlocked = Task.objects.unlocked(now).filter(pk=self.pk) updated = unlocked.update(locked_by=locked_by, locked_at=now) if updated: return Task.objects.get(pk=self.pk) return None def _extract_error(self, type, err, tb): file = StringIO() traceback.print_exception(type, err, tb, None, file) return file.getvalue() def increment_attempts(self): self.attempts += 1 self.save() def has_reached_max_attempts(self): max_attempts = app_settings.BACKGROUND_TASK_MAX_ATTEMPTS return self.attempts >= max_attempts def is_repeating_task(self): return self.repeat > self.NEVER def reschedule(self, type, err, traceback): ''' Set a new time to run the task in future, or create a CompletedTask and delete the Task if it has reached the maximum of allowed attempts ''' self.last_error = self._extract_error(type, err, traceback) self.increment_attempts() if self.has_reached_max_attempts(): self.failed_at = timezone.now() logger.warning('Marking task %s as failed', self) completed = self.create_completed_task() task_failed.send(sender=self.__class__, task_id=self.id, completed_task=completed) self.delete() else: backoff = timedelta(seconds=(self.attempts ** 4) + 5) self.run_at = timezone.now() + backoff logger.warning('Rescheduling task %s for %s later at %s', self, backoff, self.run_at) task_rescheduled.send(sender=self.__class__, task=self) self.locked_by = None self.locked_at = None self.save() def create_completed_task(self): ''' Returns a new CompletedTask instance with the same values ''' from background_task.models_completed import CompletedTask completed_task = CompletedTask( task_name=self.task_name, task_params=self.task_params, task_hash=self.task_hash, priority=self.priority, run_at=timezone.now(), queue=self.queue, attempts=self.attempts, failed_at=self.failed_at, last_error=self.last_error, locked_by=self.locked_by, locked_at=self.locked_at, verbose_name=self.verbose_name, creator=self.creator, repeat=self.repeat, repeat_until=self.repeat_until, ) completed_task.save() return completed_task def create_repetition(self): """ :return: A new Task with an offset of self.repeat, or None if the self.repeat_until is reached """ if not self.is_repeating_task(): return None if self.repeat_until and self.repeat_until <= timezone.now(): # Repeat chain completed return None args, kwargs = self.params() new_run_at = self.run_at + timedelta(seconds=self.repeat) new_task = TaskManager().new_task( task_name=self.task_name, args=args, kwargs=kwargs, run_at=new_run_at, priority=self.priority, queue=self.queue, verbose_name=self.verbose_name, creator=self.creator, repeat=self.repeat, repeat_until=self.repeat_until, ) new_task.save() return new_task def save(self, *arg, **kw): # force NULL rather than empty string self.locked_by = self.locked_by or None return super(Task, self).save(*arg, **kw) def __str__(self): return u'{}'.format(self.verbose_name or self.task_name) class Meta: db_table = 'background_task' django-background-tasks-1.1.11/background_task/tests/0000777000000000000000000000000013142344323022541 5ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/tests/test_settings_async.py0000666000000000000000000000012713142344323027207 0ustar rootroot00000000000000# -*- coding: utf-8 -*- from .test_settings import * 
BACKGROUND_TASK_RUN_ASYNC = True django-background-tasks-1.1.11/background_task/tests/test_tasks.py0000666000000000000000000007253613142344323025314 0ustar rootroot00000000000000# -*- coding: utf-8 -*- import time from datetime import timedelta, datetime from mock import patch from django.contrib.auth.models import User from django.test import override_settings from django.test.testcases import TransactionTestCase from django.conf import settings from django.utils import timezone from background_task.tasks import tasks, TaskSchedule, TaskProxy from background_task.models import Task from background_task.models_completed import CompletedTask from background_task import background from background_task.settings import app_settings _recorded = [] def mocked_run_task(name, args=None, kwargs=None): """ We mock tasks.run_task to give other threads some time to update the database. Otherwise we run into a locked database. """ val = tasks.run_task(name, args, kwargs) if app_settings.BACKGROUND_TASK_RUN_ASYNC: time.sleep(1) return val def mocked_run_next_task(queue=None): """ We mock tasks.mocked_run_next_task to give other threads some time to update the database. Otherwise we run into a locked database. """ val = tasks.run_next_task(queue) if app_settings.BACKGROUND_TASK_RUN_ASYNC: time.sleep(1) return val run_task = mocked_run_task run_next_task = mocked_run_next_task def empty_task(): pass def record_task(*arg, **kw): _recorded.append((arg, kw)) class TestBackgroundDecorator(TransactionTestCase): def test_get_proxy(self): proxy = tasks.background()(empty_task) self.assertNotEqual(proxy, empty_task) self.assertTrue(isinstance(proxy, TaskProxy)) # and alternate form proxy = tasks.background(empty_task) self.assertNotEqual(proxy, empty_task) self.assertTrue(isinstance(proxy, TaskProxy)) def test_default_name(self): proxy = tasks.background()(empty_task) self.assertEqual(proxy.name, 'background_task.tests.test_tasks.empty_task') proxy = tasks.background()(record_task) self.assertEqual(proxy.name, 'background_task.tests.test_tasks.record_task') proxy = tasks.background(empty_task) # print proxy self.assertTrue(isinstance(proxy, TaskProxy)) self.assertEqual(proxy.name, 'background_task.tests.test_tasks.empty_task') def test_specified_name(self): proxy = tasks.background(name='mytask')(empty_task) self.assertEqual(proxy.name, 'mytask') def test_task_function(self): proxy = tasks.background()(empty_task) self.assertEqual(proxy.task_function, empty_task) proxy = tasks.background()(record_task) self.assertEqual(proxy.task_function, record_task) def test_default_schedule(self): proxy = tasks.background()(empty_task) self.assertEqual(TaskSchedule(), proxy.schedule) def test_schedule(self): proxy = tasks.background(schedule=10)(empty_task) self.assertEqual(TaskSchedule(run_at=10), proxy.schedule) def test_str(self): proxy = tasks.background()(empty_task) self.assertEqual( u'TaskProxy(background_task.tests.test_tasks.empty_task)', str(proxy) ) def test_shortcut(self): '''check shortcut to decorator works''' proxy = background()(empty_task) self.failIfEqual(proxy, empty_task) self.assertEqual(proxy.task_function, empty_task) def test_launch_sync(self): ''' Check launch original function in synchronous mode ''' @background def add(x, y): return x + y t = Task.objects.count() ct = CompletedTask.objects.count() answer = add.now(2, 3) self.assertEqual(answer, 5) self.assertEqual(Task.objects.count(), t, 'Task was created') self.assertEqual(CompletedTask.objects.count(), ct, 'Completed task was created') class 
TestTaskProxy(TransactionTestCase): def setUp(self): super(TestTaskProxy, self).setUp() self.proxy = tasks.background()(record_task) def test_run_task(self): run_task(self.proxy.name, [], {}) self.assertEqual(((), {}), _recorded.pop()) run_task(self.proxy.name, ['hi'], {}) self.assertEqual((('hi',), {}), _recorded.pop()) run_task(self.proxy.name, [], {'kw': 1}) self.assertEqual(((), {'kw': 1}), _recorded.pop()) class TestTaskSchedule(TransactionTestCase): def test_priority(self): self.assertEqual(0, TaskSchedule().priority) self.assertEqual(0, TaskSchedule(priority=0).priority) self.assertEqual(1, TaskSchedule(priority=1).priority) self.assertEqual(2, TaskSchedule(priority=2).priority) def _within_one_second(self, d1, d2): self.failUnless(isinstance(d1, datetime)) self.failUnless(isinstance(d2, datetime)) self.failUnless(abs(d1 - d2) <= timedelta(seconds=1)) def test_run_at(self): for schedule in [None, 0, timedelta(seconds=0)]: now = timezone.now() run_at = TaskSchedule(run_at=schedule).run_at self._within_one_second(run_at, now) now = timezone.now() run_at = TaskSchedule(run_at=now).run_at self._within_one_second(run_at, now) fixed_dt = timezone.now() + timedelta(seconds=60) run_at = TaskSchedule(run_at=fixed_dt).run_at self._within_one_second(run_at, fixed_dt) run_at = TaskSchedule(run_at=90).run_at self._within_one_second(run_at, timezone.now() + timedelta(seconds=90)) run_at = TaskSchedule(run_at=timedelta(seconds=35)).run_at self._within_one_second(run_at, timezone.now() + timedelta(seconds=35)) def test_create(self): fixed_dt = timezone.now() + timedelta(seconds=10) schedule = TaskSchedule.create({'run_at': fixed_dt}) self.assertEqual(schedule.run_at, fixed_dt) self.assertEqual(0, schedule.priority) self.assertEqual(TaskSchedule.SCHEDULE, schedule.action) schedule = {'run_at': fixed_dt, 'priority': 2, 'action': TaskSchedule.RESCHEDULE_EXISTING} schedule = TaskSchedule.create(schedule) self.assertEqual(schedule.run_at, fixed_dt) self.assertEqual(2, schedule.priority) self.assertEqual(TaskSchedule.RESCHEDULE_EXISTING, schedule.action) schedule = TaskSchedule.create(0) self._within_one_second(schedule.run_at, timezone.now()) schedule = TaskSchedule.create(10) self._within_one_second(schedule.run_at, timezone.now() + timedelta(seconds=10)) schedule = TaskSchedule.create(TaskSchedule(run_at=fixed_dt)) self.assertEqual(schedule.run_at, fixed_dt) self.assertEqual(0, schedule.priority) self.assertEqual(TaskSchedule.SCHEDULE, schedule.action) def test_merge(self): default = TaskSchedule(run_at=10, priority=2, action=TaskSchedule.RESCHEDULE_EXISTING) schedule = TaskSchedule.create(20).merge(default) self._within_one_second(timezone.now() + timedelta(seconds=20), schedule.run_at) self.assertEqual(2, schedule.priority) self.assertEqual(TaskSchedule.RESCHEDULE_EXISTING, schedule.action) schedule = TaskSchedule.create({'priority': 0}).merge(default) self._within_one_second(timezone.now() + timedelta(seconds=10), schedule.run_at) self.assertEqual(0, schedule.priority) self.assertEqual(TaskSchedule.RESCHEDULE_EXISTING, schedule.action) action = TaskSchedule.CHECK_EXISTING schedule = TaskSchedule.create({'action': action}).merge(default) self._within_one_second(timezone.now() + timedelta(seconds=10), schedule.run_at) self.assertEqual(2, schedule.priority) self.assertEqual(action, schedule.action) def test_repr(self): self.assertEqual('TaskSchedule(run_at=10, priority=0)', repr(TaskSchedule(run_at=10, priority=0))) class TestSchedulingTasks(TransactionTestCase): def 
test_background_gets_scheduled(self): self.result = None @tasks.background(name='test_background_gets_scheduled') def set_result(result): self.result = result # calling set_result should now actually create a record in the db set_result(1) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) task = all_tasks[0] self.assertEqual('test_background_gets_scheduled', task.task_name) self.assertEqual('[[1], {}]', task.task_params) def test_reschedule_existing(self): reschedule_existing = TaskSchedule.RESCHEDULE_EXISTING @tasks.background(name='test_reschedule_existing', schedule=TaskSchedule(action=reschedule_existing)) def reschedule_fn(): pass # this should only end up with one task # and it should be scheduled for the later time reschedule_fn() reschedule_fn(schedule=90) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) task = all_tasks[0] self.assertEqual('test_reschedule_existing', task.task_name) # check task is scheduled for later on now = timezone.now() self.failUnless(now + timedelta(seconds=89) < task.run_at) self.failUnless(now + timedelta(seconds=91) > task.run_at) def test_check_existing(self): check_existing = TaskSchedule.CHECK_EXISTING @tasks.background(name='test_check_existing', schedule=TaskSchedule(action=check_existing)) def check_fn(): pass # this should only end up with the first call # scheduled check_fn() check_fn(schedule=90) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) task = all_tasks[0] self.assertEqual('test_check_existing', task.task_name) # check new task is scheduled for the earlier time now = timezone.now() self.failUnless(now - timedelta(seconds=1) < task.run_at) self.failUnless(now + timedelta(seconds=1) > task.run_at) class TestTaskRunner(TransactionTestCase): def setUp(self): super(TestTaskRunner, self).setUp() self.runner = tasks._runner def test_get_task_to_run_no_tasks(self): self.failIf(self.runner.get_task_to_run(tasks)) def test_get_task_to_run(self): task = Task.objects.new_task('mytask', (1), {}) task.save() self.failUnless(task.locked_by is None) self.failUnless(task.locked_at is None) locked_task = self.runner.get_task_to_run(tasks) self.failIf(locked_task is None) self.failIf(locked_task.locked_by is None) self.assertEqual(self.runner.worker_name, locked_task.locked_by) self.failIf(locked_task.locked_at is None) self.assertEqual('mytask', locked_task.task_name) class TestTaskModel(TransactionTestCase): def test_lock_uncontested(self): task = Task.objects.new_task('mytask') task.save() self.failUnless(task.locked_by is None) self.failUnless(task.locked_at is None) locked_task = task.lock('mylock') self.assertEqual('mylock', locked_task.locked_by) self.failIf(locked_task.locked_at is None) self.assertEqual(task.pk, locked_task.pk) def test_lock_contested(self): # locking should actually look at db, not object # in memory task = Task.objects.new_task('mytask') task.save() self.failIf(task.lock('mylock') is None) self.failUnless(task.lock('otherlock') is None) def test_lock_expired(self): task = Task.objects.new_task('mytask') task.save() locked_task = task.lock('mylock') # force expire the lock expire_by = timedelta(seconds=(app_settings.BACKGROUND_TASK_MAX_RUN_TIME + 2)) locked_task.locked_at = locked_task.locked_at - expire_by locked_task.save() # now try to get the lock again self.failIf(task.lock('otherlock') is None) def test_str(self): task = Task.objects.new_task('mytask') self.assertEqual(u'mytask', str(task)) task = Task.objects.new_task('mytask', verbose_name="My Task") 
self.assertEqual(u'My Task', str(task)) def test_creator(self): user = User.objects.create_user(username='bob', email='bob@example.com', password='12345') task = Task.objects.new_task('mytask', creator=user) task.save() self.assertEqual(task.creator, user) def test_repeat(self): repeat_until = timezone.now() + timedelta(days=1) task = Task.objects.new_task('mytask', repeat=Task.HOURLY, repeat_until=repeat_until) task.save() self.assertEqual(task.repeat, Task.HOURLY) self.assertEqual(task.repeat_until, repeat_until) def test_create_completed_task(self): task = Task.objects.new_task( task_name='mytask', args=[1], kwargs={'q': 's'}, priority=1, queue='myqueue', verbose_name='My Task', creator=User.objects.create_user(username='bob', email='bob@example.com', password='12345'), ) task.save() completed_task = task.create_completed_task() self.assertEqual(completed_task.task_name, task.task_name) self.assertEqual(completed_task.task_params, task.task_params) self.assertEqual(completed_task.priority, task.priority) self.assertEqual(completed_task.queue, task.queue) self.assertEqual(completed_task.verbose_name, task.verbose_name) self.assertEqual(completed_task.creator, task.creator) self.assertEqual(completed_task.repeat, task.repeat) self.assertEqual(completed_task.repeat_until, task.repeat_until) class TestTasks(TransactionTestCase): def setUp(self): super(TestTasks, self).setUp() @tasks.background(name='set_fields') def set_fields(**fields): for key, value in fields.items(): setattr(self, key, value) @tasks.background(name='throws_error') def throws_error(): raise RuntimeError("an error") self.set_fields = set_fields self.throws_error = throws_error def test_run_next_task_nothing_scheduled(self): self.failIf(run_next_task()) def test_run_next_task_one_task_scheduled(self): self.set_fields(worked=True) self.failIf(hasattr(self, 'worked')) self.failUnless(run_next_task()) self.failUnless(hasattr(self, 'worked')) self.failUnless(self.worked) def test_run_next_task_several_tasks_scheduled(self): self.set_fields(one='1') self.set_fields(two='2') self.set_fields(three='3') for i in range(3): self.failUnless(run_next_task()) self.failIf(run_next_task()) # everything should have been run for field, value in [('one', '1'), ('two', '2'), ('three', '3')]: self.failUnless(hasattr(self, field)) self.assertEqual(value, getattr(self, field)) def test_run_next_task_error_handling(self): self.throws_error() all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) original_task = all_tasks[0] # should run, but trigger error self.failUnless(run_next_task()) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) failed_task = all_tasks[0] # should have an error recorded self.failIfEqual('', failed_task.last_error) self.failUnless(failed_task.failed_at is None) self.assertEqual(1, failed_task.attempts) # should have been rescheduled for the future # and no longer locked self.failUnless(failed_task.run_at > original_task.run_at) self.failUnless(failed_task.locked_by is None) self.failUnless(failed_task.locked_at is None) def test_run_next_task_does_not_run_locked(self): self.set_fields(locked=True) self.failIf(hasattr(self, 'locked')) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) original_task = all_tasks[0] original_task.lock('lockname') self.failIf(run_next_task()) self.failIf(hasattr(self, 'locked')) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) def test_run_next_task_unlocks_after_MAX_RUN_TIME(self): self.set_fields(lock_overridden=True) 
all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) original_task = all_tasks[0] locked_task = original_task.lock('lockname') self.failIf(run_next_task()) self.failIf(hasattr(self, 'lock_overridden')) # put lot time into past expire_by = timedelta(seconds=(app_settings.BACKGROUND_TASK_MAX_RUN_TIME + 2)) locked_task.locked_at = locked_task.locked_at - expire_by locked_task.save() # so now we should be able to override the lock # and run the task self.failUnless(run_next_task()) self.assertEqual(0, Task.objects.count()) self.failUnless(hasattr(self, 'lock_overridden')) self.failUnless(self.lock_overridden) def test_default_schedule_used_for_run_at(self): @tasks.background(name='default_schedule_used_for_run_at', schedule=60) def default_schedule_used_for_time(): pass now = timezone.now() default_schedule_used_for_time() all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) task = all_tasks[0] self.failUnless(now < task.run_at) self.failUnless((task.run_at - now) <= timedelta(seconds=61)) self.failUnless((task.run_at - now) >= timedelta(seconds=59)) def test_default_schedule_used_for_priority(self): @tasks.background(name='default_schedule_used_for_priority', schedule={'priority': 2}) def default_schedule_used_for_priority(): pass default_schedule_used_for_priority() all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) task = all_tasks[0] self.assertEqual(2, task.priority) def test_non_default_schedule_used(self): default_run_at = timezone.now() + timedelta(seconds=90) @tasks.background(name='non_default_schedule_used', schedule={'run_at': default_run_at, 'priority': 2}) def default_schedule_used_for_priority(): pass run_at = timezone.now().replace(microsecond=0) + timedelta(seconds=60) default_schedule_used_for_priority(schedule=run_at) all_tasks = Task.objects.all() self.assertEqual(1, all_tasks.count()) task = all_tasks[0] self.assertEqual(run_at, task.run_at) def test_failed_at_set_after_MAX_ATTEMPTS(self): @tasks.background(name='test_failed_at_set_after_MAX_ATTEMPTS') def failed_at_set_after_MAX_ATTEMPTS(): raise RuntimeError('failed') failed_at_set_after_MAX_ATTEMPTS() available = Task.objects.find_available() self.assertEqual(1, available.count()) task = available[0] self.failUnless(task.failed_at is None) task.attempts = app_settings.BACKGROUND_TASK_MAX_ATTEMPTS task.save() # task should be scheduled to run now # but will be marked as failed straight away self.failUnless(run_next_task()) available = Task.objects.find_available() self.assertEqual(0, available.count()) all_tasks = Task.objects.all() self.assertEqual(0, all_tasks.count()) self.assertEqual(1, CompletedTask.objects.count()) completed_task = CompletedTask.objects.all()[0] self.failIf(completed_task.failed_at is None) def test_run_task_return_value(self): return_value = self.set_fields(test='test') self.assertEqual(Task.objects.count(), 1) task = Task.objects.first() self.assertEqual(return_value, task) self.assertEqual(return_value.pk, task.pk) def test_verbose_name_param(self): verbose_name = 'My Task' task = self.set_fields(test='test1', verbose_name=verbose_name) self.assertEqual(task.verbose_name, verbose_name) def test_creator_param(self): user = User.objects.create_user(username='bob', email='bob@example.com', password='12345') task = self.set_fields(test='test2', creator=user) self.assertEqual(task.creator, user) class MaxAttemptsTestCase(TransactionTestCase): def setUp(self): @tasks.background(name='failing task') def failing_task(): raise Exception("error") # 
return 0 / 0 self.failing_task = failing_task self.task1 = self.failing_task() self.task2 = self.failing_task() self.task1_id = self.task1.id self.task2_id = self.task2.id @override_settings(MAX_ATTEMPTS=1) def test_max_attempts_one(self): self.assertEqual(settings.MAX_ATTEMPTS, 1) self.assertEqual(Task.objects.count(), 2) run_next_task() self.assertEqual(Task.objects.count(), 1) self.assertEqual(Task.objects.all()[0].id, self.task2_id) self.assertEqual(CompletedTask.objects.count(), 1) completed_task = CompletedTask.objects.all()[0] self.assertEqual(completed_task.attempts, 1) self.assertEqual(completed_task.task_name, self.task1.task_name) self.assertEqual(completed_task.task_params, self.task1.task_params) self.assertIsNotNone(completed_task.last_error) self.assertIsNotNone(completed_task.failed_at) run_next_task() self.assertEqual(Task.objects.count(), 0) self.assertEqual(CompletedTask.objects.count(), 2) @override_settings(MAX_ATTEMPTS=2) def test_max_attempts_two(self): self.assertEqual(settings.MAX_ATTEMPTS, 2) run_next_task() self.assertEqual(Task.objects.count(), 2) self.assertEqual(CompletedTask.objects.count(), 0) class ArgumentsWithDictTestCase(TransactionTestCase): def setUp(self): @tasks.background(name='failing task') def task(d): pass self.task = task def test_task_with_dictionary_in_args(self): self.assertEqual(Task.objects.count(), 0) d = {22222: 2, 11111: 1} self.task(d) self.assertEqual(Task.objects.count(), 1) run_next_task() self.assertEqual(Task.objects.count(), 0) completed_named_queue_tasks = [] @background(queue='named_queue') def named_queue_task(message): completed_named_queue_tasks.append(message) class NamedQueueTestCase(TransactionTestCase): def test_process_queue(self): named_queue_task('test1') run_next_task(queue='named_queue') self.assertIn('test1', completed_named_queue_tasks, msg='Task should be processed') def test_process_all_tasks(self): named_queue_task('test2') run_next_task() self.assertIn('test2', completed_named_queue_tasks, msg='Task should be processed') def test_process_other_queue(self): named_queue_task('test3') run_next_task(queue='other_named_queue') self.assertNotIn('test3', completed_named_queue_tasks, msg='Task should be ignored') run_next_task() class RepetitionTestCase(TransactionTestCase): def setUp(self): @tasks.background() def my_task(*args, **kwargs): pass self.my_task = my_task def test_repeat(self): repeat_until = timezone.now() + timedelta(weeks=1) old_task = self.my_task( 'test-repeat', foo='bar', repeat=Task.HOURLY, repeat_until=repeat_until, verbose_name="Test repeat", ) self.assertEqual(old_task.repeat, Task.HOURLY) self.assertEqual(old_task.repeat_until, repeat_until) tasks.run_next_task() time.sleep(0.5) self.assertEqual(Task.objects.filter(repeat=Task.HOURLY).count(), 1) new_task = Task.objects.get(repeat=Task.HOURLY) self.assertNotEqual(new_task.id, old_task.id) self.assertEqual(new_task.task_name, old_task.task_name) self.assertEqual(new_task.params(), old_task.params()) self.assertEqual(new_task.task_hash, old_task.task_hash) self.assertEqual(new_task.verbose_name, old_task.verbose_name) self.assertEqual((new_task.run_at - old_task.run_at), timedelta(hours=1)) self.assertEqual(new_task.repeat_until, old_task.repeat_until) class QuerySetManagerTestCase(TransactionTestCase): def setUp(self): @tasks.background() def succeeding_task(): return 0/1 @tasks.background() def failing_task(): return 0/0 self.user1 = User.objects.create_user(username='bob', email='bob@example.com', password='12345') self.user2 = 
User.objects.create_user(username='bob2', email='bob@example.com', password='12345') self.task_all = succeeding_task() self.task_user = succeeding_task(creator=self.user1) self.failing_task_all = failing_task() self.failing_task_user = failing_task(creator=self.user1) @override_settings(MAX_ATTEMPTS=1) def test_task_manager(self): self.assertEqual(len(Task.objects.all()), 4) self.assertEqual(len(Task.objects.created_by(self.user1)), 2) self.assertEqual(len(Task.objects.created_by(self.user2)), 0) for i in range(4): run_next_task() self.assertEqual(len(Task.objects.all()), 0) self.assertEqual(len(Task.objects.created_by(self.user1)), 0) self.assertEqual(len(Task.objects.created_by(self.user2)), 0) @override_settings(MAX_ATTEMPTS=1) def test_completed_task_manager(self): self.assertEqual(len(CompletedTask.objects.created_by(self.user1)), 0) self.assertEqual(len(CompletedTask.objects.created_by(self.user2)), 0) self.assertEqual(len(CompletedTask.objects.failed()), 0) self.assertEqual(len(CompletedTask.objects.created_by(self.user1).failed()), 0) self.assertEqual(len(CompletedTask.objects.failed(within=timedelta(hours=1))), 0) self.assertEqual(len(CompletedTask.objects.succeeded()), 0) self.assertEqual(len(CompletedTask.objects.created_by(self.user1).succeeded()), 0) self.assertEqual(len(CompletedTask.objects.succeeded(within=timedelta(hours=1))), 0) for i in range(4): run_next_task() self.assertEqual(len(CompletedTask.objects.created_by(self.user1)), 2) self.assertEqual(len(CompletedTask.objects.created_by(self.user2)), 0) self.assertEqual(len(CompletedTask.objects.failed()), 2) self.assertEqual(len(CompletedTask.objects.created_by(self.user1).failed()), 1) self.assertEqual(len(CompletedTask.objects.failed(within=timedelta(hours=1))), 2) self.assertEqual(len(CompletedTask.objects.succeeded()), 2) self.assertEqual(len(CompletedTask.objects.created_by(self.user1).succeeded()), 1) self.assertEqual(len(CompletedTask.objects.succeeded(within=timedelta(hours=1))), 2) class PriorityTestCase(TransactionTestCase): def setUp(self): @tasks.background() def mytask(): pass run_at = timezone.now() - timedelta(minutes=1) self.high_priority_task = mytask(priority=99, schedule=run_at) self.low_priority_task = mytask(priority=-1, schedule=run_at) def test_priority(self): self.assertEqual(self.high_priority_task.priority, 99) self.assertEqual(self.low_priority_task.priority, -1) available = Task.objects.find_available() self.assertEqual(available.count(), 2) self.assertEqual(available.first(), self.high_priority_task) self.assertEqual(available.last(), self.low_priority_task) self.assertFalse(CompletedTask.objects.filter(priority=self.high_priority_task.priority).exists()) self.assertFalse(CompletedTask.objects.filter(priority=self.low_priority_task.priority).exists()) run_next_task() self.assertTrue(CompletedTask.objects.filter(priority=self.high_priority_task.priority).exists()) self.assertFalse(CompletedTask.objects.filter(priority=self.low_priority_task.priority).exists()) run_next_task() self.assertTrue(CompletedTask.objects.filter(priority=self.high_priority_task.priority).exists()) self.assertTrue(CompletedTask.objects.filter(priority=self.low_priority_task.priority).exists()) class LoggingTestCase(TransactionTestCase): def setUp(self): @tasks.background() def succeeding_task(): return 0/1 @tasks.background() def failing_task(): return 0/0 self.succeeding_task = succeeding_task self.failing_task = failing_task @patch('background_task.tasks.logger') def test_success_logging(self, mock_logger): 
self.succeeding_task() run_next_task() self.assertFalse(mock_logger.warning.called) self.assertFalse(mock_logger.error.called) self.assertFalse(mock_logger.critical.called) @patch('background_task.tasks.logger') def test_error_logging(self, mock_logger): self.failing_task() run_next_task() self.assertFalse(mock_logger.warning.called) self.assertTrue(mock_logger.error.called) self.assertFalse(mock_logger.critical.called) django-background-tasks-1.1.11/background_task/tests/test_settings.py0000666000000000000000000000175613142344323026023 0ustar rootroot00000000000000# -*- coding: utf-8 -*- DEBUG = True TEMPLATE_DEBUG = DEBUG DATABASES = { 'default': { 'ENGINE': 'django.db.backends.sqlite3', 'NAME': ':memory:', 'TEST': # This will force django to create a real sqlite database on # the disk, instead of creating it in memory. # We need this to test the async behavior. { 'NAME': 'test_db', }, 'USER': '', 'PASSWORD': '', 'HOST': '', 'PORT': '', } } INSTALLED_APPS = [ 'django.contrib.contenttypes', 'django.contrib.auth', 'background_task', ] SECRET_KEY = 'foo' USE_TZ = True BACKGROUND_TASK_RUN_ASYNC = False LOGGING = { 'version': 1, 'disable_existing_loggers': False, 'handlers': { 'console': { 'class': 'logging.StreamHandler', }, }, 'loggers': { 'background_task': { 'handlers': ['console'], 'level': 'INFO', }, }, } django-background-tasks-1.1.11/background_task/tests/__init__.py0000666000000000000000000000000013142344323024640 0ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/migrations/0000777000000000000000000000000013142344323023553 5ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/migrations/0001_initial.py0000666000000000000000000000756713142344323026235 0ustar rootroot00000000000000# -*- coding: utf-8 -*- # Generated by Django 1.10.6 on 2017-04-03 21:42 from __future__ import unicode_literals from django.db import migrations, models import django.db.models.deletion class Migration(migrations.Migration): initial = True dependencies = [ ('contenttypes', '0002_remove_content_type_name'), ] operations = [ migrations.CreateModel( name='CompletedTask', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('task_name', models.CharField(db_index=True, max_length=255)), ('task_params', models.TextField()), ('task_hash', models.CharField(db_index=True, max_length=40)), ('verbose_name', models.CharField(blank=True, max_length=255, null=True)), ('priority', models.IntegerField(db_index=True, default=0)), ('run_at', models.DateTimeField(db_index=True)), ('repeat', models.BigIntegerField(choices=[(3600, 'hourly'), (86400, 'daily'), (604800, 'weekly'), (1209600, 'every 2 weeks'), (2419200, 'every 4 weeks'), (0, 'never')], default=0)), ('repeat_until', models.DateTimeField(blank=True, null=True)), ('queue', models.CharField(blank=True, db_index=True, max_length=255, null=True)), ('attempts', models.IntegerField(db_index=True, default=0)), ('failed_at', models.DateTimeField(blank=True, db_index=True, null=True)), ('last_error', models.TextField(blank=True)), ('locked_by', models.CharField(blank=True, db_index=True, max_length=64, null=True)), ('locked_at', models.DateTimeField(blank=True, db_index=True, null=True)), ('creator_object_id', models.PositiveIntegerField(blank=True, null=True)), ('creator_content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='completed_background_task', to='contenttypes.ContentType')), ], ), migrations.CreateModel( 
name='Task', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('task_name', models.CharField(db_index=True, max_length=255)), ('task_params', models.TextField()), ('task_hash', models.CharField(db_index=True, max_length=40)), ('verbose_name', models.CharField(blank=True, max_length=255, null=True)), ('priority', models.IntegerField(db_index=True, default=0)), ('run_at', models.DateTimeField(db_index=True)), ('repeat', models.BigIntegerField(choices=[(3600, 'hourly'), (86400, 'daily'), (604800, 'weekly'), (1209600, 'every 2 weeks'), (2419200, 'every 4 weeks'), (0, 'never')], default=0)), ('repeat_until', models.DateTimeField(blank=True, null=True)), ('queue', models.CharField(blank=True, db_index=True, max_length=255, null=True)), ('attempts', models.IntegerField(db_index=True, default=0)), ('failed_at', models.DateTimeField(blank=True, db_index=True, null=True)), ('last_error', models.TextField(blank=True)), ('locked_by', models.CharField(blank=True, db_index=True, max_length=64, null=True)), ('locked_at', models.DateTimeField(blank=True, db_index=True, null=True)), ('creator_object_id', models.PositiveIntegerField(blank=True, null=True)), ('creator_content_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='background_task', to='contenttypes.ContentType')), ], options={ 'db_table': 'background_task', }, ), ] django-background-tasks-1.1.11/background_task/migrations/__init__.py0000666000000000000000000000000013142344323025652 0ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/__init__.py0000666000000000000000000000034613142344323023513 0ustar rootroot00000000000000# -*- coding: utf-8 -*- __version__ = '1.1.11' default_app_config = 'background_task.apps.BackgroundTasksAppConfig' def background(*arg, **kw): from background_task.tasks import tasks return tasks.background(*arg, **kw) django-background-tasks-1.1.11/background_task/admin.py0000666000000000000000000000124213142344323023040 0ustar rootroot00000000000000# -*- coding: utf-8 -*- from django.contrib import admin from background_task.models_completed import CompletedTask from background_task.models import Task class TaskAdmin(admin.ModelAdmin): display_filter = ['task_name'] list_display = ['task_name', 'task_params', 'run_at', 'priority', 'attempts', 'has_error', 'locked_by', 'locked_by_pid_running', ] class CompletedTaskAdmin(admin.ModelAdmin): display_filter = ['task_name'] list_display = ['task_name', 'task_params', 'run_at', 'priority', 'attempts', 'has_error', 'locked_by', 'locked_by_pid_running', ] admin.site.register(Task, TaskAdmin) admin.site.register(CompletedTask, CompletedTaskAdmin) django-background-tasks-1.1.11/background_task/settings.py0000666000000000000000000000360113142344323023611 0ustar rootroot00000000000000# -*- coding: utf-8 -*- import multiprocessing from django.conf import settings try: cpu_count = multiprocessing.cpu_count() except Exception: cpu_count = 1 class AppSettings(object): """ """ @property def MAX_ATTEMPTS(self): """Control how many times a task will be attempted.""" return getattr(settings, 'MAX_ATTEMPTS', 25) @property def BACKGROUND_TASK_MAX_ATTEMPTS(self): """Control how many times a task will be attempted.""" return self.MAX_ATTEMPTS @property def MAX_RUN_TIME(self): """Maximum possible task run time, after which tasks will be unlocked and tried again.""" return getattr(settings, 'MAX_RUN_TIME', 3600) @property def BACKGROUND_TASK_MAX_RUN_TIME(self): 
"""Maximum possible task run time, after which tasks will be unlocked and tried again.""" return self.MAX_RUN_TIME @property def BACKGROUND_TASK_RUN_ASYNC(self): """Control if tasks will run asynchronous in a ThreadPool.""" return getattr(settings, 'BACKGROUND_TASK_RUN_ASYNC', False) @property def BACKGROUND_TASK_ASYNC_THREADS(self): """Specify number of concurrent threads.""" return getattr(settings, 'BACKGROUND_TASK_ASYNC_THREADS', cpu_count) @property def BACKGROUND_TASK_PRIORITY_ORDERING(self): """ Control the ordering of tasks in the queue. Choose either `DESC` or `ASC`. https://en.m.wikipedia.org/wiki/Nice_(Unix) A niceness of −20 is the highest priority and 19 is the lowest priority. The default niceness for processes is inherited from its parent process and is usually 0. """ order = getattr(settings, 'BACKGROUND_TASK_PRIORITY_ORDERING', 'DESC') if order == 'ASC': prefix = '' else: prefix = '-' return prefix app_settings = AppSettings() django-background-tasks-1.1.11/background_task/management/0000777000000000000000000000000013142344323023513 5ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/management/commands/0000777000000000000000000000000013142344323025314 5ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/management/commands/process_tasks.py0000666000000000000000000000612613142344323030556 0ustar rootroot00000000000000# -*- coding: utf-8 -*- import logging import random import sys import time from django import VERSION from django.core.management.base import BaseCommand from background_task.tasks import tasks, autodiscover from background_task.utils import SignalManager from compat import close_connection logger = logging.getLogger(__name__) def _configure_log_std(): class StdOutWrapper(object): def write(self, s): logger.info(s) class StdErrWrapper(object): def write(self, s): logger.error(s) sys.stdout = StdOutWrapper() sys.stderr = StdErrWrapper() class Command(BaseCommand): help = 'Run tasks that are scheduled to run on the queue' # Command options are specified in an abstract way to enable Django < 1.8 compatibility OPTIONS = ( (('--duration', ), { 'action': 'store', 'dest': 'duration', 'type': int, 'default': 0, 'help': 'Run task for this many seconds (0 or less to run forever) - default is 0', }), (('--sleep', ), { 'action': 'store', 'dest': 'sleep', 'type': float, 'default': 5.0, 'help': 'Sleep for this many seconds before checking for new tasks (if none were found) - default is 5', }), (('--queue', ), { 'action': 'store', 'dest': 'queue', 'help': 'Only process tasks on this named queue', }), (('--log-std', ), { 'action': 'store_true', 'dest': 'log_std', 'help': 'Redirect stdout and stderr to the logging system', }), ) if VERSION < (1, 8): from optparse import make_option option_list = BaseCommand.option_list + tuple([make_option(*args, **kwargs) for args, kwargs in OPTIONS]) # Used in Django >= 1.8 def add_arguments(self, parser): for (args, kwargs) in self.OPTIONS: parser.add_argument(*args, **kwargs) def __init__(self, *args, **kwargs): super(Command, self).__init__(*args, **kwargs) self._tasks = tasks def handle(self, *args, **options): duration = options.pop('duration', 0) sleep = options.pop('sleep', 5.0) queue = options.pop('queue', None) log_std = options.pop('log_std', False) sig_manager = SignalManager() if log_std: _configure_log_std() autodiscover() start_time = time.time() while (duration <= 0) or (time.time() - start_time) <= duration: if sig_manager.kill_now: # shutting down gracefully break if not 
self._tasks.run_next_task(queue): # there were no tasks in the queue, let's recover. close_connection() logger.debug('waiting for tasks') time.sleep(sleep) else: # there were some tasks to process, let's check if there is more work to do after a little break. time.sleep(random.uniform(sig_manager.time_to_wait[0], sig_manager.time_to_wait[1])) django-background-tasks-1.1.11/background_task/management/commands/__init__.py0000666000000000000000000000000013142344323027413 0ustar rootroot00000000000000django-background-tasks-1.1.11/background_task/management/__init__.py0000666000000000000000000000000013142344323025612 0ustar rootroot00000000000000django-background-tasks-1.1.11/classifiers0000666000000000000000000000076313142344323020476 0ustar rootroot00000000000000Development Status :: 5 - Production/Stable Environment :: Web Environment Intended Audience :: Developers License :: OSI Approved :: BSD License Operating System :: OS Independent Topic :: Software Development :: Libraries :: Python Modules Framework :: Django Framework :: Django :: 1.4 Framework :: Django :: 1.5 Framework :: Django :: 1.6 Framework :: Django :: 1.7 Framework :: Django :: 1.8 Programming Language :: Python Programming Language :: Python :: 2 Programming Language :: Python :: 3django-background-tasks-1.1.11/MANIFEST.in0000666000000000000000000000017513142344323017777 0ustar rootroot00000000000000include README.md include LICENSE include AUTHORS.txt include requirements.txt include classifiers recursive-include tests * django-background-tasks-1.1.11/django_background_tasks.egg-info/0000777000000000000000000000000013142344323024576 5ustar rootroot00000000000000django-background-tasks-1.1.11/django_background_tasks.egg-info/zip-safe0000666000000000000000000000000113142344323026226 0ustar rootroot00000000000000 django-background-tasks-1.1.11/django_background_tasks.egg-info/SOURCES.txt0000666000000000000000000000175513142344323026472 0ustar rootroot00000000000000AUTHORS.txt LICENSE MANIFEST.in README.md classifiers requirements.txt setup.py background_task/__init__.py background_task/admin.py background_task/apps.py background_task/exceptions.py background_task/models.py background_task/models_completed.py background_task/settings.py background_task/signals.py background_task/tasks.py background_task/utils.py background_task/management/__init__.py background_task/management/commands/__init__.py background_task/management/commands/process_tasks.py background_task/migrations/0001_initial.py background_task/migrations/__init__.py background_task/tests/__init__.py background_task/tests/test_settings.py background_task/tests/test_settings_async.py background_task/tests/test_tasks.py django_background_tasks.egg-info/PKG-INFO django_background_tasks.egg-info/SOURCES.txt django_background_tasks.egg-info/dependency_links.txt django_background_tasks.egg-info/requires.txt django_background_tasks.egg-info/top_level.txt django_background_tasks.egg-info/zip-safedjango-background-tasks-1.1.11/django_background_tasks.egg-info/top_level.txt0000666000000000000000000000002013142344323027320 0ustar rootroot00000000000000background_task django-background-tasks-1.1.11/django_background_tasks.egg-info/PKG-INFO0000666000000000000000000000511613142344323025676 0ustar rootroot00000000000000Metadata-Version: 1.1 Name: django-background-tasks Version: 1.1.11 Summary: Database backed asynchronous task queue Home-page: http://github.com/arteria/django-background-tasks Author: arteria GmbH, John Montgomery Author-email: admin@arteria.ch License: BSD 
Description: # Django Background Tasks [![Build Status](https://travis-ci.org/arteria/django-background-tasks.svg?branch=master)](https://travis-ci.org/arteria/django-background-tasks) [![Coverage Status](https://coveralls.io/repos/arteria/django-background-tasks/badge.svg?branch=master&service=github)](https://coveralls.io/github/arteria/django-background-tasks?branch=master) [![Documentation Status](https://readthedocs.org/projects/django-background-tasks/badge/?version=latest)](http://django-background-tasks.readthedocs.io/en/latest/?badge=latest) [![PyPI](https://img.shields.io/pypi/v/django-background-tasks.svg)](https://pypi.python.org/pypi/django-background-tasks) Django Background Task is a database-backed work queue for Django, loosely based around [Ruby's DelayedJob](https://github.com/tobi/delayed_job) library. This project was adopted and adapted from [lilspikey](https://github.com/lilspikey/) django-background-task. To avoid conflicts on PyPI we renamed it to django-background-tasks (plural). For an easy upgrade from django-background-task to django-background-tasks, the internal module structure was left untouched. In Django Background Task, all tasks are implemented as functions (or any other callable). There are two parts to using background tasks: * creating the task functions and registering them with the scheduler * setting up a cron task (or long running process) to execute the tasks ## Docs See http://django-background-tasks.readthedocs.io/en/latest/. Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Web Environment Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Framework :: Django Classifier: Framework :: Django :: 1.4 Classifier: Framework :: Django :: 1.5 Classifier: Framework :: Django :: 1.6 Classifier: Framework :: Django :: 1.7 Classifier: Framework :: Django :: 1.8 Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 3 django-background-tasks-1.1.11/django_background_tasks.egg-info/requires.txt0000666000000000000000000000003213142344323027171 0ustar rootroot00000000000000django-compat>=1.0.13 six django-background-tasks-1.1.11/django_background_tasks.egg-info/dependency_links.txt0000666000000000000000000000000113142344323030644 0ustar rootroot00000000000000 django-background-tasks-1.1.11/setup.cfg0000666000000000000000000000007313142344323020057 0ustar rootroot00000000000000[egg_info] tag_build = tag_date = 0 tag_svn_revision = 0
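The package description above lists two steps for using background tasks: register task functions with the scheduler, and keep a worker (a cron task or long-running process) executing them. The sketch below illustrates both steps; it is not part of the packaged sources, the myapp/tasks.py module and task name are hypothetical, and it assumes the @background decorator (re-exported by background_task/__init__.py above) accepts the schedule keyword documented for this package.

# myapp/tasks.py -- illustrative only
from background_task import background


@background(schedule=60)  # queue the call; a worker runs it roughly 60 seconds later
def send_welcome_email(user_id):
    # Arguments are serialized onto a Task row in the 'background_task' table,
    # so keep them simple, JSON-serializable values such as this integer id.
    from django.contrib.auth import get_user_model
    user = get_user_model().objects.get(pk=user_id)
    user.email_user('Welcome', 'Thanks for signing up.')


# Calling the decorated function from ordinary view code only enqueues it:
#     send_welcome_email(user.id)
#
# A separate worker process (the process_tasks command shipped above) drains the queue:
#     python manage.py process_tasks --sleep 5 --duration 0
# With BACKGROUND_TASK_RUN_ASYNC = True the worker runs tasks in a thread pool sized
# by BACKGROUND_TASK_ASYNC_THREADS (defaulting to the CPU count, per settings.py above).

Because the queue is just a database table, no separate broker is required; when the worker finds no pending task it simply sleeps for the --sleep interval before polling again, as the process_tasks handle loop above shows.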