././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/0000755000175100001710000000000000000000000014076 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/CHANGELOG.rst0000644000175100001710000003103400000000000016120 0ustar00runnerdockerWhat’s new in django-cachalot? ============================== 2.4.3 ----- - Fix annotated Now being cached (#195) - Fix conditional annotated expressions not being cached (#196) - Simplify annotation handling by using the flatten method (#197) - Fix Django 3.2 default_app_config deprecation (#198) - (Internal) Pinned psycopg2 to <2.9 due to Django 2.2 incompatibility 2.4.2 ----- - Add convenience settings `CACHALOT_ONLY_CACHABLE_APPS` and `CACHALOT_UNCACHABLE_APPS` (#187) - Drop support for Django 3.0 (#189) - (Internal) Added Django main-branch CI on cron job - (Internal) Removed duplicate code (#190) 2.4.1 ----- - Fix Django requirement constraint to include 3.2.X not just 3.2 - (Internal) Deleted obsolete travis-matrix.py file 2.4.0 ----- - Add support for Django 3.2 (#181) - Remove enforced system check for Django version (#175) - Drop support for Django 2.0-2.1 and Python 3.5 (#181) - Add support for Pymemcache for Django 3.2+ (#181) - Reverts #157 with proper fix. (#181) - Add ``CACHALOT_ADDITIONAL_TABLES`` setting for unmanaged models (#183) 2.3.5 ----- - Fix cachalot_disabled (#174) 2.3.4 ----- - Fix bug with externally invalidated cache keys (#120) - Omit test files in coverage 2.3.3 ----- - Remove deprecated signal argument (#165) - Add Python 3.9 support - Use Discord instead since Slack doesn't save messages, @Andrew-Chen-Wang is not on there very much, and Discord has a phenomenal search functionality (with ES). 2.3.2 ----- - Cast memoryview objects to bytes to be able to pickle them (#163) 2.3.1 ----- - Added support for Django 3.1, including the new, native JSONField 2.3.0 ----- - Added context manager for temporarily disabling cachalot using `cachalot_disabled()` - Fix for certain Subquery cases. 2.2.2 ----- - Drop support for Django 1.11 and Python 2.7 - Added fix for subqueries from Django 2.2 2.2.0 ----- - Adds Django 2.2 and 3.0 support. - Dropped official support for Python 3.4 - It won't run properly with Travis CI tests on MySQL. - All Travis CI tests are fully functional. 2.1.0 ----- - Adds Django 2.1 support. 2.0.2 ----- - Adds support for ``.union``, ``.intersection`` & ``.difference`` that should have been introduced since 1.5.0 - Fixes error raised in some rare and undetermined cases, when the cache backend doesn’t yield data as expected 2.0.1 ----- - Allows specifying a schema name in ``Model._meta.db_table`` 2.0.0 ----- - Adds Django 2.0 support - Drops Django 1.10 support - Drops Django 1.8 support (1.9 support was dropped in 1.5.0) - Adds a check to make sure it is used with a supported Django version - Fixes a bug partially breaking django-cachalot when an error occurred during the end of a `transaction.atomic` block, typically when using deferred constraints 1.5.0 ----- - Adds Django 1.11 support - Adds Python 3.6 support - Drops Django 1.9 support (but 1.8 is still supported) - Drops Python 3.3 support - Adds ``CACHALOT_DATABASES`` to specify which databases have django-cachalot enabled (by default, only supported databases are enabled) - Stops advising users to dynamically override cachalot settings as it cannot be thread-safe due to Django’s internals - Invalidates tables after raw ``CREATE``, ``ALTER`` & ``DROP`` SQL queries - Allows specifying model lookups like ``auth.User`` in the API functions (previously, it could only be done in the Django template tag, not in the Jinja2 ``get_last_invalidation`` function nor in API functions) - Fixes the cache used by ``CachalotPanel`` if ``CACHALOT_CACHE`` is different from ``'default'`` - Uploads a wheel distribution of this package to PyPI starting now, in addition of the source release - Improves tests 1.4.1 ----- - Fixes a circular import occurring when CachalotPanel is used and django-debug-toolbar is before django-cachalot in ``INSTALLED_APPS`` - Stops checking compatibility for caches other than ``CACHALOT_CACHE`` 1.4.0 ----- - Fixes a bad design: ``QuerySet.select_for_update`` was cached, but it’s not correct since it does not lock data in the database once data was cached, leading to the database lock being useless in some cases - Stops automatically invalidating other caches than ``CACHALOT_CACHE`` for consistency, performance, and usefulness reasons - Fixes a minor issue: the ``post_invalidation`` signal was sent during transactions when calling the ``invalidate`` command - Creates `a gitter chat room `_ - Removes the Slack team. Slack does not allow public chat, this was therefore a bad idea 1.3.0 ----- - Adds Django 1.10 support - Drops Django 1.7 support - Drops Python 3.2 support - Adds a Jinja2 extension with a ``cache`` statement and the ``get_last_invalidation`` function - Adds a ``CACHALOT_TIMEOUT`` setting after dozens of private & public requests, but it’s not really useful - Fixes a ``RuntimeError`` occurring if a ``DatabaseCache`` was used in a project, even if not used by django-cachalot - Allows bytes raw queries (except on SQLite where it’s not supposed to work) - Creates `a Slack team `_ to discuss, easier than using Google Groups 1.2.1 ----- **Mandatory update if you’re using django-cachalot 1.2.0.** This version reverts the cache keys hashing change from 1.2.0, as it was leading to a non-shared cache when Python used a random seed for hashing, which is the case by default on Python 3.3, 3.4, & 3.5, and also on 2.7 & 3.2 if you set ``PYTHONHASHSEED=random``. 1.2.0 ----- **WARNING: This version is unsafe, it can lead to invalidation errors** - Adds Django 1.9 support - Simplifies and speeds up cache keys hashing - Documents how to use django-cachalot with a replica database - Adds ``DummyCache`` to ``VALID_CACHE_BACKENDS`` - Updates the comparison with django-cache-machine & django-cacheops by checking features and measuring performance instead of relying on their documentations and a 2-years-ago experience of them 1.1.0 ----- **Backwards incompatible changes:** - Adds Django 1.8 support and drops Django 1.6 & Python 2.6 support - Merges the 3 API functions ``invalidate_all``, ``invalidate_tables``, & ``invalidate_models`` into a single ``invalidate`` function while optimising it Other additions: - Adds a ``get_last_invalidation`` function to the API and the equivalent template tag - Adds a ``CACHALOT_ONLY_CACHABLE_TABLES`` setting in order to make a whitelist of the only table names django-cachalot can cache - Caches queries with IP addresses, floats, or decimals in parameters - Adds a Django check to ensure the project uses compatible cache and database backends - Adds a lot of tests, especially to test django.contrib.postgres - Adds a comparison with django-cache-machine and django-cacheops in the documentation Fixed: - Removes a useless extra invalidation during each write operation to the database, leading to a small speedup during data modification and tests - The ``post_invalidation`` signal was triggered during transactions and was not triggered when using the API or raw write queries: both issues are now fixed - Fixes a very unlikely invalidation issue occurring only when an error occurred in a transaction after a transaction of another database nested in the first transaction was committed, like this: .. code:: python from django.db import transaction assert list(YourModel.objects.using('another_db')) == [] try: with transaction.atomic(): with transaction.atomic('another_db'): obj = YourModel.objects.using('another_db').create(name='test') raise ZeroDivisionError except ZeroDivisionError: pass # Before django-cachalot 1.1.0, this assert was failing. assert list(YourModel.objects.using('another_db')) == [obj] 1.0.3 ----- - Fixes an invalidation issue that could rarely occur when querying on a ``BinaryField`` with PostgreSQL, or with some geographic queries (there was a small chance that a same query with different parameters could erroneously give the same result as the previous one) - Adds a ``CACHALOT_UNCACHABLE_TABLES`` setting - Fixes a Django 1.7 migrations invalidation issue in tests (that was leading to this error half of the time: ``RuntimeError: Error creating new content types. Please make sure contenttypes is migrated before trying to migrate apps individually.``) - Optimises tests when using django-cachalot by avoid several useless cache invalidations 1.0.2 ----- - Fixes an ``AttributeError`` occurring when excluding through a many-to-many relation on a child model (using multi-table inheritance) - Stops caching queries with random subqueries – for example ``User.objects.filter(pk__in=User.objects.order_by('?'))`` - Optimises automatic invalidation - Adds a note about clock synchronisation 1.0.1 ----- - Fixes an invalidation issue discovered by Helen Warren that was occurring when updating a ``ManyToManyField`` after executing using ``.exclude`` on that relation. For example, ``Permission.objects.all().delete()`` was not invalidating ``User.objects.exclude(user_permissions=None)`` - Fixes a ``UnicodeDecodeError`` introduced with python-memcached 1.54 - Adds a ``post_invalidation`` signal 1.0.0 ----- Fixes a bug occurring when caching a SQL query using a non-ascii table name. 1.0.0rc ------- Added: - Adds an `invalidate_cachalot` command to invalidate django-cachalot from a script without having to clear the whole cache - Adds the benchmark introduction, conditions & results to the documentation - Adds a short guide on how to configure Redis as a LRU cache Fixed: - Fixes a rare invalidation issue occurring when updating a many-to-many table after executing a queryset generating a ``HAVING`` SQL statement – for example, ``User.objects.first().user_permissions.add(Permission.objects.first())`` was not invalidating ``User.objects.annotate(n=Count('user_permissions')).filter(n__gte=1)`` - Fixes an even rarer invalidation issue occurring when updating a many-to-many table after executing a queryset filtering nested subqueries by another subquery through that many-to-many table – for example:: User.objects.filter( pk__in=User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all()))) - Avoids setting useless cache keys by using table names instead of Django-generated table alias 0.9.0 ----- Added: - Caches all queries implying ``Queryset.extra`` - Invalidates raw queries - Adds a simple API containing: ``invalidate_tables``, ``invalidate_models``, ``invalidate_all`` - Adds file-based cache support for Django 1.7 - Adds a setting to choose if random queries must be cached - Adds 2 settings to customize how cache keys are generated - Adds a django-debug-toolbar panel - Adds a benchmark Fixed: - Rewrites invalidation for a better speed & memory performance - Fixes a stale cache issue occurring when an invalidation is done exactly during a SQL request on the invalidated table(s) - Fixes a stale cache issue occurring after concurrent transactions - Uses an infinite timeout Removed: - Simplifies ``cachalot_settings`` and forbids its use or modification 0.8.1 ----- - Fixes an issue with pip if Django is not yet installed 0.8.0 ----- - Adds multi-database support - Adds invalidation when altering the DB schema using `migrate`, `syncdb`, `flush`, `loaddata` commands (also invalidates South, if you use it) - Small optimizations & simplifications - Adds several tests 0.7.0 ----- - Adds thread-safety - Optimizes the amount of cache queries during transaction 0.6.0 ----- - Adds memcached support 0.5.0 ----- - Adds ``CACHALOT_ENABLED`` & ``CACHALOT_CACHE`` settings - Allows settings to be dynamically overridden using ``cachalot_settings`` - Adds some missing tests 0.4.1 ----- - Fixes ``pip install``. 0.4.0 (**install broken**) -------------------------- - Adds Travis CI and adds compatibility for: - Django 1.6 & 1.7 - Python 2.6, 2.7, 3.2, 3.3, & 3.4 - locmem & Redis - SQLite, PostgreSQL, MySQL 0.3.0 ----- - Handles transactions - Adds lots of tests for complex cases 0.2.0 ----- - Adds a test suite - Fixes invalidation for data creation/deletion - Stops caching on queries defining ``select`` or ``where`` arguments with ``QuerySet.extra`` 0.1.0 ----- Prototype simply caching all SQL queries reading the database and trying to invalidate them when SQL queries modify the database. Has issues invalidating deletions and creations. Also caches ``QuerySet.extra`` queries but can’t reliably invalidate them. No transaction support, no test, no multi-database support, etc. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/LICENSE0000644000175100001710000000272600000000000015112 0ustar00runnerdockerCopyright (c) 2014-2016, Bertrand Bordage All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of django-cachalot nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/MANIFEST.in0000644000175100001710000000014300000000000015632 0ustar00runnerdockerinclude README.rst LICENSE CHANGELOG.rst requirements.txt recursive-include cachalot *.json *.html ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/PKG-INFO0000644000175100001710000001765700000000000015213 0ustar00runnerdockerMetadata-Version: 1.1 Name: django-cachalot Version: 2.4.3 Summary: Caches your Django ORM queries and automatically invalidates them. Home-page: https://github.com/noripyt/django-cachalot Author: Bertrand Bordage, Andrew Chen Wang Author-email: acwangpython@gmail.com License: BSD Description: Django Cachalot =============== Caches your Django ORM queries and automatically invalidates them. Documentation: http://django-cachalot.readthedocs.io ---- .. image:: http://img.shields.io/pypi/v/django-cachalot.svg?style=flat-square&maxAge=3600 :target: https://pypi.python.org/pypi/django-cachalot .. image:: https://img.shields.io/pypi/pyversions/django-cachalot :target: https://django-cachalot.readthedocs.io/en/latest/ .. image:: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml/badge.svg :target: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml .. image:: http://img.shields.io/coveralls/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://coveralls.io/r/noripyt/django-cachalot?branch=master .. image:: http://img.shields.io/scrutinizer/g/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://scrutinizer-ci.com/g/noripyt/django-cachalot/ .. image:: https://img.shields.io/discord/773656139207802881 :target: https://discord.gg/WFGFBk8rSU ---- Table of Contents: - Quickstart - Usage - Hacking - Benchmark - Third-Party Cache Comparison - Discussion Quickstart ---------- Cachalot officially supports Python 3.6-3.9 and Django 2.2 and 3.1-3.2 with the databases PostgreSQL, SQLite, and MySQL. Note: an upper limit on Django version is set for your safety. Please do not ignore it. Usage ----- #. ``pip install django-cachalot`` #. Add ``'cachalot',`` to your ``INSTALLED_APPS`` #. If you use multiple servers with a common cache server, `double check their clock synchronisation `_ #. If you modify data outside Django – typically after restoring a SQL database –, use the `manage.py command `_ #. Be aware of `the few other limits `_ #. If you use `django-debug-toolbar `_, you can add ``'cachalot.panels.CachalotPanel',`` to your ``DEBUG_TOOLBAR_PANELS`` #. Enjoy! Hacking ------- To start developing, install the requirements and run the tests via tox. Make sure you have the following services: * Memcached * Redis * PostgreSQL * MySQL For setup: #. Install: ``pip install -r requirements/hacking.txt`` #. For PostgreSQL: ``CREATE ROLE cachalot LOGIN SUPERUSER;`` #. Run: ``tox --current-env`` to run the test suite on your current Python version. #. You can also run specific databases and Django versions: ``tox -e py38-django3.1-postgresql-redis`` Benchmark --------- Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called "cachalot" on MySQL and PostgreSQL. Additionally, on PostgreSQL, you will need to create a role called "cachalot." You can also run the benchmark, and it'll raise errors with specific instructions for how to fix it. #. Install: ``pip install -r requirements/benchmark.txt`` #. Run: ``python benchmark.py`` The output will be in benchmark/TODAY'S_DATE/ TODO Create Docker-compose file to allow for easier running of data. Third-Party Cache Comparison ---------------------------- There are three main third party caches: cachalot, cache-machine, and cache-ops. Which do you use? We suggest a mix: TL;DR Use cachalot for cold or modified <50 times per minutes (Most people should stick with only cachalot since you most likely won't need to scale to the point of needing cache-machine added to the bowl). If you're an enterprise that already has huge statistics, then mixing cold caches for cachalot and your hot caches with cache-machine is the best mix. However, when performing joins with ``select_related`` and ``prefetch_related``, you can get a nearly 100x speed up for your initial deployment. Recall, cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, then the cachalot constantly invalidates the table and re-caches. Luckily caching is very efficient, it's just the cache invalidation part that kills all our systems. Look at Note 1 below to see how Reddit deals with it. Cachalot is more-or-less intended for cold caches or "just-right" conditions. If you find a partition library for Django (also authored but work-in-progress by `Andrew Chen Wang`_), then the caching will work better since sharding the cold/accessed-the-least records aren't invalidated as much. Cachalot is good when there are <50 modifications per minute on a hot cached table. This is mostly due to cache invalidation. It's the same with any cache, which is why we suggest you use cache-machine for hot caches. Cache-machine caches individual objects, taking up more in the memory store but invalidates those individual objects instead of the entire table like cachalot. Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot suck when stuck with a huge table that's modified rapidly? Since you've mixed your cold (90% of) with your hot (10% of) records, you're caching and invalidating an entire table. It's like trying to boil 1 ton of noodles inside ONE pot instead of 100 pots boiling 1 ton of noodles. Which is more efficient? The splitting up of them. Note 1: My personal experience with caches stems from Reddit's: https://redditblog.com/2017/01/17/caching-at-reddit/ Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools Discussion ---------- Help? Technical chat? `It's here on Discord `_. Legacy chats: - https://gitter.im/django-cachalot/Lobby - https://join.slack.com/t/cachalotdjango/shared_invite/zt-dd0tj27b-cIH6VlaSOjAWnTG~II5~qw .. _Andrew Chen Wang: https://github.com/Andrew-Chen-Wang .. image:: https://raw.github.com/noripyt/django-cachalot/master/django-cachalot.jpg Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Framework :: Django :: 2.2 Classifier: Framework :: Django :: 3.1 Classifier: Framework :: Django :: 3.2 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Topic :: Internet :: WWW/HTTP ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/README.rst0000644000175100001710000001360000000000000015565 0ustar00runnerdockerDjango Cachalot =============== Caches your Django ORM queries and automatically invalidates them. Documentation: http://django-cachalot.readthedocs.io ---- .. image:: http://img.shields.io/pypi/v/django-cachalot.svg?style=flat-square&maxAge=3600 :target: https://pypi.python.org/pypi/django-cachalot .. image:: https://img.shields.io/pypi/pyversions/django-cachalot :target: https://django-cachalot.readthedocs.io/en/latest/ .. image:: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml/badge.svg :target: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml .. image:: http://img.shields.io/coveralls/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://coveralls.io/r/noripyt/django-cachalot?branch=master .. image:: http://img.shields.io/scrutinizer/g/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://scrutinizer-ci.com/g/noripyt/django-cachalot/ .. image:: https://img.shields.io/discord/773656139207802881 :target: https://discord.gg/WFGFBk8rSU ---- Table of Contents: - Quickstart - Usage - Hacking - Benchmark - Third-Party Cache Comparison - Discussion Quickstart ---------- Cachalot officially supports Python 3.6-3.9 and Django 2.2 and 3.1-3.2 with the databases PostgreSQL, SQLite, and MySQL. Note: an upper limit on Django version is set for your safety. Please do not ignore it. Usage ----- #. ``pip install django-cachalot`` #. Add ``'cachalot',`` to your ``INSTALLED_APPS`` #. If you use multiple servers with a common cache server, `double check their clock synchronisation `_ #. If you modify data outside Django – typically after restoring a SQL database –, use the `manage.py command `_ #. Be aware of `the few other limits `_ #. If you use `django-debug-toolbar `_, you can add ``'cachalot.panels.CachalotPanel',`` to your ``DEBUG_TOOLBAR_PANELS`` #. Enjoy! Hacking ------- To start developing, install the requirements and run the tests via tox. Make sure you have the following services: * Memcached * Redis * PostgreSQL * MySQL For setup: #. Install: ``pip install -r requirements/hacking.txt`` #. For PostgreSQL: ``CREATE ROLE cachalot LOGIN SUPERUSER;`` #. Run: ``tox --current-env`` to run the test suite on your current Python version. #. You can also run specific databases and Django versions: ``tox -e py38-django3.1-postgresql-redis`` Benchmark --------- Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called "cachalot" on MySQL and PostgreSQL. Additionally, on PostgreSQL, you will need to create a role called "cachalot." You can also run the benchmark, and it'll raise errors with specific instructions for how to fix it. #. Install: ``pip install -r requirements/benchmark.txt`` #. Run: ``python benchmark.py`` The output will be in benchmark/TODAY'S_DATE/ TODO Create Docker-compose file to allow for easier running of data. Third-Party Cache Comparison ---------------------------- There are three main third party caches: cachalot, cache-machine, and cache-ops. Which do you use? We suggest a mix: TL;DR Use cachalot for cold or modified <50 times per minutes (Most people should stick with only cachalot since you most likely won't need to scale to the point of needing cache-machine added to the bowl). If you're an enterprise that already has huge statistics, then mixing cold caches for cachalot and your hot caches with cache-machine is the best mix. However, when performing joins with ``select_related`` and ``prefetch_related``, you can get a nearly 100x speed up for your initial deployment. Recall, cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, then the cachalot constantly invalidates the table and re-caches. Luckily caching is very efficient, it's just the cache invalidation part that kills all our systems. Look at Note 1 below to see how Reddit deals with it. Cachalot is more-or-less intended for cold caches or "just-right" conditions. If you find a partition library for Django (also authored but work-in-progress by `Andrew Chen Wang`_), then the caching will work better since sharding the cold/accessed-the-least records aren't invalidated as much. Cachalot is good when there are <50 modifications per minute on a hot cached table. This is mostly due to cache invalidation. It's the same with any cache, which is why we suggest you use cache-machine for hot caches. Cache-machine caches individual objects, taking up more in the memory store but invalidates those individual objects instead of the entire table like cachalot. Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot suck when stuck with a huge table that's modified rapidly? Since you've mixed your cold (90% of) with your hot (10% of) records, you're caching and invalidating an entire table. It's like trying to boil 1 ton of noodles inside ONE pot instead of 100 pots boiling 1 ton of noodles. Which is more efficient? The splitting up of them. Note 1: My personal experience with caches stems from Reddit's: https://redditblog.com/2017/01/17/caching-at-reddit/ Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools Discussion ---------- Help? Technical chat? `It's here on Discord `_. Legacy chats: - https://gitter.im/django-cachalot/Lobby - https://join.slack.com/t/cachalotdjango/shared_invite/zt-dd0tj27b-cIH6VlaSOjAWnTG~II5~qw .. _Andrew Chen Wang: https://github.com/Andrew-Chen-Wang .. image:: https://raw.github.com/noripyt/django-cachalot/master/django-cachalot.jpg ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/0000755000175100001710000000000000000000000015654 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/__init__.py0000644000175100001710000000046200000000000017767 0ustar00runnerdockerVERSION = (2, 4, 3) __version__ = ".".join(map(str, VERSION)) try: from django import VERSION as DJANGO_VERSION if DJANGO_VERSION < (3, 2): default_app_config = "cachalot.apps.CachalotConfig" except ImportError: # pragma: no cover default_app_config = "cachalot.apps.CachalotConfig" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/api.py0000644000175100001710000001466400000000000017012 0ustar00runnerdockerfrom contextlib import contextmanager from django.apps import apps from django.conf import settings from django.db import connections from .cache import cachalot_caches from .settings import cachalot_settings from .signals import post_invalidation from .transaction import AtomicCache from .utils import _invalidate_tables try: from asgiref.local import Local LOCAL_STORAGE = Local() except ImportError: import threading LOCAL_STORAGE = threading.local() __all__ = ('invalidate', 'get_last_invalidation', 'cachalot_disabled') def _cache_db_tables_iterator(tables, cache_alias, db_alias): no_tables = not tables cache_aliases = settings.CACHES if cache_alias is None else (cache_alias,) db_aliases = settings.DATABASES if db_alias is None else (db_alias,) for db_alias in db_aliases: if no_tables: tables = connections[db_alias].introspection.table_names() if tables: for cache_alias in cache_aliases: yield cache_alias, db_alias, tables def _get_tables(tables_or_models): for table_or_model in tables_or_models: if isinstance(table_or_model, str) and '.' in table_or_model: try: table_or_model = apps.get_model(table_or_model) except LookupError: pass yield (table_or_model if isinstance(table_or_model, str) else table_or_model._meta.db_table) def invalidate(*tables_or_models, **kwargs): """ Clears what was cached by django-cachalot implying one or more SQL tables or models from ``tables_or_models``. If ``tables_or_models`` is not specified, all tables found in the database (including those outside Django) are invalidated. If ``cache_alias`` is specified, it only clears the SQL queries stored on this cache, otherwise queries from all caches are cleared. If ``db_alias`` is specified, it only clears the SQL queries executed on this database, otherwise queries from all databases are cleared. :arg tables_or_models: SQL tables names, models or models lookups (or a combination) :type tables_or_models: tuple of strings or models :arg cache_alias: Alias from the Django ``CACHES`` setting :type cache_alias: string or NoneType :arg db_alias: Alias from the Django ``DATABASES`` setting :type db_alias: string or NoneType :returns: Nothing :rtype: NoneType """ # TODO: Replace with positional arguments when we drop Python 2 support. cache_alias = kwargs.pop('cache_alias', None) db_alias = kwargs.pop('db_alias', None) for k in kwargs: raise TypeError( "invalidate() got an unexpected keyword argument '%s'" % k) send_signal = False invalidated = set() for cache_alias, db_alias, tables in _cache_db_tables_iterator( list(_get_tables(tables_or_models)), cache_alias, db_alias): cache = cachalot_caches.get_cache(cache_alias, db_alias) if not isinstance(cache, AtomicCache): send_signal = True _invalidate_tables(cache, db_alias, tables) invalidated.update(tables) if send_signal: for table in invalidated: post_invalidation.send(table, db_alias=db_alias) def get_last_invalidation(*tables_or_models, **kwargs): """ Returns the timestamp of the most recent invalidation of the given ``tables_or_models``. If ``tables_or_models`` is not specified, all tables found in the database (including those outside Django) are used. If ``cache_alias`` is specified, it only fetches invalidations in this cache, otherwise invalidations in all caches are fetched. If ``db_alias`` is specified, it only fetches invalidations for this database, otherwise invalidations for all databases are fetched. :arg tables_or_models: SQL tables names, models or models lookups (or a combination) :type tables_or_models: tuple of strings or models :arg cache_alias: Alias from the Django ``CACHES`` setting :type cache_alias: string or NoneType :arg db_alias: Alias from the Django ``DATABASES`` setting :type db_alias: string or NoneType :returns: The timestamp of the most recent invalidation :rtype: float """ # TODO: Replace with positional arguments when we drop Python 2 support. cache_alias = kwargs.pop('cache_alias', None) db_alias = kwargs.pop('db_alias', None) for k in kwargs: raise TypeError("get_last_invalidation() got an unexpected " "keyword argument '%s'" % k) last_invalidation = 0.0 for cache_alias, db_alias, tables in _cache_db_tables_iterator( list(_get_tables(tables_or_models)), cache_alias, db_alias): get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN table_cache_keys = [get_table_cache_key(db_alias, t) for t in tables] invalidations = cachalot_caches.get_cache( cache_alias, db_alias).get_many(table_cache_keys).values() if invalidations: current_last_invalidation = max(invalidations) if current_last_invalidation > last_invalidation: last_invalidation = current_last_invalidation return last_invalidation @contextmanager def cachalot_disabled(all_queries=False): """ Context manager for temporarily disabling cachalot. If you evaluate the same queryset a second time, like normally for Django querysets, this will access the variable that saved it in-memory. For example: .. code-block:: python with cachalot_disabled(): qs = Test.objects.filter(blah=blah) # Does a single query to the db list(qs) # Evaluates queryset # Because the qs was evaluated, it's # saved in memory: list(qs) # this does 0 queries. # This does 1 query to the db list(Test.objects.filter(blah=blah)) If you evaluate the queryset outside the context manager, any duplicate query will use the cached result unless an object creation happens in between the original and duplicate query. :arg all_queries: Any query, including already evaluated queries, are re-evaluated. :type all_queries: bool """ was_enabled = getattr(LOCAL_STORAGE, "cachalot_enabled", cachalot_settings.CACHALOT_ENABLED) LOCAL_STORAGE.cachalot_enabled = False LOCAL_STORAGE.disable_on_all = all_queries yield LOCAL_STORAGE.cachalot_enabled = was_enabled ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/apps.py0000644000175100001710000000673600000000000017205 0ustar00runnerdockerimport copyreg from django.apps import AppConfig from django.conf import settings from django.core.checks import register, Tags, Warning, Error from cachalot.utils import ITERABLES from .settings import ( cachalot_settings, SUPPORTED_CACHE_BACKENDS, SUPPORTED_DATABASE_ENGINES, SUPPORTED_ONLY) @register(Tags.caches, Tags.compatibility) def check_cache_compatibility(app_configs, **kwargs): cache = settings.CACHES[cachalot_settings.CACHALOT_CACHE] cache_backend = cache['BACKEND'] if cache_backend not in SUPPORTED_CACHE_BACKENDS: return [Warning( 'Cache backend %r is not supported by django-cachalot.' % cache_backend, hint='Switch to a supported cache backend ' 'like Redis or Memcached.', id='cachalot.W001')] return [] @register(Tags.database, Tags.compatibility) def check_databases_compatibility(app_configs, **kwargs): errors = [] databases = settings.DATABASES original_enabled_databases = getattr(settings, 'CACHALOT_DATABASES', SUPPORTED_ONLY) enabled_databases = cachalot_settings.CACHALOT_DATABASES if original_enabled_databases == SUPPORTED_ONLY: if not cachalot_settings.CACHALOT_DATABASES: errors.append(Warning( 'None of the configured databases are supported ' 'by django-cachalot.', hint='Use a supported database, or remove django-cachalot, or ' 'put at least one database alias in `CACHALOT_DATABASES` ' 'to force django-cachalot to use it.', id='cachalot.W002' )) elif enabled_databases.__class__ in ITERABLES: for db_alias in enabled_databases: if db_alias in databases: engine = databases[db_alias]['ENGINE'] if engine not in SUPPORTED_DATABASE_ENGINES: errors.append(Warning( 'Database engine %r is not supported ' 'by django-cachalot.' % engine, hint='Switch to a supported database engine.', id='cachalot.W003' )) else: errors.append(Error( 'Database alias %r from `CACHALOT_DATABASES` ' 'is not defined in `DATABASES`.' % db_alias, hint='Change `CACHALOT_DATABASES` to be compliant with' '`CACHALOT_DATABASES`', id='cachalot.E001', )) if not enabled_databases: errors.append(Warning( 'Django-cachalot is useless because no database ' 'is configured in `CACHALOT_DATABASES`.', hint='Reconfigure django-cachalot or remove it.', id='cachalot.W004' )) else: errors.append(Error( "`CACHALOT_DATABASES` must be either %r or a list, tuple, " "frozenset or set of database aliases." % SUPPORTED_ONLY, hint='Remove `CACHALOT_DATABASES` or change it.', id='cachalot.E002', )) return errors class CachalotConfig(AppConfig): name = 'cachalot' def ready(self): # Cast memoryview objects to bytes to be able to pickle them. # https://docs.python.org/3/library/copyreg.html#copyreg.pickle copyreg.pickle(memoryview, lambda val: (memoryview, (bytes(val),))) cachalot_settings.load() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/cache.py0000644000175100001710000000403500000000000017273 0ustar00runnerdockerfrom collections import defaultdict from threading import local from django.core.cache import caches from django.db import DEFAULT_DB_ALIAS from .settings import cachalot_settings from .signals import post_invalidation from .transaction import AtomicCache class CacheHandler(local): @property def atomic_caches(self): if not hasattr(self, '_atomic_caches'): self._atomic_caches = defaultdict(list) return self._atomic_caches def get_atomic_cache(self, cache_alias, db_alias, level): if cache_alias not in self.atomic_caches[db_alias][level]: self.atomic_caches[db_alias][level][cache_alias] = AtomicCache( self.get_cache(cache_alias, db_alias, level-1), db_alias) return self.atomic_caches[db_alias][level][cache_alias] def get_cache(self, cache_alias=None, db_alias=None, atomic_level=-1): if db_alias is None: db_alias = DEFAULT_DB_ALIAS if cache_alias is None: cache_alias = cachalot_settings.CACHALOT_CACHE min_level = -len(self.atomic_caches[db_alias]) if atomic_level < min_level: return caches[cache_alias] return self.get_atomic_cache(cache_alias, db_alias, atomic_level) def enter_atomic(self, db_alias): if db_alias is None: db_alias = DEFAULT_DB_ALIAS self.atomic_caches[db_alias].append({}) def exit_atomic(self, db_alias, commit): if db_alias is None: db_alias = DEFAULT_DB_ALIAS atomic_caches = self.atomic_caches[db_alias].pop().values() if commit: to_be_invalidated = set() for atomic_cache in atomic_caches: atomic_cache.commit() to_be_invalidated.update(atomic_cache.to_be_invalidated) # This happens when committing the outermost atomic block. if not self.atomic_caches[db_alias]: for table in to_be_invalidated: post_invalidation.send(table, db_alias=db_alias) cachalot_caches = CacheHandler() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/jinja2ext.py0000644000175100001710000000524100000000000020126 0ustar00runnerdockerfrom django.core.cache import caches, DEFAULT_CACHE_ALIAS from django.core.cache.utils import make_template_fragment_key from jinja2.nodes import Keyword, Const, CallBlock from jinja2.ext import Extension from .api import get_last_invalidation class CachalotExtension(Extension): tags = {'cache'} allowed_kwargs = ('cache_key', 'timeout', 'cache_alias') def __init__(self, environment): super(CachalotExtension, self).__init__(environment) self.environment.globals.update( get_last_invalidation=get_last_invalidation) def parse_args(self, parser): args = [] kwargs = [] stream = parser.stream while stream.current.type != 'block_end': if stream.current.type == 'name' \ and stream.look().type == 'assign': key = stream.current.value if key not in self.allowed_kwargs: parser.fail( "'%s' is not a valid keyword argument " "for {%% cache %%}" % key, stream.current.lineno) stream.skip(2) value = parser.parse_expression() kwargs.append(Keyword(key, value, lineno=value.lineno)) else: args.append(parser.parse_expression()) if stream.current.type == 'block_end': break parser.stream.expect('comma') return args, kwargs def parse(self, parser): tag = parser.stream.current.value lineno = next(parser.stream).lineno args, kwargs = self.parse_args(parser) default_cache_key = (None if parser.filename is None else '%s:%d' % (parser.filename, lineno)) kwargs.append(Keyword('default_cache_key', Const(default_cache_key), lineno=lineno)) body = parser.parse_statements(['name:end' + tag], drop_needle=True) return CallBlock(self.call_method('cache', args, kwargs), [], [], body).set_lineno(lineno) def cache(self, *args, **kwargs): cache_alias = kwargs.get('cache_alias', DEFAULT_CACHE_ALIAS) cache_key = kwargs.get('cache_key', kwargs['default_cache_key']) if cache_key is None: raise ValueError( 'You must set `cache_key` when the template is not a file.') cache_key = make_template_fragment_key(cache_key, args) out = caches[cache_alias].get(cache_key) if out is None: out = kwargs['caller']() caches[cache_alias].set(cache_key, out, kwargs.get('timeout')) return out cachalot = CachalotExtension ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/management/0000755000175100001710000000000000000000000017770 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/management/__init__.py0000644000175100001710000000000000000000000022067 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/management/commands/0000755000175100001710000000000000000000000021571 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/management/commands/__init__.py0000644000175100001710000000000000000000000023670 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/management/commands/invalidate_cachalot.py0000644000175100001710000000363000000000000026123 0ustar00runnerdockerfrom django.conf import settings from django.core.management.base import BaseCommand from django.apps import apps from ...api import invalidate class Command(BaseCommand): help = 'Invalidates the cache keys set by django-cachalot.' def add_arguments(self, parser): parser.add_argument('app_label[.model_name]', nargs='*') parser.add_argument( '-c', '--cache', action='store', dest='cache_alias', choices=list(settings.CACHES.keys()), help='Cache alias from the CACHES setting.') parser.add_argument( '-d', '--db', action='store', dest='db_alias', choices=list(settings.DATABASES.keys()), help='Database alias from the DATABASES setting.') def handle(self, *args, **options): cache_alias = options['cache_alias'] db_alias = options['db_alias'] verbosity = int(options['verbosity']) labels = options['app_label[.model_name]'] models = [] for label in labels: try: models.extend(apps.get_app_config(label).get_models()) except LookupError: app_label = '.'.join(label.split('.')[:-1]) model_name = label.split('.')[-1] models.append(apps.get_model(app_label, model_name)) cache_str = '' if cache_alias is None else "on cache '%s'" % cache_alias db_str = '' if db_alias is None else "for database '%s'" % db_alias keys_str = 'keys for %s models' % len(models) if labels else 'all keys' if verbosity > 0: self.stdout.write(' '.join(filter(bool, ['Invalidating', keys_str, cache_str, db_str])) + '...') invalidate(*models, cache_alias=cache_alias, db_alias=db_alias) if verbosity > 0: self.stdout.write('Cache keys successfully invalidated.') ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/models.py0000644000175100001710000000000000000000000017477 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/monkey_patch.py0000644000175100001710000001522400000000000020713 0ustar00runnerdockerfrom collections.abc import Iterable from functools import wraps from time import time from django.core.exceptions import EmptyResultSet from django.db.backends.utils import CursorWrapper from django.db.models.signals import post_migrate from django.db.models.sql.compiler import ( SQLCompiler, SQLInsertCompiler, SQLUpdateCompiler, SQLDeleteCompiler, ) from django.db.transaction import Atomic, get_connection from .api import invalidate, LOCAL_STORAGE from .cache import cachalot_caches from .settings import cachalot_settings, ITERABLES from .utils import ( _get_table_cache_keys, _get_tables_from_sql, UncachableQuery, is_cachable, filter_cachable, ) WRITE_COMPILERS = (SQLInsertCompiler, SQLUpdateCompiler, SQLDeleteCompiler) def _unset_raw_connection(original): def inner(compiler, *args, **kwargs): compiler.connection.raw = False try: return original(compiler, *args, **kwargs) finally: compiler.connection.raw = True return inner def _get_result_or_execute_query(execute_query_func, cache, cache_key, table_cache_keys): try: data = cache.get_many(table_cache_keys + [cache_key]) except KeyError: data = None new_table_cache_keys = set(table_cache_keys) if data: new_table_cache_keys.difference_update(data) if not new_table_cache_keys: try: timestamp, result = data.pop(cache_key) if timestamp >= max(data.values()): return result except (KeyError, TypeError, ValueError): # In case `cache_key` is not in `data` or contains bad data, # we simply run the query and cache again the results. pass result = execute_query_func() if result.__class__ not in ITERABLES and isinstance(result, Iterable): result = list(result) now = time() to_be_set = {k: now for k in new_table_cache_keys} to_be_set[cache_key] = (now, result) cache.set_many(to_be_set, cachalot_settings.CACHALOT_TIMEOUT) return result def _patch_compiler(original): @wraps(original) @_unset_raw_connection def inner(compiler, *args, **kwargs): execute_query_func = lambda: original(compiler, *args, **kwargs) # Checks if utils/cachalot_disabled if not getattr(LOCAL_STORAGE, "cachalot_enabled", True): return execute_query_func() db_alias = compiler.using if db_alias not in cachalot_settings.CACHALOT_DATABASES \ or isinstance(compiler, WRITE_COMPILERS): return execute_query_func() try: cache_key = cachalot_settings.CACHALOT_QUERY_KEYGEN(compiler) table_cache_keys = _get_table_cache_keys(compiler) except (EmptyResultSet, UncachableQuery): return execute_query_func() return _get_result_or_execute_query( execute_query_func, cachalot_caches.get_cache(db_alias=db_alias), cache_key, table_cache_keys) return inner def _patch_write_compiler(original): @wraps(original) @_unset_raw_connection def inner(write_compiler, *args, **kwargs): db_alias = write_compiler.using table = write_compiler.query.get_meta().db_table if is_cachable(table): invalidate(table, db_alias=db_alias, cache_alias=cachalot_settings.CACHALOT_CACHE) return original(write_compiler, *args, **kwargs) return inner def _patch_orm(): if cachalot_settings.CACHALOT_ENABLED: SQLCompiler.execute_sql = _patch_compiler(SQLCompiler.execute_sql) for compiler in WRITE_COMPILERS: compiler.execute_sql = _patch_write_compiler(compiler.execute_sql) def _unpatch_orm(): if hasattr(SQLCompiler.execute_sql, '__wrapped__'): SQLCompiler.execute_sql = SQLCompiler.execute_sql.__wrapped__ for compiler in WRITE_COMPILERS: compiler.execute_sql = compiler.execute_sql.__wrapped__ def _patch_cursor(): def _patch_cursor_execute(original): @wraps(original) def inner(cursor, sql, *args, **kwargs): try: return original(cursor, sql, *args, **kwargs) finally: connection = cursor.db if getattr(connection, 'raw', True): if isinstance(sql, bytes): sql = sql.decode('utf-8') sql = sql.lower() if 'update' in sql or 'insert' in sql or 'delete' in sql \ or 'alter' in sql or 'create' in sql \ or 'drop' in sql: tables = filter_cachable( _get_tables_from_sql(connection, sql)) if tables: invalidate( *tables, db_alias=connection.alias, cache_alias=cachalot_settings.CACHALOT_CACHE) return inner if cachalot_settings.CACHALOT_INVALIDATE_RAW: CursorWrapper.execute = _patch_cursor_execute(CursorWrapper.execute) CursorWrapper.executemany = _patch_cursor_execute(CursorWrapper.executemany) def _unpatch_cursor(): if hasattr(CursorWrapper.execute, '__wrapped__'): CursorWrapper.execute = CursorWrapper.execute.__wrapped__ CursorWrapper.executemany = CursorWrapper.executemany.__wrapped__ def _patch_atomic(): def patch_enter(original): @wraps(original) def inner(self): cachalot_caches.enter_atomic(self.using) original(self) return inner def patch_exit(original): @wraps(original) def inner(self, exc_type, exc_value, traceback): needs_rollback = get_connection(self.using).needs_rollback try: original(self, exc_type, exc_value, traceback) finally: cachalot_caches.exit_atomic( self.using, exc_type is None and not needs_rollback) return inner Atomic.__enter__ = patch_enter(Atomic.__enter__) Atomic.__exit__ = patch_exit(Atomic.__exit__) def _unpatch_atomic(): Atomic.__enter__ = Atomic.__enter__.__wrapped__ Atomic.__exit__ = Atomic.__exit__.__wrapped__ def _invalidate_on_migration(sender, **kwargs): invalidate(*sender.get_models(), db_alias=kwargs['using'], cache_alias=cachalot_settings.CACHALOT_CACHE) def patch(): post_migrate.connect(_invalidate_on_migration) _patch_cursor() _patch_atomic() _patch_orm() def unpatch(): post_migrate.disconnect(_invalidate_on_migration) _unpatch_cursor() _unpatch_atomic() _unpatch_orm() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/panels.py0000644000175100001710000000466000000000000017516 0ustar00runnerdockerfrom collections import defaultdict from datetime import datetime from debug_toolbar.panels import Panel from django.apps import apps from django.conf import settings from django.utils.translation import gettext_lazy as _ from django.utils.timesince import timesince from .cache import cachalot_caches from .settings import cachalot_settings class CachalotPanel(Panel): title = 'Cachalot' template = 'cachalot/panel.html' def __init__(self, *args, **kwargs): self.last_invalidation = None super(CachalotPanel, self).__init__(*args, **kwargs) @property def enabled(self): enabled = super(CachalotPanel, self).enabled if enabled: self.enable_instrumentation() else: self.disable_instrumentation() return enabled def enable_instrumentation(self): settings.CACHALOT_ENABLED = True cachalot_settings.reload() def disable_instrumentation(self): settings.CACHALOT_ENABLED = False cachalot_settings.reload() def process_request(self, request): self.collect_invalidations() return super(CachalotPanel, self).process_request(request) def collect_invalidations(self): models = apps.get_models() data = defaultdict(list) cache = cachalot_caches.get_cache() for db_alias in settings.DATABASES: get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN model_cache_keys = { get_table_cache_key(db_alias, model._meta.db_table): model for model in models} for cache_key, timestamp in cache.get_many( model_cache_keys.keys()).items(): invalidation = datetime.fromtimestamp(timestamp) model = model_cache_keys[cache_key] data[db_alias].append( (model._meta.app_label, model.__name__, invalidation)) if self.last_invalidation is None \ or invalidation > self.last_invalidation: self.last_invalidation = invalidation data[db_alias].sort(key=lambda row: row[2], reverse=True) self.record_stats({'invalidations_per_db': data.items()}) @property def nav_subtitle(self): if self.enabled and self.last_invalidation is not None: return (_('Last invalidation: %s') % timesince(self.last_invalidation)) return '' ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/settings.py0000644000175100001710000001040100000000000020062 0ustar00runnerdockerfrom itertools import chain from django.apps import apps from django.conf import settings from django.utils.module_loading import import_string SUPPORTED_DATABASE_ENGINES = { 'django.db.backends.sqlite3', 'django.db.backends.postgresql', 'django.db.backends.mysql', # TODO: Remove when we drop Django 2.x support. 'django.db.backends.postgresql_psycopg2', # GeoDjango 'django.contrib.gis.db.backends.spatialite', 'django.contrib.gis.db.backends.postgis', 'django.contrib.gis.db.backends.mysql', # django-transaction-hooks 'transaction_hooks.backends.sqlite3', 'transaction_hooks.backends.postgis', # TODO: Remove when we drop Django 2.x support. 'transaction_hooks.backends.postgresql_psycopg2', 'transaction_hooks.backends.mysql', # django-prometheus wrapped engines 'django_prometheus.db.backends.sqlite3', 'django_prometheus.db.backends.postgresql', 'django_prometheus.db.backends.mysql', } SUPPORTED_CACHE_BACKENDS = { 'django.core.cache.backends.dummy.DummyCache', 'django.core.cache.backends.locmem.LocMemCache', 'django.core.cache.backends.filebased.FileBasedCache', 'django_redis.cache.RedisCache', 'django.core.cache.backends.memcached.MemcachedCache', 'django.core.cache.backends.memcached.PyLibMCCache', 'django.core.cache.backends.memcached.PyMemcacheCache', } SUPPORTED_ONLY = 'supported_only' ITERABLES = {tuple, list, frozenset, set} class Settings(object): patched = False converters = {} CACHALOT_ENABLED = True CACHALOT_CACHE = 'default' CACHALOT_DATABASES = 'supported_only' CACHALOT_TIMEOUT = None CACHALOT_CACHE_RANDOM = False CACHALOT_INVALIDATE_RAW = True CACHALOT_ONLY_CACHABLE_TABLES = () CACHALOT_ONLY_CACHABLE_APPS = () CACHALOT_UNCACHABLE_TABLES = ('django_migrations',) CACHALOT_UNCACHABLE_APPS = () CACHALOT_ADDITIONAL_TABLES = () CACHALOT_QUERY_KEYGEN = 'cachalot.utils.get_query_cache_key' CACHALOT_TABLE_KEYGEN = 'cachalot.utils.get_table_cache_key' @classmethod def add_converter(cls, setting): def inner(func): cls.converters[setting] = func return inner @classmethod def get_names(cls): return {name for name in cls.__dict__ if name[:2] != '__' and name.isupper()} def load(self): for name in self.get_names(): value = getattr(settings, name, getattr(self.__class__, name)) converter = self.converters.get(name) if converter is not None: value = converter(value) setattr(self, name, value) if not self.patched: from .monkey_patch import patch patch() self.patched = True def unload(self): if self.patched: from .monkey_patch import unpatch unpatch() self.patched = False def reload(self): self.unload() self.load() @Settings.add_converter('CACHALOT_DATABASES') def convert(value): if value == SUPPORTED_ONLY: value = {alias for alias, setting in settings.DATABASES.items() if setting['ENGINE'] in SUPPORTED_DATABASE_ENGINES} if value.__class__ in ITERABLES: return frozenset(value) return value def convert_tables(value, setting_app_name): dj_apps = getattr(settings, setting_app_name, ()) if dj_apps: dj_apps = tuple(model._meta.db_table for model in chain.from_iterable( apps.all_models[_app].values() for _app in dj_apps )) # Use [] lookup to make sure app is loaded (via INSTALLED_APP's order) return frozenset(tuple(value) + dj_apps) return frozenset(value) @Settings.add_converter('CACHALOT_ONLY_CACHABLE_TABLES') def convert(value): return convert_tables(value, 'CACHALOT_ONLY_CACHABLE_APPS') @Settings.add_converter('CACHALOT_UNCACHABLE_TABLES') def convert(value): return convert_tables(value, 'CACHALOT_UNCACHABLE_APPS') @Settings.add_converter('CACHALOT_ADDITIONAL_TABLES') def convert(value): return list(value) @Settings.add_converter('CACHALOT_QUERY_KEYGEN') def convert(value): return import_string(value) @Settings.add_converter('CACHALOT_TABLE_KEYGEN') def convert(value): return import_string(value) cachalot_settings = Settings() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/signals.py0000644000175100001710000000022400000000000017664 0ustar00runnerdockerfrom django.dispatch import Signal # sender: name of table invalidated # db_alias: name of database that was effected post_invalidation = Signal() ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1629730621.8326948 django-cachalot-2.4.3/cachalot/templates/0000755000175100001710000000000000000000000017652 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/templates/cachalot/0000755000175100001710000000000000000000000021430 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/templates/cachalot/panel.html0000644000175100001710000000120000000000000023406 0ustar00runnerdocker{% load i18n %} {% for db_alias, invalidations in invalidations_per_db %}

{% blocktrans %}Database '{{ db_alias }}'{% endblocktrans %}

{% for app_label, model, datetime in invalidations %} {% endfor %}
{% trans 'Application' %} {% trans 'Model' %} {% trans 'Last invalidation' %}
{{ app_label }} {{ model }} {{ datetime|timesince }}
{% endfor %} ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/templatetags/0000755000175100001710000000000000000000000020346 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/templatetags/__init__.py0000644000175100001710000000000000000000000022445 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/templatetags/cachalot.py0000644000175100001710000000022100000000000022471 0ustar00runnerdockerfrom django.template import Library from ..api import get_last_invalidation register = Library() register.simple_tag(get_last_invalidation) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/tests/0000755000175100001710000000000000000000000017016 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/__init__.py0000644000175100001710000000125300000000000021130 0ustar00runnerdockerfrom django.core.signals import setting_changed from django.dispatch import receiver from ..settings import cachalot_settings from .read import ReadTestCase, ParameterTypeTestCase from .write import WriteTestCase, DatabaseCommandTestCase from .transaction import AtomicTestCase from .thread_safety import ThreadSafetyTestCase from .multi_db import MultiDatabaseTestCase from .settings import SettingsTestCase from .api import APITestCase, CommandTestCase from .signals import SignalsTestCase from .postgres import PostgresReadTestCase from .debug_toolbar import DebugToolbarTestCase @receiver(setting_changed) def reload_settings(sender, **kwargs): cachalot_settings.reload() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/api.py0000644000175100001710000003752500000000000020155 0ustar00runnerdockerimport os from time import time, sleep from unittest import skipIf from django.conf import settings from django.contrib.auth.models import Permission, User from django.core.cache import DEFAULT_CACHE_ALIAS, caches from django.core.management import call_command from django.db import connection, transaction, DEFAULT_DB_ALIAS from django.template import engines from django.test import TransactionTestCase from jinja2.exceptions import TemplateSyntaxError from ..api import * from .models import Test from .test_utils import TestUtilsMixin class APITestCase(TestUtilsMixin, TransactionTestCase): databases = set(settings.DATABASES.keys()) def setUp(self): super(APITestCase, self).setUp() self.t1 = Test.objects.create(name='test1') self.cache_alias2 = next(alias for alias in settings.CACHES if alias != DEFAULT_CACHE_ALIAS) # For cachalot_disabled test self.user = User.objects.create_user('user') self.t1__permission = (Permission.objects.order_by('?') .select_related('content_type')[0]) def test_invalidate_tables(self): with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, ['test1']) with self.settings(CACHALOT_INVALIDATE_RAW=False): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s);", [1 if self.is_sqlite else True]) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data2, ['test1']) invalidate('cachalot_test') with self.assertNumQueries(1): data3 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data3, ['test1', 'test2']) def test_invalidate_models_lookups(self): with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, ['test1']) with self.settings(CACHALOT_INVALIDATE_RAW=False): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s);", [1 if self.is_sqlite else True]) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data2, ['test1']) invalidate('cachalot.Test') with self.assertNumQueries(1): data3 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data3, ['test1', 'test2']) def test_invalidate_models(self): with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, ['test1']) with self.settings(CACHALOT_INVALIDATE_RAW=False): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s);", [1 if self.is_sqlite else True]) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data2, ['test1']) invalidate(Test) with self.assertNumQueries(1): data3 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data3, ['test1', 'test2']) def test_invalidate_all(self): with self.assertNumQueries(1): Test.objects.get() with self.assertNumQueries(0): Test.objects.get() invalidate() with self.assertNumQueries(1): Test.objects.get() def test_invalidate_all_in_atomic(self): with transaction.atomic(): with self.assertNumQueries(1): Test.objects.get() with self.assertNumQueries(0): Test.objects.get() invalidate() with self.assertNumQueries(1): Test.objects.get() with self.assertNumQueries(1): Test.objects.get() def test_get_last_invalidation(self): invalidate() timestamp = get_last_invalidation() delta = 0.15 if os.environ.get("CACHE_BACKEND") == "filebased" else 0.1 self.assertAlmostEqual(timestamp, time(), delta=delta) sleep(0.1) invalidate('cachalot_test') timestamp = get_last_invalidation('cachalot_test') self.assertAlmostEqual(timestamp, time(), delta=delta) same_timestamp = get_last_invalidation('cachalot.Test') self.assertEqual(same_timestamp, timestamp) same_timestamp = get_last_invalidation(Test) self.assertEqual(same_timestamp, timestamp) timestamp = get_last_invalidation('cachalot_testparent') self.assertNotAlmostEqual(timestamp, time(), delta=0.1) timestamp = get_last_invalidation('cachalot_testparent', 'cachalot_test') self.assertAlmostEqual(timestamp, time(), delta=delta) def test_get_last_invalidation_template_tag(self): # Without arguments original_timestamp = engines['django'].from_string( "{{ timestamp }}" ).render({ 'timestamp': get_last_invalidation(), }) template = engines['django'].from_string(""" {% load cachalot %} {% get_last_invalidation as timestamp %} {{ timestamp }} """) timestamp = template.render().strip() self.assertNotEqual(timestamp, '') self.assertNotEqual(timestamp, '0.0') self.assertAlmostEqual(float(timestamp), float(original_timestamp), delta=0.1) # With arguments original_timestamp = engines['django'].from_string( "{{ timestamp }}" ).render({ 'timestamp': get_last_invalidation('auth.Group', 'cachalot_test'), }) template = engines['django'].from_string(""" {% load cachalot %} {% get_last_invalidation 'auth.Group' 'cachalot_test' as timestamp %} {{ timestamp }} """) timestamp = template.render().strip() self.assertNotEqual(timestamp, '') self.assertNotEqual(timestamp, '0.0') self.assertAlmostEqual(float(timestamp), float(original_timestamp), delta=0.1) # While using the `cache` template tag, with invalidation template = engines['django'].from_string(""" {% load cachalot cache %} {% get_last_invalidation 'auth.Group' 'cachalot_test' as timestamp %} {% cache 10 cache_key_name timestamp %} {{ content }} {% endcache %} """) content = template.render({'content': 'something'}).strip() self.assertEqual(content, 'something') content = template.render({'content': 'anything'}).strip() self.assertEqual(content, 'something') invalidate('cachalot_test') content = template.render({'content': 'yet another'}).strip() self.assertEqual(content, 'yet another') def test_get_last_invalidation_jinja2(self): original_timestamp = engines['jinja2'].from_string( "{{ timestamp }}" ).render({ 'timestamp': get_last_invalidation('auth.Group', 'cachalot_test'), }) template = engines['jinja2'].from_string( "{{ get_last_invalidation('auth.Group', 'cachalot_test') }}") timestamp = template.render({}) self.assertNotEqual(timestamp, '') self.assertNotEqual(timestamp, '0.0') self.assertAlmostEqual(float(timestamp), float(original_timestamp), delta=0.1) def test_cache_jinja2(self): # Invalid arguments with self.assertRaises(TemplateSyntaxError, msg="'invalid' is not a valid keyword argument " "for {% cache %}"): engines['jinja2'].from_string(""" {% cache cache_key='anything', invalid='what?' %}{% endcache %} """) with self.assertRaises(ValueError, msg='You must set `cache_key` when ' 'the template is not a file.'): engines['jinja2'].from_string( '{% cache %} broken {% endcache %}').render() # With the minimum number of arguments template = engines['jinja2'].from_string(""" {%- cache cache_key='first' -%} {{ content1 }} {%- endcache -%} {%- cache cache_key='second' -%} {{ content2 }} {%- endcache -%} """) content = template.render({'content1': 'abc', 'content2': 'def'}) self.assertEqual(content, 'abcdef') invalidate() content = template.render({'content1': 'ghi', 'content2': 'jkl'}) self.assertEqual(content, 'abcdef') # With the maximum number of arguments template = engines['jinja2'].from_string(""" {%- cache get_last_invalidation('auth.Group', 'cachalot_test', cache_alias=cache), timeout=10, cache_key='cache_key_name', cache_alias=cache -%} {{ content }} {%- endcache -%} """) content = template.render({'content': 'something', 'cache': self.cache_alias2}) self.assertEqual(content, 'something') content = template.render({'content': 'anything', 'cache': self.cache_alias2}) self.assertEqual(content, 'something') invalidate('cachalot_test', cache_alias=DEFAULT_CACHE_ALIAS) content = template.render({'content': 'yet another', 'cache': self.cache_alias2}) self.assertEqual(content, 'something') invalidate('cachalot_test') content = template.render({'content': 'will you change?', 'cache': self.cache_alias2}) self.assertEqual(content, 'will you change?') caches[self.cache_alias2].clear() content = template.render({'content': 'better!', 'cache': self.cache_alias2}) self.assertEqual(content, 'better!') def test_cachalot_disabled_multiple_queries_ignoring_in_mem_cache(self): """ Test that when queries are given the `cachalot_disabled` context manager, the queries will not be cached. """ with cachalot_disabled(True): qs = Test.objects.all() with self.assertNumQueries(1): data1 = list(qs.all()) Test.objects.create( name='test3', owner=self.user, date='1789-07-14', datetime='1789-07-14T16:43:27', permission=self.t1__permission) with self.assertNumQueries(1): data2 = list(qs.all()) self.assertNotEqual(data1, data2) def test_query_cachalot_disabled_even_if_already_cached(self): """ Test that when a query is given the `cachalot_disabled` context manager, the query outside of the context manager will be cached. Any duplicated query will use the original query's cached result. """ qs = Test.objects.all() self.assert_query_cached(qs) with cachalot_disabled() and self.assertNumQueries(0): list(qs.all()) def test_duplicate_query_execute_anyways(self): """After an object is created, a duplicate query should execute rather than use the cached result. """ qs = Test.objects.all() self.assert_query_cached(qs) Test.objects.create( name='test3', owner=self.user, date='1789-07-14', datetime='1789-07-14T16:43:27', permission=self.t1__permission) with cachalot_disabled() and self.assertNumQueries(1): list(qs.all()) class CommandTestCase(TransactionTestCase): multi_db = True databases = "__all__" def setUp(self): self.db_alias2 = next(alias for alias in settings.DATABASES if alias != DEFAULT_DB_ALIAS) self.cache_alias2 = next(alias for alias in settings.CACHES if alias != DEFAULT_CACHE_ALIAS) self.t1 = Test.objects.create(name='test1') self.t2 = Test.objects.using(self.db_alias2).create(name='test2') self.u = User.objects.create_user('test') def test_invalidate_cachalot(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'auth', verbosity=0) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'cachalot', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'cachalot.testchild', verbosity=0) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', 'cachalot.test', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1): self.assertListEqual(list(User.objects.all()), [self.u]) call_command('invalidate_cachalot', 'cachalot.test', 'auth.user', verbosity=0) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1): self.assertListEqual(list(User.objects.all()), [self.u]) @skipIf(len(settings.DATABASES) == 1, 'We can’t change the DB used since there’s only one configured') def test_invalidate_cachalot_multi_db(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0, db_alias=self.db_alias2) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1, using=self.db_alias2): self.assertListEqual(list(Test.objects.using(self.db_alias2)), [self.t2]) call_command('invalidate_cachalot', verbosity=0, db_alias=self.db_alias2) with self.assertNumQueries(1, using=self.db_alias2): self.assertListEqual(list(Test.objects.using(self.db_alias2)), [self.t2]) @skipIf(len(settings.CACHES) == 1, 'We can’t change the cache used since there’s only one configured') def test_invalidate_cachalot_multi_cache(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0, cache_alias=self.cache_alias2) with self.assertNumQueries(0): self.assertListEqual(list(Test.objects.all()), [self.t1]) with self.assertNumQueries(1): with self.settings(CACHALOT_CACHE=self.cache_alias2): self.assertListEqual(list(Test.objects.all()), [self.t1]) call_command('invalidate_cachalot', verbosity=0, cache_alias=self.cache_alias2) with self.assertNumQueries(1): with self.settings(CACHALOT_CACHE=self.cache_alias2): self.assertListEqual(list(Test.objects.all()), [self.t1]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/db_router.py0000644000175100001710000000121700000000000021356 0ustar00runnerdockerfrom django.conf import settings class PostgresRouter(object): @staticmethod def in_postgres(model): app_label = model._meta.app_label model_name = model._meta.model_name return app_label == 'cachalot' and model_name == 'postgresmodel' def get_postgresql_alias(self): return ('postgresql' if 'postgresql' in settings.DATABASES else 'default') def allow_migrate(self, db, app_label, model=None, **hints): if hints.get('extension') in ('hstore', 'unaccent') \ or (model is not None and self.in_postgres(model)): return db == self.get_postgresql_alias() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/debug_toolbar.py0000644000175100001710000000223200000000000022177 0ustar00runnerdockerfrom uuid import UUID from bs4 import BeautifulSoup from django.conf import settings from django.test import LiveServerTestCase, override_settings @override_settings(DEBUG=True) class DebugToolbarTestCase(LiveServerTestCase): databases = set(settings.DATABASES.keys()) def test_rendering(self): # # Rendering toolbar # response = self.client.get('/') self.assertEqual(response.status_code, 200) soup = BeautifulSoup(response.content.decode('utf-8'), 'html.parser') toolbar = soup.find(id='djDebug') self.assertIsNotNone(toolbar) store_id = toolbar.attrs['data-store-id'] # Checks that store_id is a valid UUID. UUID(store_id) render_panel_url = toolbar.attrs['data-render-panel-url'] panel_id = soup.find(title='Cachalot')['class'][0] panel_url = ('%s?store_id=%s&panel_id=%s' % (render_panel_url, store_id, panel_id)) # # Rendering panel # panel_response = self.client.get(panel_url) self.assertEqual(panel_response.status_code, 200) # TODO: Check that the displayed data is correct. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/loaddata_fixture.json0000644000175100001710000000007200000000000023227 0ustar00runnerdocker[{"fields": {"name": "test2"}, "model": "cachalot.test"}] ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/cachalot/tests/migrations/0000755000175100001710000000000000000000000021172 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/migrations/0001_initial.py0000644000175100001710000001126200000000000023637 0ustar00runnerdockerfrom django.conf import settings from django.contrib.postgres.fields import ( ArrayField, HStoreField, IntegerRangeField, DateRangeField, DateTimeRangeField) from django.contrib.postgres.operations import ( HStoreExtension, UnaccentExtension) from django.db import models, migrations def extra_regular_available_fields(): fields = [] try: # TODO Add to module import when Dj40 dropped from django import VERSION as DJANGO_VERSION from django.contrib.postgres.fields import JSONField if float(".".join(map(str, DJANGO_VERSION[:2]))) > 3.0: fields.append(('json', JSONField(null=True, blank=True))) except ImportError: pass return fields def extra_postgres_available_fields(): fields = [] try: # TODO Remove when Dj31 support is dropped from django.contrib.postgres.fields import FloatRangeField fields.append(('float_range', FloatRangeField(null=True, blank=True))) except ImportError: pass try: # TODO Add to module import when Dj31 is dropped from django.contrib.postgres.fields import DecimalRangeField fields.append(('decimal_range', DecimalRangeField(null=True, blank=True))) except ImportError: pass # Future proofing with Django 40 deprecation try: # TODO Remove when Dj40 support is dropped from django.contrib.postgres.fields import JSONField fields.append(('json', JSONField(null=True, blank=True))) except ImportError: pass return fields class Migration(migrations.Migration): dependencies = [ ('auth', '0001_initial'), migrations.swappable_dependency(settings.AUTH_USER_MODEL), ] operations = [ migrations.CreateModel( name='Test', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('name', models.CharField(max_length=20)), ('public', models.BooleanField(default=False)), ('date', models.DateField(null=True, blank=True)), ('datetime', models.DateTimeField(null=True, blank=True)), ('owner', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True, on_delete=models.SET_NULL)), ('permission', models.ForeignKey(blank=True, to='auth.Permission', null=True, on_delete=models.PROTECT)), ('a_float', models.FloatField(null=True, blank=True)), ('a_decimal', models.DecimalField(null=True, blank=True, max_digits=5, decimal_places=2)), ('bin', models.BinaryField(null=True, blank=True)), ('ip', models.GenericIPAddressField(null=True, blank=True)), ('duration', models.DurationField(null=True, blank=True)), ('uuid', models.UUIDField(null=True, blank=True)), ] + extra_regular_available_fields(), options={ 'ordering': ('name',), }, ), migrations.CreateModel( name='TestParent', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('name', models.CharField(max_length=20)), ], ), migrations.CreateModel( name='TestChild', fields=[ ('testparent_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='cachalot.TestParent', on_delete=models.CASCADE)), ('public', models.BooleanField(default=False)), ('permissions', models.ManyToManyField('auth.Permission', blank=True)) ], bases=('cachalot.testparent',), ), HStoreExtension(), UnaccentExtension(), migrations.CreateModel( name='PostgresModel', fields=[ ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)), ('int_array', ArrayField( models.IntegerField(null=True, blank=True), size=3, null=True, blank=True)), ('hstore', HStoreField(null=True, blank=True)), ('int_range', IntegerRangeField(null=True, blank=True)), ('date_range', DateRangeField(null=True, blank=True)), ('datetime_range', DateTimeRangeField(null=True, blank=True)), ] + extra_postgres_available_fields(), ), migrations.RunSQL('CREATE TABLE cachalot_unmanagedmodel ' '(id SERIAL PRIMARY KEY, name VARCHAR(50));'), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/migrations/__init__.py0000644000175100001710000000000000000000000023271 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/models.py0000644000175100001710000000537700000000000020667 0ustar00runnerdockerfrom django.conf import settings from django.contrib.postgres.fields import ( ArrayField, HStoreField, IntegerRangeField, DateRangeField, DateTimeRangeField) from django.db.models import ( Model, CharField, ForeignKey, BooleanField, DateField, DateTimeField, ManyToManyField, BinaryField, IntegerField, GenericIPAddressField, FloatField, DecimalField, DurationField, UUIDField, SET_NULL, PROTECT) class Test(Model): name = CharField(max_length=20) owner = ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True, on_delete=SET_NULL) public = BooleanField(default=False) date = DateField(null=True, blank=True) datetime = DateTimeField(null=True, blank=True) permission = ForeignKey('auth.Permission', null=True, blank=True, on_delete=PROTECT) # We can’t use the exact names `float` or `decimal` as database column name # since it fails on MySQL. a_float = FloatField(null=True, blank=True) a_decimal = DecimalField(null=True, blank=True, max_digits=5, decimal_places=2) bin = BinaryField(null=True, blank=True) ip = GenericIPAddressField(null=True, blank=True) duration = DurationField(null=True, blank=True) uuid = UUIDField(null=True, blank=True) try: from django.db.models import JSONField json = JSONField(null=True, blank=True) except ImportError: pass class Meta: ordering = ('name',) class TestParent(Model): name = CharField(max_length=20) class TestChild(TestParent): public = BooleanField(default=False) permissions = ManyToManyField('auth.Permission', blank=True) class PostgresModel(Model): int_array = ArrayField(IntegerField(null=True, blank=True), size=3, null=True, blank=True) hstore = HStoreField(null=True, blank=True) try: from django.contrib.postgres.fields import JSONField json = JSONField(null=True, blank=True) except ImportError: pass int_range = IntegerRangeField(null=True, blank=True) try: from django.contrib.postgres.fields import FloatRangeField float_range = FloatRangeField(null=True, blank=True) except ImportError: pass try: from django.contrib.postgres.fields import DecimalRangeField decimal_range = DecimalRangeField(null=True, blank=True) except ImportError: pass date_range = DateRangeField(null=True, blank=True) datetime_range = DateTimeRangeField(null=True, blank=True) class Meta: # Tests schema name in table name. db_table = '"public"."cachalot_postgresmodel"' class UnmanagedModel(Model): name = CharField(max_length=50) class Meta: managed = False ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/multi_db.py0000644000175100001710000001307600000000000021176 0ustar00runnerdockerfrom unittest import skipIf from django import VERSION as DJANGO_VERSION from django.conf import settings from django.db import DEFAULT_DB_ALIAS, connections, transaction from django.test import TransactionTestCase from .models import Test @skipIf(len(settings.DATABASES) == 1, 'We can’t change the DB used since there’s only one configured') class MultiDatabaseTestCase(TransactionTestCase): multi_db = True databases = "__all__" def setUp(self): self.t1 = Test.objects.create(name='test1') self.t2 = Test.objects.create(name='test2') self.db_alias2 = next(alias for alias in settings.DATABASES if alias != DEFAULT_DB_ALIAS) connection2 = connections[self.db_alias2] self.is_sqlite2 = connection2.vendor == 'sqlite' self.is_mysql2 = connection2.vendor == 'mysql' if connection2.vendor in ('mysql', 'postgresql'): # We need to reopen the connection or Django # will execute an extra SQL request below. connection2.cursor() def is_django_21_below_and_sqlite2(self): """ Note: See test_utils.py with this function name Checks if Django 2.1 or below and SQLite2 """ django_version = DJANGO_VERSION if not self.is_sqlite2: # Immediately know if SQLite return False if django_version[0] < 2: # Takes Django 0 and 1 out of the picture return True else: if django_version[0] == 2 and django_version[1] < 2: # Takes Django 2.0-2.1 out return True return False def test_read(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, [self.t1, self.t2]) with self.assertNumQueries(1, using=self.db_alias2): data2 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data2, []) with self.assertNumQueries(0, using=self.db_alias2): data3 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data3, []) def test_invalidate_other_db(self): """ Tests if the non-default database is invalidated when modified. """ with self.assertNumQueries(1, using=self.db_alias2): data1 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data1, []) with self.assertNumQueries(2 if self.is_django_21_below_and_sqlite2() else 1, using=self.db_alias2): t3 = Test.objects.using(self.db_alias2).create(name='test3') with self.assertNumQueries(1, using=self.db_alias2): data2 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data2, [t3]) def test_invalidation_independence(self): """ Tests if invalidation doesn’t affect the unmodified databases. """ with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, [self.t1, self.t2]) with self.assertNumQueries(2 if self.is_django_21_below_and_sqlite2() else 1, using=self.db_alias2): Test.objects.using(self.db_alias2).create(name='test3') with self.assertNumQueries(0): data2 = list(Test.objects.all()) self.assertListEqual(data2, [self.t1, self.t2]) def test_heterogeneous_atomics(self): """ Checks that an atomic block for a database nested inside another atomic block for another database has no impact on their caching. """ with transaction.atomic(): with transaction.atomic(self.db_alias2): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, [self.t1, self.t2]) with self.assertNumQueries(1, using=self.db_alias2): data2 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data2, []) t3 = Test.objects.using(self.db_alias2).create(name='test3') with self.assertNumQueries(1, using=self.db_alias2): data3 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data3, [t3]) with self.assertNumQueries(0): data4 = list(Test.objects.all()) self.assertListEqual(data4, [self.t1, self.t2]) with self.assertNumQueries(1): data5 = list(Test.objects.filter(name='test3')) self.assertListEqual(data5, []) def test_heterogeneous_atomics_independence(self): """ Checks that interrupting an atomic block after the commit of another atomic block for another database nested inside it correctly invalidates the cache for the committed transaction. """ with self.assertNumQueries(1, using=self.db_alias2): data1 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data1, []) try: with transaction.atomic(): with transaction.atomic(self.db_alias2): t3 = Test.objects.using( self.db_alias2).create(name='test3') raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(1, using=self.db_alias2): data2 = list(Test.objects.using(self.db_alias2)) self.assertListEqual(data2, [t3]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/postgres.py0000644000175100001710000004167700000000000021255 0ustar00runnerdockerfrom datetime import date, datetime from decimal import Decimal from unittest import skipUnless from django.contrib.postgres.functions import TransactionNow from django.db import connection from django.test import TransactionTestCase, override_settings from psycopg2.extras import NumericRange, DateRange, DateTimeTZRange from pytz import timezone from ..utils import UncachableQuery from .api import invalidate from .models import PostgresModel, Test from .test_utils import TestUtilsMixin # FIXME: Add tests for aggregations. def is_pg_field_available(name): fields = [] try: from django.contrib.postgres.fields import FloatRangeField fields.append("FloatRangeField") except ImportError: pass try: from django.contrib.postgres.fields import DecimalRangeField fields.append("DecimalRangeField") except ImportError: pass try: from django import VERSION from django.contrib.postgres.fields import JSONField if VERSION[0] < 4: fields.append("JSONField") except ImportError: pass return name in fields @skipUnless(connection.vendor == 'postgresql', 'This test is only for PostgreSQL') @override_settings(USE_TZ=True) class PostgresReadTestCase(TestUtilsMixin, TransactionTestCase): def setUp(self): self.obj1 = PostgresModel( int_array=[1, 2, 3], hstore={'a': 'b', 'c': None}, int_range=[1900, 2000], date_range=['1678-03-04', '1741-07-28'], datetime_range=[ datetime(1989, 1, 30, 12, 20, tzinfo=timezone('Europe/Paris')), None ] ) self.obj2 = PostgresModel( int_array=[4, None, 6], hstore={'a': '1', 'b': '2'}, int_range=[1989, None], date_range=['1989-01-30', None], datetime_range=[None, None]) if is_pg_field_available("JSONField"): self.obj1.json = {'a': 1, 'b': 2} self.obj2.json = [ 'something', { 'a': 1, 'b': None, 'c': 123.456, 'd': True, 'e': { 'another': 'dict', 'and yet': { 'another': 'one', 'with a list': [], }, }, }, ] if is_pg_field_available("FloatRangeField"): self.obj1.float_range = [-1e3, 9.87654321] self.obj2.float_range = [0.0, None] if is_pg_field_available("DecimalRangeField"): self.obj1.decimal_range = [-1e3, 9.87654321] self.obj2.decimal_range = [0.0, None] self.obj1.save() self.obj2.save() def test_unaccent(self): Test.objects.create(name='Clémentine') Test.objects.create(name='Clementine') qs = (Test.objects.filter(name__unaccent='Clémentine') .values_list('name', flat=True)) self.assert_tables(qs, Test) self.assert_query_cached(qs, ['Clementine', 'Clémentine']) def test_int_array(self): with self.assertNumQueries(1): data1 = [o.int_array for o in PostgresModel.objects.all()] with self.assertNumQueries(1): data2 = list(PostgresModel.objects .values_list('int_array', flat=True)) self.assertListEqual(data2, data1) self.assertListEqual(data2, [[1, 2, 3], [4, None, 6]]) invalidate(PostgresModel) qs = PostgresModel.objects.values_list('int_array', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[1, 2, 3], [4, None, 6]]) qs = (PostgresModel.objects.filter(int_array__contains=[3]) .values_list('int_array', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[1, 2, 3]]) qs = (PostgresModel.objects .filter(int_array__contained_by=[1, 2, 3, 4, 5, 6]) .values_list('int_array', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[1, 2, 3]]) qs = (PostgresModel.objects.filter(int_array__overlap=[3, 4]) .values_list('int_array', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[1, 2, 3], [4, None, 6]]) qs = (PostgresModel.objects.filter(int_array__len__in=(2, 3)) .values_list('int_array', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[1, 2, 3], [4, None, 6]]) qs = (PostgresModel.objects.filter(int_array__2=6) .values_list('int_array', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[4, None, 6]]) qs = (PostgresModel.objects.filter(int_array__0_2=(1, 2)) .values_list('int_array', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [[1, 2, 3]]) def test_hstore(self): with self.assertNumQueries(1): data1 = [o.hstore for o in PostgresModel.objects.all()] with self.assertNumQueries(1): data2 = list(PostgresModel.objects .values_list('hstore', flat=True)) self.assertListEqual(data2, data1) self.assertListEqual(data2, [{'a': 'b', 'c': None}, {'a': '1', 'b': '2'}]) invalidate(PostgresModel) qs = PostgresModel.objects.values_list('hstore', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': 'b', 'c': None}, {'a': '1', 'b': '2'}]) qs = (PostgresModel.objects.filter(hstore__a='1') .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': '1', 'b': '2'}]) qs = (PostgresModel.objects.filter(hstore__contains={'a': 'b'}) .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': 'b', 'c': None}]) qs = (PostgresModel.objects .filter(hstore__contained_by={'a': 'b', 'c': None, 'b': '2'}) .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': 'b', 'c': None}]) qs = (PostgresModel.objects.filter(hstore__has_key='c') .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': 'b', 'c': None}]) qs = (PostgresModel.objects.filter(hstore__has_keys=['a', 'b']) .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': '1', 'b': '2'}]) qs = (PostgresModel.objects.filter(hstore__keys=['a', 'b']) .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': '1', 'b': '2'}]) qs = (PostgresModel.objects.filter(hstore__values=['1', '2']) .values_list('hstore', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [{'a': '1', 'b': '2'}]) @skipUnless(is_pg_field_available("JSONField"), "JSONField was removed in Dj 4.0") def test_json(self): with self.assertNumQueries(1): data1 = [o.json for o in PostgresModel.objects.all()] with self.assertNumQueries(1): data2 = list(PostgresModel.objects.values_list('json', flat=True)) self.assertListEqual(data2, data1) self.assertListEqual(data2, [self.obj1.json, self.obj2.json]) invalidate(PostgresModel) qs = PostgresModel.objects.values_list('json', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj1.json, self.obj2.json]) # Tests an index. qs = (PostgresModel.objects.filter(json__0='something') .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj2.json]) qs = (PostgresModel.objects .filter(json__0__nonexistent_key='something') .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, []) # Tests a path with spaces. qs = (PostgresModel.objects .filter(**{'json__1__e__and yet__another': 'one'}) .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj2.json]) qs = (PostgresModel.objects.filter(json__contains=['something']) .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj2.json]) qs = (PostgresModel.objects .filter(json__contained_by={'a': 1, 'b': 2, 'any': 'thing'}) .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj1.json]) qs = (PostgresModel.objects.filter(json__has_key='a') .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj1.json]) qs = (PostgresModel.objects.filter(json__has_any_keys=['a', 'b', 'c']) .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj1.json]) qs = (PostgresModel.objects.filter(json__has_keys=['a', 'b']) .values_list('json', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [self.obj1.json]) @skipUnless(is_pg_field_available("JSONField"), "JSONField was removed in Dj 4.0") def test_mutable_result_change(self): """ Checks that changing a mutable returned by a query has no effect on other executions of the query. """ qs = PostgresModel.objects.values_list('int_array', flat=True) data = list(qs.all()) self.assertListEqual(data, [[1, 2, 3], [4, None, 6]]) data[0].append(4) data[1].remove(4) data[1][0] = 5 self.assertListEqual(data, [[1, 2, 3, 4], [5, 6]]) self.assertListEqual(list(qs.all()), [[1, 2, 3], [4, None, 6]]) qs = PostgresModel.objects.values_list('json', flat=True) data = list(qs.all()) self.assertListEqual(data, [self.obj1.json, self.obj2.json]) data[0]['c'] = 3 del data[0]['b'] data[1].pop(0) data[1][0]['e']['and yet']['some other'] = True data[1][0]['f'] = 6 json1 = {'a': 1, 'c': 3} json2 = [ { 'a': 1, 'b': None, 'c': 123.456, 'd': True, 'e': { 'another': 'dict', 'and yet': { 'another': 'one', 'with a list': [], 'some other': True }, }, 'f': 6 }, ] self.assertListEqual(data, [json1, json2]) self.assertListEqual(list(qs.all()), [self.obj1.json, self.obj2.json]) def test_int_range(self): with self.assertNumQueries(1): data1 = [o.int_range for o in PostgresModel.objects.all()] with self.assertNumQueries(1): data2 = list(PostgresModel.objects .values_list('int_range', flat=True)) self.assertListEqual(data2, data1) self.assertListEqual(data2, [NumericRange(1900, 2000), NumericRange(1989)]) invalidate(PostgresModel) qs = PostgresModel.objects.values_list('int_range', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1900, 2000), NumericRange(1989)]) qs = (PostgresModel.objects.filter(int_range__contains=2015) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1989)]) qs = (PostgresModel.objects .filter(int_range__contains=NumericRange(1950, 1990)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1900, 2000)]) qs = (PostgresModel.objects .filter(int_range__contained_by=NumericRange(0, 2050)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1900, 2000)]) qs = (PostgresModel.objects.filter(int_range__fully_lt=(2015, None)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1900, 2000)]) qs = (PostgresModel.objects.filter(int_range__fully_gt=(1970, 1980)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1989)]) qs = (PostgresModel.objects.filter(int_range__not_lt=(1970, 1980)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1989)]) qs = (PostgresModel.objects.filter(int_range__not_gt=(1970, 1980)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, []) qs = (PostgresModel.objects.filter(int_range__adjacent_to=(1900, 1989)) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1989)]) qs = (PostgresModel.objects.filter(int_range__startswith=1900) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1900, 2000)]) qs = (PostgresModel.objects.filter(int_range__endswith=2000) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(1900, 2000)]) PostgresModel.objects.create(int_range=[1900, 1900]) qs = (PostgresModel.objects.filter(int_range__isempty=True) .values_list('int_range', flat=True)) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [NumericRange(empty=True)]) @skipUnless(is_pg_field_available("FloatRangeField"), "FloatRangeField was removed in Dj 3.1") def test_float_range(self): qs = PostgresModel.objects.values_list('float_range', flat=True) self.assert_tables(qs, PostgresModel) # For a strange reason, probably a misconception in psycopg2 # or a bad name in django.contrib.postgres (less probable), # FloatRange returns decimals instead of floats. # Note from ACW: crisis averted, renamed to DecimalRangeField self.assert_query_cached(qs, [ NumericRange(Decimal('-1000.0'), Decimal('9.87654321')), NumericRange(Decimal('0.0'))]) @skipUnless(is_pg_field_available("DecimalRangeField"), "DecimalRangeField was added in Dj 2.2") def test_decimal_range(self): qs = PostgresModel.objects.values_list('decimal_range', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [ NumericRange(Decimal('-1000.0'), Decimal('9.87654321')), NumericRange(Decimal('0.0'))]) def test_date_range(self): qs = PostgresModel.objects.values_list('date_range', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [ DateRange(date(1678, 3, 4), date(1741, 7, 28)), DateRange(date(1989, 1, 30))]) def test_datetime_range(self): qs = PostgresModel.objects.values_list('datetime_range', flat=True) self.assert_tables(qs, PostgresModel) self.assert_query_cached(qs, [ DateTimeTZRange(datetime(1989, 1, 30, 12, 20, tzinfo=timezone('Europe/Paris'))), DateTimeTZRange(bounds='()')]) def test_transaction_now(self): """ Checks that queries with a TransactionNow() parameter are not cached. """ obj = Test.objects.create(datetime='1992-07-02T12:00:00') qs = Test.objects.filter(datetime__lte=TransactionNow()) with self.assertRaises(UncachableQuery): self.assert_tables(qs, Test) self.assert_query_cached(qs, [obj], after=1) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/read.py0000644000175100001710000012747600000000000020324 0ustar00runnerdockerimport datetime from unittest import skipIf from uuid import UUID from decimal import Decimal from django import VERSION as django_version from django.contrib.auth.models import Group, Permission, User from django.contrib.contenttypes.models import ContentType from django.db import ( connection, transaction, DEFAULT_DB_ALIAS, ProgrammingError, OperationalError) from django.db.models import Case, Count, Q, Value, When from django.db.models.expressions import RawSQL, Subquery, OuterRef, Exists from django.db.models.functions import Coalesce, Now from django.db.transaction import TransactionManagementError from django.test import TransactionTestCase, skipUnlessDBFeature, override_settings from pytz import UTC from cachalot.cache import cachalot_caches from ..settings import cachalot_settings from ..utils import UncachableQuery from .models import Test, TestChild, TestParent, UnmanagedModel from .test_utils import TestUtilsMixin def is_field_available(name): fields = [] try: from django.db.models import JSONField fields.append("JSONField") except ImportError: pass return name in fields class ReadTestCase(TestUtilsMixin, TransactionTestCase): """ Tests if every SQL request that only reads data is cached. The only exception is for requests that don’t go through the ORM, using ``QuerySet.extra`` with ``select`` or ``where`` arguments, ``Model.objects.raw``, or ``cursor.execute``. """ def setUp(self): super(ReadTestCase, self).setUp() self.group = Group.objects.create(name='test_group') self.group__permissions = list(Permission.objects.all()[:3]) self.group.permissions.add(*self.group__permissions) self.user = User.objects.create_user('user') self.user__permissions = list(Permission.objects.all()[3:6]) self.user.groups.add(self.group) self.user.user_permissions.add(*self.user__permissions) self.admin = User.objects.create_superuser('admin', 'admin@test.me', 'password') self.t1__permission = (Permission.objects.order_by('?') .select_related('content_type')[0]) self.t1 = Test.objects.create( name='test1', owner=self.user, date='1789-07-14', datetime='1789-07-14T16:43:27', permission=self.t1__permission) self.t2 = Test.objects.create( name='test2', owner=self.admin, public=True, date='1944-06-06', datetime='1944-06-06T06:35:00') def test_empty(self): with self.assertNumQueries(0): data1 = list(Test.objects.none()) with self.assertNumQueries(0): data2 = list(Test.objects.none()) self.assertListEqual(data2, data1) self.assertListEqual(data2, []) def test_exists(self): with self.assertNumQueries(1): n1 = Test.objects.exists() with self.assertNumQueries(0): n2 = Test.objects.exists() self.assertEqual(n2, n1) self.assertTrue(n2) def test_count(self): with self.assertNumQueries(1): n1 = Test.objects.count() with self.assertNumQueries(0): n2 = Test.objects.count() self.assertEqual(n2, n1) self.assertEqual(n2, 2) def test_get(self): with self.assertNumQueries(1): data1 = Test.objects.get(name='test1') with self.assertNumQueries(0): data2 = Test.objects.get(name='test1') self.assertEqual(data2, data1) self.assertEqual(data2, self.t1) def test_first(self): with self.assertNumQueries(1): self.assertEqual(Test.objects.filter(name='bad').first(), None) with self.assertNumQueries(0): self.assertEqual(Test.objects.filter(name='bad').first(), None) with self.assertNumQueries(1): data1 = Test.objects.first() with self.assertNumQueries(0): data2 = Test.objects.first() self.assertEqual(data2, data1) self.assertEqual(data2, self.t1) def test_last(self): with self.assertNumQueries(1): data1 = Test.objects.last() with self.assertNumQueries(0): data2 = Test.objects.last() self.assertEqual(data2, data1) self.assertEqual(data2, self.t2) def test_all(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) with self.assertNumQueries(0): data2 = list(Test.objects.all()) self.assertListEqual(data2, data1) self.assertListEqual(data2, [self.t1, self.t2]) def test_filter(self): qs = Test.objects.filter(public=True) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t2]) qs = Test.objects.filter(name__in=['test2', 'test72']) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t2]) qs = Test.objects.filter(date__gt=datetime.date(1900, 1, 1)) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t2]) qs = Test.objects.filter(datetime__lt=datetime.datetime(1900, 1, 1)) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t1]) def test_filter_empty(self): qs = Test.objects.filter(public=True, name='user') self.assert_tables(qs, Test) self.assert_query_cached(qs, []) def test_exclude(self): qs = Test.objects.exclude(public=True) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t1]) qs = Test.objects.exclude(name__in=['test2', 'test72']) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t1]) def test_slicing(self): qs = Test.objects.all()[:1] self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t1]) def test_order_by(self): qs = Test.objects.order_by('pk') self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t1, self.t2]) qs = Test.objects.order_by('-name') self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t2, self.t1]) def test_random_order_by(self): qs = Test.objects.order_by('?') with self.assertRaises(UncachableQuery): self.assert_tables(qs, Test) self.assert_query_cached(qs, after=1, compare_results=False) @skipIf(connection.vendor == 'mysql', 'MySQL does not support limit/offset on a subquery. ' 'Since Django only applies ordering in subqueries when they are ' 'offset/limited, we can’t test it on MySQL.') def test_random_order_by_subquery(self): qs = Test.objects.filter( pk__in=Test.objects.order_by('?')[:10]) with self.assertRaises(UncachableQuery): self.assert_tables(qs, Test) self.assert_query_cached(qs, after=1, compare_results=False) def test_reverse(self): qs = Test.objects.reverse() self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t2, self.t1]) def test_distinct(self): # We ensure that the query without distinct should return duplicate # objects, in order to have a real-world example. qs = Test.objects.filter( owner__user_permissions__content_type__app_label='auth') self.assert_tables(qs, Test, User, User.user_permissions.through, Permission, ContentType) self.assert_query_cached(qs, [self.t1, self.t1, self.t1]) qs = qs.distinct() self.assert_tables(qs, Test, User, User.user_permissions.through, Permission, ContentType) self.assert_query_cached(qs, [self.t1]) def test_iterator(self): with self.assertNumQueries(1): data1 = list(Test.objects.iterator()) with self.assertNumQueries(0): data2 = list(Test.objects.iterator()) self.assertListEqual(data2, data1) self.assertListEqual(data2, [self.t1, self.t2]) def test_in_bulk(self): with self.assertNumQueries(1): data1 = Test.objects.in_bulk((5432, self.t2.pk, 9200)) with self.assertNumQueries(0): data2 = Test.objects.in_bulk((5432, self.t2.pk, 9200)) self.assertDictEqual(data2, data1) self.assertDictEqual(data2, {self.t2.pk: self.t2}) def test_values(self): qs = Test.objects.values('name', 'public') self.assert_tables(qs, Test) self.assert_query_cached(qs, [{'name': 'test1', 'public': False}, {'name': 'test2', 'public': True}]) def test_values_list(self): qs = Test.objects.values_list('name', flat=True) self.assert_tables(qs, Test) self.assert_query_cached(qs, ['test1', 'test2']) def test_earliest(self): with self.assertNumQueries(1): data1 = Test.objects.earliest('date') with self.assertNumQueries(0): data2 = Test.objects.earliest('date') self.assertEqual(data2, data1) self.assertEqual(data2, self.t1) def test_latest(self): with self.assertNumQueries(1): data1 = Test.objects.latest('date') with self.assertNumQueries(0): data2 = Test.objects.latest('date') self.assertEqual(data2, data1) self.assertEqual(data2, self.t2) def test_dates(self): qs = Test.objects.dates('date', 'year') self.assert_tables(qs, Test) self.assert_query_cached(qs, [datetime.date(1789, 1, 1), datetime.date(1944, 1, 1)]) def test_datetimes(self): qs = Test.objects.datetimes('datetime', 'hour') self.assert_tables(qs, Test) self.assert_query_cached(qs, [datetime.datetime(1789, 7, 14, 16), datetime.datetime(1944, 6, 6, 6)]) @skipIf(connection.vendor == 'mysql', 'Time zones are not supported by MySQL.') @override_settings(USE_TZ=True) def test_datetimes_with_time_zones(self): qs = Test.objects.datetimes('datetime', 'hour') self.assert_tables(qs, Test) self.assert_query_cached(qs, [ datetime.datetime(1789, 7, 14, 16, tzinfo=UTC), datetime.datetime(1944, 6, 6, 6, tzinfo=UTC)]) def test_foreign_key(self): with self.assertNumQueries(3): data1 = [t.owner for t in Test.objects.all()] with self.assertNumQueries(0): data2 = [t.owner for t in Test.objects.all()] self.assertListEqual(data2, data1) self.assertListEqual(data2, [self.user, self.admin]) qs = Test.objects.values_list('owner', flat=True) self.assert_tables(qs, Test, User) self.assert_query_cached(qs, [self.user.pk, self.admin.pk]) def test_many_to_many(self): u = User.objects.create_user('test_user') ct = ContentType.objects.get_for_model(User) u.user_permissions.add( Permission.objects.create( name='Can discuss', content_type=ct, codename='discuss'), Permission.objects.create( name='Can touch', content_type=ct, codename='touch'), Permission.objects.create( name='Can cuddle', content_type=ct, codename='cuddle')) qs = u.user_permissions.values_list('codename', flat=True) self.assert_tables(qs, User, User.user_permissions.through, Permission) self.assert_query_cached(qs, ['cuddle', 'discuss', 'touch']) def test_subquery(self): qs = Test.objects.filter(owner__in=User.objects.all()) self.assert_tables(qs, Test, User) self.assert_query_cached(qs, [self.t1, self.t2]) qs = Test.objects.filter( owner__groups__permissions__in=Permission.objects.all()) self.assert_tables(qs, Test, User, User.groups.through, Group, Group.permissions.through, Permission) self.assert_query_cached(qs, [self.t1, self.t1, self.t1]) qs = Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct() self.assert_tables(qs, Test, User, User.groups.through, Group, Group.permissions.through, Permission) self.assert_query_cached(qs, [self.t1]) qs = TestChild.objects.exclude(permissions__isnull=True) self.assert_tables(qs, TestParent, TestChild, TestChild.permissions.through, Permission) self.assert_query_cached(qs, []) qs = TestChild.objects.exclude(permissions__name='') self.assert_tables(qs, TestParent, TestChild, TestChild.permissions.through, Permission) self.assert_query_cached(qs, []) def test_custom_subquery(self): tests = Test.objects.filter(permission=OuterRef('pk')).values('name') qs = Permission.objects.annotate(first_permission=Subquery(tests[:1])) self.assert_tables(qs, Permission, Test) self.assert_query_cached(qs, list(Permission.objects.all())) def test_custom_subquery_exists(self): tests = Test.objects.filter(permission=OuterRef('pk')) qs = Permission.objects.annotate(has_tests=Exists(tests)) self.assert_tables(qs, Permission, Test) self.assert_query_cached(qs, list(Permission.objects.all())) def test_raw_subquery(self): with self.assertNumQueries(0): raw_sql = RawSQL('SELECT id FROM auth_permission WHERE id = %s', (self.t1__permission.pk,)) qs = Test.objects.filter(permission=raw_sql) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs, [self.t1]) qs = Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql)) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs, [self.t1]) def test_aggregate(self): Test.objects.create(name='test3', owner=self.user) with self.assertNumQueries(1): n1 = User.objects.aggregate(n=Count('test'))['n'] with self.assertNumQueries(0): n2 = User.objects.aggregate(n=Count('test'))['n'] self.assertEqual(n2, n1) self.assertEqual(n2, 3) def test_annotate(self): Test.objects.create(name='test3', owner=self.user) qs = (User.objects.annotate(n=Count('test')).order_by('pk') .values_list('n', flat=True)) self.assert_tables(qs, User, Test) self.assert_query_cached(qs, [2, 1]) def test_annotate_subquery(self): tests = Test.objects.filter(owner=OuterRef('pk')).values('name') qs = User.objects.annotate(first_test=Subquery(tests[:1])) self.assert_tables(qs, User, Test) self.assert_query_cached(qs, [self.user, self.admin]) def test_annotate_case_with_when_and_query_in_default(self): tests = Test.objects.filter(owner=OuterRef('pk')).values('name') qs = User.objects.annotate( first_test=Case( When(Q(pk=1), then=Value('noname')), default=Subquery(tests[:1]) ) ) self.assert_tables(qs, User, Test) self.assert_query_cached(qs, [self.user, self.admin]) def test_annotate_case_with_when(self): tests = Test.objects.filter(owner=OuterRef('pk')).values('name') qs = User.objects.annotate( first_test=Case( When(Q(pk=1), then=Subquery(tests[:1])), default=Value('noname') ) ) self.assert_tables(qs, User, Test) self.assert_query_cached(qs, [self.user, self.admin]) def test_annotate_coalesce(self): tests = Test.objects.filter(owner=OuterRef('pk')).values('name') qs = User.objects.annotate( name=Coalesce( Subquery(tests[:1]), Value('notest') ) ) self.assert_tables(qs, User, Test) self.assert_query_cached(qs, [self.user, self.admin]) def test_annotate_raw(self): qs = User.objects.annotate( perm_id=RawSQL('SELECT id FROM auth_permission WHERE id = %s', (self.t1__permission.pk,)) ) self.assert_tables(qs, User, Permission) self.assert_query_cached(qs, [self.user, self.admin]) def test_only(self): with self.assertNumQueries(1): t1 = Test.objects.only('name').first() t1.name with self.assertNumQueries(0): t2 = Test.objects.only('name').first() t2.name with self.assertNumQueries(1): t1.public with self.assertNumQueries(0): t2.public self.assertEqual(t2, t1) self.assertEqual(t2.name, t1.name) self.assertEqual(t2.public, t1.public) def test_defer(self): with self.assertNumQueries(1): t1 = Test.objects.defer('name').first() t1.public with self.assertNumQueries(0): t2 = Test.objects.defer('name').first() t2.public with self.assertNumQueries(1): t1.name with self.assertNumQueries(0): t2.name self.assertEqual(t2, t1) self.assertEqual(t2.name, t1.name) self.assertEqual(t2.public, t1.public) def test_select_related(self): # Simple select_related with self.assertNumQueries(1): t1 = Test.objects.select_related('owner').get(name='test1') self.assertEqual(t1.owner, self.user) with self.assertNumQueries(0): t2 = Test.objects.select_related('owner').get(name='test1') self.assertEqual(t2.owner, self.user) self.assertEqual(t2, t1) self.assertEqual(t2, self.t1) # Select_related through a foreign key with self.assertNumQueries(1): t3 = Test.objects.select_related('permission__content_type')[0] self.assertEqual(t3.permission, self.t1.permission) self.assertEqual(t3.permission.content_type, self.t1__permission.content_type) with self.assertNumQueries(0): t4 = Test.objects.select_related('permission__content_type')[0] self.assertEqual(t4.permission, self.t1.permission) self.assertEqual(t4.permission.content_type, self.t1__permission.content_type) self.assertEqual(t4, t3) self.assertEqual(t4, self.t1) def test_prefetch_related(self): # Simple prefetch_related with self.assertNumQueries(2): data1 = list(User.objects.prefetch_related('user_permissions')) with self.assertNumQueries(0): permissions1 = [p for u in data1 for p in u.user_permissions.all()] with self.assertNumQueries(0): data2 = list(User.objects.prefetch_related('user_permissions')) permissions2 = [p for u in data2 for p in u.user_permissions.all()] self.assertListEqual(permissions2, permissions1) self.assertListEqual(permissions2, self.user__permissions) # Prefetch_related through a foreign key where exactly # the same prefetch_related SQL request was executed before with self.assertNumQueries(1): data3 = list(Test.objects.select_related('owner') .prefetch_related('owner__user_permissions')) with self.assertNumQueries(0): permissions3 = [p for t in data3 for p in t.owner.user_permissions.all()] with self.assertNumQueries(0): data4 = list(Test.objects.select_related('owner') .prefetch_related('owner__user_permissions')) permissions4 = [p for t in data4 for p in t.owner.user_permissions.all()] self.assertListEqual(permissions4, permissions3) self.assertListEqual(permissions4, self.user__permissions) # Prefetch_related through a foreign key where exactly # the same prefetch_related SQL request was not fetched before with self.assertNumQueries(2): data5 = list(Test.objects .select_related('owner') .prefetch_related('owner__user_permissions')[:1]) with self.assertNumQueries(0): permissions5 = [p for t in data5 for p in t.owner.user_permissions.all()] with self.assertNumQueries(0): data6 = list(Test.objects.select_related('owner') .prefetch_related('owner__user_permissions')[:1]) permissions6 = [p for t in data6 for p in t.owner.user_permissions.all()] self.assertListEqual(permissions6, permissions5) self.assertListEqual(permissions6, self.user__permissions) # Prefetch_related through a many to many with self.assertNumQueries(2): data7 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) with self.assertNumQueries(0): permissions7 = [p for t in data7 for g in t.owner.groups.all() for p in g.permissions.all()] with self.assertNumQueries(0): data8 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) permissions8 = [p for t in data8 for g in t.owner.groups.all() for p in g.permissions.all()] self.assertListEqual(permissions8, permissions7) self.assertListEqual(permissions8, self.group__permissions) def test_filtered_relation(self): from django.db.models import FilteredRelation qs = TestChild.objects.annotate( filtered_permissions=FilteredRelation( 'permissions', condition=Q(permissions__pk__gt=1))) self.assert_tables(qs, TestChild) self.assert_query_cached(qs) values_qs = qs.values('filtered_permissions') self.assert_tables( values_qs, TestChild, TestChild.permissions.through, Permission) self.assert_query_cached(values_qs) filtered_qs = qs.filter(filtered_permissions__pk__gt=2) self.assert_tables( values_qs, TestChild, TestChild.permissions.through, Permission) self.assert_query_cached(filtered_qs) @skipUnlessDBFeature('supports_select_union') def test_union(self): qs = (Test.objects.filter(pk__lt=5) | Test.objects.filter(permission__name__contains='a')) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs) with self.assertRaisesMessage( AssertionError, 'Cannot combine queries on two different base models.'): Test.objects.all() | Permission.objects.all() qs = Test.objects.filter(pk__lt=5) sub_qs = Test.objects.filter(permission__name__contains='a') if self.is_sqlite: qs = qs.order_by() sub_qs = sub_qs.order_by() qs = qs.union(sub_qs) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs) qs = Test.objects.all() sub_qs = Permission.objects.all() if self.is_sqlite: qs = qs.order_by() sub_qs = sub_qs.order_by() qs = qs.union(sub_qs) self.assert_tables(qs, Test, Permission) with self.assertRaises((ProgrammingError, OperationalError)): self.assert_query_cached(qs) @skipUnlessDBFeature('supports_select_intersection') def test_intersection(self): qs = (Test.objects.filter(pk__lt=5) & Test.objects.filter(permission__name__contains='a')) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs) with self.assertRaisesMessage( AssertionError, 'Cannot combine queries on two different base models.'): Test.objects.all() & Permission.objects.all() qs = Test.objects.filter(pk__lt=5) sub_qs = Test.objects.filter(permission__name__contains='a') if self.is_sqlite: qs = qs.order_by() sub_qs = sub_qs.order_by() qs = qs.intersection(sub_qs) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs) qs = Test.objects.all() sub_qs = Permission.objects.all() if self.is_sqlite: qs = qs.order_by() sub_qs = sub_qs.order_by() qs = qs.intersection(sub_qs) self.assert_tables(qs, Test, Permission) with self.assertRaises((ProgrammingError, OperationalError)): self.assert_query_cached(qs) @skipUnlessDBFeature('supports_select_difference') def test_difference(self): qs = Test.objects.filter(pk__lt=5) sub_qs = Test.objects.filter(permission__name__contains='a') if self.is_sqlite: qs = qs.order_by() sub_qs = sub_qs.order_by() qs = qs.difference(sub_qs) self.assert_tables(qs, Test, Permission) self.assert_query_cached(qs) qs = Test.objects.all() sub_qs = Permission.objects.all() if self.is_sqlite: qs = qs.order_by() sub_qs = sub_qs.order_by() qs = qs.difference(sub_qs) self.assert_tables(qs, Test, Permission) with self.assertRaises((ProgrammingError, OperationalError)): self.assert_query_cached(qs) @skipUnlessDBFeature('has_select_for_update') def test_select_for_update(self): """ Tests if ``select_for_update`` queries are not cached. """ with self.assertRaises(TransactionManagementError): list(Test.objects.select_for_update()) with self.assertNumQueries(1): with transaction.atomic(): data1 = list(Test.objects.select_for_update()) self.assertListEqual(data1, [self.t1, self.t2]) self.assertListEqual([t.name for t in data1], ['test1', 'test2']) with self.assertNumQueries(1): with transaction.atomic(): data2 = list(Test.objects.select_for_update()) self.assertListEqual(data2, [self.t1, self.t2]) self.assertListEqual([t.name for t in data2], ['test1', 'test2']) with self.assertNumQueries(2): with transaction.atomic(): data3 = list(Test.objects.select_for_update()) data4 = list(Test.objects.select_for_update()) self.assertListEqual(data3, [self.t1, self.t2]) self.assertListEqual(data4, [self.t1, self.t2]) self.assertListEqual([t.name for t in data3], ['test1', 'test2']) self.assertListEqual([t.name for t in data4], ['test1', 'test2']) def test_having(self): qs = (User.objects.annotate(n=Count('user_permissions')).filter(n__gte=1)) self.assert_tables(qs, User, User.user_permissions.through, Permission) self.assert_query_cached(qs, [self.user]) with self.assertNumQueries(1): self.assertEqual(User.objects.annotate(n=Count('user_permissions')) .filter(n__gte=1).count(), 1) with self.assertNumQueries(0): self.assertEqual(User.objects.annotate(n=Count('user_permissions')) .filter(n__gte=1).count(), 1) def test_extra_select(self): username_length_sql = """ SELECT LENGTH(%(user_table)s.username) FROM %(user_table)s WHERE %(user_table)s.id = %(test_table)s.owner_id """ % {'user_table': User._meta.db_table, 'test_table': Test._meta.db_table} with self.assertNumQueries(1): data1 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data1, [self.t1, self.t2]) self.assertListEqual([o.username_length for o in data1], [4, 5]) with self.assertNumQueries(0): data2 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data2, [self.t1, self.t2]) self.assertListEqual([o.username_length for o in data2], [4, 5]) def test_extra_where(self): sql_condition = ("owner_id IN " "(SELECT id FROM auth_user WHERE username = 'admin')") qs = Test.objects.extra(where=[sql_condition]) self.assert_tables(qs, Test, User) self.assert_query_cached(qs, [self.t2]) def test_extra_tables(self): qs = Test.objects.extra(tables=['auth_user'], select={'extra_id': 'auth_user.id'}) self.assert_tables(qs, Test, User) self.assert_query_cached(qs) def test_extra_order_by(self): qs = Test.objects.extra(order_by=['-cachalot_test.name']) self.assert_tables(qs, Test) self.assert_query_cached(qs, [self.t2, self.t1]) def test_table_inheritance(self): with self.assertNumQueries(3 if self.is_sqlite else 2): t_child = TestChild.objects.create(name='test_child') with self.assertNumQueries(1): self.assertEqual(TestChild.objects.get(), t_child) with self.assertNumQueries(0): self.assertEqual(TestChild.objects.get(), t_child) def test_explain(self): explain_kwargs = {} if self.is_sqlite: expected = (r'\d+ 0 0 SCAN TABLE cachalot_test\n' r'\d+ 0 0 USE TEMP B-TREE FOR ORDER BY') elif self.is_mysql: if self.django_version < (3, 1): expected = ( r'1 SIMPLE cachalot_test ' r'(?:None )?ALL None None None None 2 100\.0 Using filesort') else: expected = ( r'-> Sort row IDs: cachalot_test.`name` \(cost=[\d\.]+ rows=\d\)\n ' r'-> Table scan on cachalot_test \(cost=[\d\.]+ rows=\d\)\n' ) else: explain_kwargs.update( analyze=True, costs=False, ) operation_detail = (r'\(actual time=[\d\.]+..[\d\.]+\ ' r'rows=\d+ loops=\d+\)') expected = ( r'^Sort %s\n' r' Sort Key: name\n' r' Sort Method: quicksort Memory: \d+kB\n' r' -> Seq Scan on cachalot_test %s\n' r'Planning Time: [\d\.]+ ms\n' r'Execution Time: [\d\.]+ ms$') % (operation_detail, operation_detail) with self.assertNumQueries( 2 if self.is_mysql and django_version[0] < 3 else 1): explanation1 = Test.objects.explain(**explain_kwargs) self.assertRegex(explanation1, expected) with self.assertNumQueries(0): explanation2 = Test.objects.explain(**explain_kwargs) self.assertEqual(explanation2, explanation1) def test_raw(self): """ Tests if ``Model.objects.raw`` queries are not cached. """ sql = 'SELECT * FROM %s;' % Test._meta.db_table with self.assertNumQueries(1): data1 = list(Test.objects.raw(sql)) with self.assertNumQueries(1): data2 = list(Test.objects.raw(sql)) self.assertListEqual(data2, data1) self.assertListEqual(data2, [self.t1, self.t2]) def test_raw_no_table(self): sql = 'SELECT * FROM (SELECT 1 AS id UNION ALL SELECT 2) AS t;' with self.assertNumQueries(1): data1 = list(Test.objects.raw(sql)) with self.assertNumQueries(1): data2 = list(Test.objects.raw(sql)) self.assertListEqual(data2, data1) self.assertListEqual(data2, [Test(pk=1), Test(pk=2)]) def test_cursor_execute_unicode(self): """ Tests if queries executed from a DB cursor are not cached. """ attname_column_list = [f.get_attname_column() for f in Test._meta.fields] attnames = [t[0] for t in attname_column_list] columns = [t[1] for t in attname_column_list] sql = "SELECT CAST('é' AS CHAR), %s FROM %s;" % (', '.join(columns), Test._meta.db_table) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute(sql) data1 = list(cursor.fetchall()) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute(sql) data2 = list(cursor.fetchall()) self.assertListEqual(data2, data1) self.assertListEqual( data2, [('é',) + l for l in Test.objects.values_list(*attnames)]) @skipIf(connection.vendor == 'sqlite', 'SQLite doesn’t accept bytes as raw query.') def test_cursor_execute_bytes(self): attname_column_list = [f.get_attname_column() for f in Test._meta.fields] attnames = [t[0] for t in attname_column_list] columns = [t[1] for t in attname_column_list] sql = "SELECT CAST('é' AS CHAR), %s FROM %s;" % (', '.join(columns), Test._meta.db_table) sql = sql.encode('utf-8') with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute(sql) data1 = list(cursor.fetchall()) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute(sql) data2 = list(cursor.fetchall()) self.assertListEqual(data2, data1) self.assertListEqual( data2, [('é',) + l for l in Test.objects.values_list(*attnames)]) def test_cursor_execute_no_table(self): sql = 'SELECT * FROM (SELECT 1 AS id UNION ALL SELECT 2) AS t;' with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute(sql) data1 = list(cursor.fetchall()) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute(sql) data2 = list(cursor.fetchall()) self.assertListEqual(data2, data1) self.assertListEqual(data2, [(1,), (2,)]) def test_missing_table_cache_key(self): qs = Test.objects.all() self.assert_tables(qs, Test) self.assert_query_cached(qs) table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN( connection.alias, Test._meta.db_table) cachalot_caches.get_cache().delete(table_cache_key) self.assert_query_cached(qs) def test_broken_query_cache_value(self): """ In some undetermined cases, cache.get_many return wrong values such as `None` or other invalid values. They should be gracefully handled. See https://github.com/noripyt/django-cachalot/issues/110 This test artificially creates a wrong value, but it’s usually a cache backend bug that leads to these wrong values. """ qs = Test.objects.all() self.assert_tables(qs, Test) self.assert_query_cached(qs) query_cache_key = cachalot_settings.CACHALOT_QUERY_KEYGEN( qs.query.get_compiler(DEFAULT_DB_ALIAS)) cachalot_caches.get_cache().set(query_cache_key, (), cachalot_settings.CACHALOT_TIMEOUT) self.assert_query_cached(qs) def test_unicode_get(self): with self.assertNumQueries(1): with self.assertRaises(Test.DoesNotExist): Test.objects.get(name='Clémentine') with self.assertNumQueries(0): with self.assertRaises(Test.DoesNotExist): Test.objects.get(name='Clémentine') def test_unicode_table_name(self): """ Tests if using unicode in table names does not break caching. """ table_name = 'Clémentine' if self.is_postgresql: table_name = '"%s"' % table_name with connection.cursor() as cursor: cursor.execute('CREATE TABLE %s (taste VARCHAR(20));' % table_name) qs = Test.objects.extra(tables=['Clémentine'], select={'taste': '%s.taste' % table_name}) # Here the table `Clémentine` is not detected because it is not # registered by Django, and we only check for registered tables # to avoid creating an extra SQL query fetching table names. self.assert_tables(qs, Test) self.assert_query_cached(qs) with connection.cursor() as cursor: cursor.execute('DROP TABLE %s;' % table_name) def test_unmanaged_model(self): qs = UnmanagedModel.objects.all() self.assert_tables(qs, UnmanagedModel) self.assert_query_cached(qs) def test_now_annotate(self): """Check that queries with a Now() annotation are not cached #193""" qs = Test.objects.annotate(now=Now()) self.assert_query_cached(qs, after=1) class ParameterTypeTestCase(TestUtilsMixin, TransactionTestCase): def test_tuple(self): qs = Test.objects.filter(pk__in=(1, 2, 3)) self.assert_tables(qs, Test) self.assert_query_cached(qs) qs = Test.objects.filter(pk__in=(4, 5, 6)) self.assert_tables(qs, Test) self.assert_query_cached(qs) def test_list(self): qs = Test.objects.filter(pk__in=[1, 2, 3]) self.assert_tables(qs, Test) self.assert_query_cached(qs) l = [4, 5, 6] qs = Test.objects.filter(pk__in=l) self.assert_tables(qs, Test) self.assert_query_cached(qs) l.append(7) self.assert_tables(qs, Test) # The queryset is not taking the new element into account because # the list was copied during `.filter()`. self.assert_query_cached(qs, before=0) qs = Test.objects.filter(pk__in=l) self.assert_tables(qs, Test) self.assert_query_cached(qs) def test_binary(self): """ Binary data should be cached on PostgreSQL & MySQL, but not on SQLite, because SQLite uses a type making it hard to access data itself. So this also tests how django-cachalot handles unknown params, in this case the `memory` object passed to SQLite. """ qs = Test.objects.filter(bin=None) self.assert_tables(qs, Test) self.assert_query_cached(qs) qs = Test.objects.filter(bin=b'abc') self.assert_tables(qs, Test) self.assert_query_cached(qs, after=1 if self.is_sqlite else 0) qs = Test.objects.filter(bin=b'def') self.assert_tables(qs, Test) self.assert_query_cached(qs, after=1 if self.is_sqlite else 0) def test_float(self): with self.assertNumQueries(1): Test.objects.create(name='test1', a_float=0.123456789) with self.assertNumQueries(1): Test.objects.create(name='test2', a_float=12345.6789) with self.assertNumQueries(1): data1 = list(Test.objects.values_list('a_float', flat=True).filter( a_float__isnull=False).order_by('a_float')) with self.assertNumQueries(0): data2 = list(Test.objects.values_list('a_float', flat=True).filter( a_float__isnull=False).order_by('a_float')) self.assertListEqual(data2, data1) self.assertEqual(len(data2), 2) self.assertAlmostEqual(data2[0], 0.123456789, delta=0.0001) self.assertAlmostEqual(data2[1], 12345.6789, delta=0.0001) with self.assertNumQueries(1): Test.objects.get(a_float=0.123456789) with self.assertNumQueries(0): Test.objects.get(a_float=0.123456789) def test_decimal(self): with self.assertNumQueries(1): Test.objects.create(name='test1', a_decimal=Decimal('123.45')) with self.assertNumQueries(1): Test.objects.create(name='test1', a_decimal=Decimal('12.3')) qs = Test.objects.values_list('a_decimal', flat=True).filter( a_decimal__isnull=False).order_by('a_decimal') self.assert_tables(qs, Test) self.assert_query_cached(qs, [Decimal('12.3'), Decimal('123.45')]) with self.assertNumQueries(1): Test.objects.get(a_decimal=Decimal('123.45')) with self.assertNumQueries(0): Test.objects.get(a_decimal=Decimal('123.45')) def test_ipv4_address(self): with self.assertNumQueries(1): Test.objects.create(name='test1', ip='127.0.0.1') with self.assertNumQueries(1): Test.objects.create(name='test2', ip='192.168.0.1') qs = Test.objects.values_list('ip', flat=True).filter( ip__isnull=False).order_by('ip') self.assert_tables(qs, Test) self.assert_query_cached(qs, ['127.0.0.1', '192.168.0.1']) with self.assertNumQueries(1): Test.objects.get(ip='127.0.0.1') with self.assertNumQueries(0): Test.objects.get(ip='127.0.0.1') def test_ipv6_address(self): with self.assertNumQueries(1): Test.objects.create(name='test1', ip='2001:db8:a0b:12f0::1/64') with self.assertNumQueries(1): Test.objects.create(name='test2', ip='2001:db8:0:85a3::ac1f:8001') qs = Test.objects.values_list('ip', flat=True).filter( ip__isnull=False).order_by('ip') self.assert_tables(qs, Test) self.assert_query_cached(qs, ['2001:db8:0:85a3::ac1f:8001', '2001:db8:a0b:12f0::1/64']) with self.assertNumQueries(1): Test.objects.get(ip='2001:db8:0:85a3::ac1f:8001') with self.assertNumQueries(0): Test.objects.get(ip='2001:db8:0:85a3::ac1f:8001') def test_duration(self): with self.assertNumQueries(1): Test.objects.create(name='test1', duration=datetime.timedelta(30)) with self.assertNumQueries(1): Test.objects.create(name='test2', duration=datetime.timedelta(60)) qs = Test.objects.values_list('duration', flat=True).filter( duration__isnull=False).order_by('duration') self.assert_tables(qs, Test) self.assert_query_cached(qs, [datetime.timedelta(30), datetime.timedelta(60)]) with self.assertNumQueries(1): Test.objects.get(duration=datetime.timedelta(30)) with self.assertNumQueries(0): Test.objects.get(duration=datetime.timedelta(30)) def test_uuid(self): with self.assertNumQueries(1): Test.objects.create(name='test1', uuid='1cc401b7-09f4-4520-b8d0-c267576d196b') with self.assertNumQueries(1): Test.objects.create(name='test2', uuid='ebb3b6e1-1737-4321-93e3-4c35d61ff491') qs = Test.objects.values_list('uuid', flat=True).filter( uuid__isnull=False).order_by('uuid') self.assert_tables(qs, Test) self.assert_query_cached(qs, [ UUID('1cc401b7-09f4-4520-b8d0-c267576d196b'), UUID('ebb3b6e1-1737-4321-93e3-4c35d61ff491')]) with self.assertNumQueries(1): Test.objects.get(uuid=UUID('1cc401b7-09f4-4520-b8d0-c267576d196b')) with self.assertNumQueries(0): Test.objects.get(uuid=UUID('1cc401b7-09f4-4520-b8d0-c267576d196b')) def test_now(self): """ Checks that queries with a Now() parameter are not cached. """ obj = Test.objects.create(datetime='1992-07-02T12:00:00') qs = Test.objects.filter(datetime__lte=Now()) with self.assertNumQueries(1): obj1 = qs.get() with self.assertNumQueries(1): obj2 = qs.get() self.assertEqual(obj1, obj2) self.assertEqual(obj1, obj) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/settings.py0000644000175100001710000003166100000000000021237 0ustar00runnerdockerfrom time import sleep from unittest import skipIf from django.conf import settings from django.contrib.auth.models import User from django.core.cache import DEFAULT_CACHE_ALIAS from django.core.checks import run_checks, Tags, Warning, Error from django.db import connection from django.test import TransactionTestCase from django.test.utils import override_settings from ..api import invalidate from ..settings import SUPPORTED_ONLY, SUPPORTED_DATABASE_ENGINES from .models import Test, TestParent, TestChild, UnmanagedModel from .test_utils import TestUtilsMixin class SettingsTestCase(TestUtilsMixin, TransactionTestCase): @override_settings(CACHALOT_ENABLED=False) def test_decorator(self): self.assert_query_cached(Test.objects.all(), after=1) def test_django_override(self): with self.settings(CACHALOT_ENABLED=False): qs = Test.objects.all() self.assert_query_cached(qs, after=1) with self.settings(CACHALOT_ENABLED=True): self.assert_query_cached(qs) def test_enabled(self): qs = Test.objects.all() with self.settings(CACHALOT_ENABLED=True): self.assert_query_cached(qs) with self.settings(CACHALOT_ENABLED=False): self.assert_query_cached(qs, after=1) with self.assertNumQueries(0): list(Test.objects.all()) with self.settings(CACHALOT_ENABLED=False): with self.assertNumQueries(1): t = Test.objects.create(name='test') with self.assertNumQueries(1): data = list(Test.objects.all()) self.assertListEqual(data, [t]) @skipIf(len(settings.CACHES) == 1, 'We can’t change the cache used ' 'since there’s only one configured.') def test_cache(self): other_cache_alias = next(alias for alias in settings.CACHES if alias != DEFAULT_CACHE_ALIAS) invalidate(Test, cache_alias=other_cache_alias) qs = Test.objects.all() with self.settings(CACHALOT_CACHE=DEFAULT_CACHE_ALIAS): self.assert_query_cached(qs) with self.settings(CACHALOT_CACHE=other_cache_alias): self.assert_query_cached(qs) Test.objects.create(name='test') # Only `CACHALOT_CACHE` is invalidated, so changing the database should # not invalidate all caches. with self.settings(CACHALOT_CACHE=other_cache_alias): self.assert_query_cached(qs, before=0) def test_databases(self): qs = Test.objects.all() with self.settings(CACHALOT_DATABASES=SUPPORTED_ONLY): self.assert_query_cached(qs) invalidate(Test) engine = connection.settings_dict['ENGINE'] SUPPORTED_DATABASE_ENGINES.remove(engine) with self.settings(CACHALOT_DATABASES=SUPPORTED_ONLY): self.assert_query_cached(qs, after=1) SUPPORTED_DATABASE_ENGINES.add(engine) with self.settings(CACHALOT_DATABASES=SUPPORTED_ONLY): self.assert_query_cached(qs) with self.settings(CACHALOT_DATABASES=[]): self.assert_query_cached(qs, after=1) def test_cache_timeout(self): qs = Test.objects.all() with self.assertNumQueries(1): list(qs.all()) sleep(1) with self.assertNumQueries(0): list(qs.all()) invalidate(Test) with self.settings(CACHALOT_TIMEOUT=0): with self.assertNumQueries(1): list(qs.all()) sleep(0.05) with self.assertNumQueries(1): list(qs.all()) # We have to test with a full second and not a shorter time because # memcached only takes the integer part of the timeout into account. with self.settings(CACHALOT_TIMEOUT=1): self.assert_query_cached(qs) sleep(1) with self.assertNumQueries(1): list(Test.objects.all()) def test_cache_random(self): qs = Test.objects.order_by('?') self.assert_query_cached(qs, after=1, compare_results=False) with self.settings(CACHALOT_CACHE_RANDOM=True): self.assert_query_cached(qs) def test_invalidate_raw(self): with self.assertNumQueries(1): list(Test.objects.all()) with self.settings(CACHALOT_INVALIDATE_RAW=False): with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute("UPDATE %s SET name = 'new name';" % Test._meta.db_table) with self.assertNumQueries(0): list(Test.objects.all()) def test_only_cachable_tables(self): with self.settings(CACHALOT_ONLY_CACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(Test.objects.all()) self.assert_query_cached(TestParent.objects.all(), after=1) self.assert_query_cached(Test.objects.select_related('owner'), after=1) self.assert_query_cached(TestParent.objects.all()) with self.settings(CACHALOT_ONLY_CACHABLE_TABLES=( 'cachalot_test', 'cachalot_testchild', 'auth_user')): self.assert_query_cached(Test.objects.select_related('owner')) # TestChild uses multi-table inheritance, and since its parent, # 'cachalot_testparent', is not cachable, a basic # TestChild query can’t be cached self.assert_query_cached(TestChild.objects.all(), after=1) # However, if we only fetch data from the 'cachalot_testchild' # table, it’s cachable. self.assert_query_cached(TestChild.objects.values('public')) @override_settings(CACHALOT_ONLY_CACHABLE_APPS=('cachalot',)) def test_only_cachable_apps(self): self.assert_query_cached(Test.objects.all()) self.assert_query_cached(TestParent.objects.all()) self.assert_query_cached(Test.objects.select_related('owner'), after=1) # Must use override_settings to get the correct effect. Using the cm doesn't # reload settings on cachalot's side @override_settings(CACHALOT_ONLY_CACHABLE_TABLES=('cachalot_test', 'auth_user'), CACHALOT_ONLY_CACHABLE_APPS=('cachalot',)) def test_only_cachable_apps_set_combo(self): self.assert_query_cached(Test.objects.all()) self.assert_query_cached(TestParent.objects.all()) self.assert_query_cached(Test.objects.select_related('owner')) def test_uncachable_tables(self): qs = Test.objects.all() with self.settings(CACHALOT_UNCACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(qs, after=1) self.assert_query_cached(qs) with self.settings(CACHALOT_UNCACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(qs, after=1) @override_settings(CACHALOT_UNCACHABLE_APPS=('cachalot',)) def test_uncachable_apps(self): self.assert_query_cached(Test.objects.all(), after=1) self.assert_query_cached(TestParent.objects.all(), after=1) @override_settings(CACHALOT_UNCACHABLE_TABLES=('cachalot_test',), CACHALOT_UNCACHABLE_APPS=('cachalot',)) def test_uncachable_apps_set_combo(self): self.assert_query_cached(Test.objects.all(), after=1) self.assert_query_cached(TestParent.objects.all(), after=1) def test_only_cachable_and_uncachable_table(self): with self.settings( CACHALOT_ONLY_CACHABLE_TABLES=('cachalot_test', 'cachalot_testparent'), CACHALOT_UNCACHABLE_TABLES=('cachalot_test',)): self.assert_query_cached(Test.objects.all(), after=1) self.assert_query_cached(TestParent.objects.all()) self.assert_query_cached(User.objects.all(), after=1) def test_uncachable_unmanaged_table(self): qs = UnmanagedModel.objects.all() with self.settings( CACHALOT_UNCACHABLE_TABLES=("cachalot_unmanagedmodel",), CACHALOT_ADDITIONAL_TABLES=("cachalot_unmanagedmodel",) ): self.assert_query_cached(qs, after=1) def test_cache_compatibility(self): compatible_cache = { 'BACKEND': 'django.core.cache.backends.locmem.LocMemCache', } incompatible_cache = { 'BACKEND': 'django.core.cache.backends.db.DatabaseCache', 'LOCATION': 'cache_table' } with self.settings(CACHES={'default': compatible_cache, 'secondary': incompatible_cache}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, []) warning001 = Warning( 'Cache backend %r is not supported by django-cachalot.' % 'django.core.cache.backends.db.DatabaseCache', hint='Switch to a supported cache backend ' 'like Redis or Memcached.', id='cachalot.W001') with self.settings(CACHES={'default': incompatible_cache}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning001]) with self.settings(CACHES={'default': compatible_cache, 'secondary': incompatible_cache}, CACHALOT_CACHE='secondary'): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning001]) def test_database_compatibility(self): compatible_database = { 'ENGINE': 'django.db.backends.sqlite3', 'NAME': 'non_existent_db.sqlite3', } incompatible_database = { 'ENGINE': 'django.db.backends.oracle', 'NAME': 'non_existent_db', } warning002 = Warning( 'None of the configured databases are supported ' 'by django-cachalot.', hint='Use a supported database, or remove django-cachalot, or ' 'put at least one database alias in `CACHALOT_DATABASES` ' 'to force django-cachalot to use it.', id='cachalot.W002' ) warning003 = Warning( 'Database engine %r is not supported by django-cachalot.' % 'django.db.backends.oracle', hint='Switch to a supported database engine.', id='cachalot.W003' ) warning004 = Warning( 'Django-cachalot is useless because no database ' 'is configured in `CACHALOT_DATABASES`.', hint='Reconfigure django-cachalot or remove it.', id='cachalot.W004' ) error001 = Error( 'Database alias %r from `CACHALOT_DATABASES` ' 'is not defined in `DATABASES`.' % 'secondary', hint='Change `CACHALOT_DATABASES` to be compliant with' '`CACHALOT_DATABASES`', id='cachalot.E001', ) error002 = Error( "`CACHALOT_DATABASES` must be either %r or a list, tuple, " "frozenset or set of database aliases." % SUPPORTED_ONLY, hint='Remove `CACHALOT_DATABASES` or change it.', id='cachalot.E002', ) with self.settings(DATABASES={'default': incompatible_database}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning002]) with self.settings(DATABASES={'default': compatible_database, 'secondary': incompatible_database}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, []) with self.settings(DATABASES={'default': incompatible_database, 'secondary': compatible_database}): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, []) with self.settings(DATABASES={'default': incompatible_database}, CACHALOT_DATABASES=['default']): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning003]) with self.settings(DATABASES={'default': incompatible_database}, CACHALOT_DATABASES=[]): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [warning004]) with self.settings(DATABASES={'default': incompatible_database}, CACHALOT_DATABASES=['secondary']): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [error001]) with self.settings(DATABASES={'default': compatible_database}, CACHALOT_DATABASES=['default', 'secondary']): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [error001]) with self.settings(CACHALOT_DATABASES='invalid value'): errors = run_checks(tags=[Tags.compatibility]) self.assertListEqual(errors, [error002]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/signals.py0000644000175100001710000001046400000000000021035 0ustar00runnerdockerfrom unittest import skipIf from django.conf import settings from django.contrib.auth.models import User from django.db import DEFAULT_DB_ALIAS, transaction from django.test import TransactionTestCase from ..api import invalidate from ..signals import post_invalidation from .models import Test class SignalsTestCase(TransactionTestCase): databases = set(settings.DATABASES.keys()) def test_table_invalidated(self): l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) list(Test.objects.all()) self.assertListEqual(l, []) Test.objects.create(name='test1') self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver) del l[:] # Empties the list post_invalidation.connect(receiver, sender=User._meta.db_table) Test.objects.create(name='test2') self.assertListEqual(l, []) User.objects.create_user('user') self.assertListEqual(l, [('auth_user', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver, sender=User._meta.db_table) def test_table_invalidated_in_transaction(self): """ Checks that the ``post_invalidation`` signal is triggered only after the end of a transaction. """ l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) with transaction.atomic(): Test.objects.create(name='test1') self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) del l[:] # Empties the list self.assertListEqual(l, []) with transaction.atomic(): Test.objects.create(name='test2') with transaction.atomic(): Test.objects.create(name='test3') self.assertListEqual(l, []) self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver) def test_table_invalidated_once_per_transaction_or_invalidate(self): """ Checks that the ``post_invalidation`` signal is triggered only after the end of a transaction. """ l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) with transaction.atomic(): Test.objects.create(name='test1') self.assertListEqual(l, []) Test.objects.create(name='test2') self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) del l[:] # Empties the list self.assertListEqual(l, []) invalidate(Test, db_alias=DEFAULT_DB_ALIAS) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) del l[:] # Empties the list self.assertListEqual(l, []) with transaction.atomic(): invalidate(Test, db_alias=DEFAULT_DB_ALIAS) self.assertListEqual(l, []) self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) post_invalidation.disconnect(receiver) @skipIf(len(settings.DATABASES) == 1, 'We can’t change the DB used since there’s only one configured') def test_table_invalidated_multi_db(self): db_alias2 = next(alias for alias in settings.DATABASES if alias != DEFAULT_DB_ALIAS) l = [] def receiver(sender, **kwargs): db_alias = kwargs['db_alias'] l.append((sender, db_alias)) post_invalidation.connect(receiver) self.assertListEqual(l, []) Test.objects.using(DEFAULT_DB_ALIAS).create(name='test') self.assertListEqual(l, [('cachalot_test', DEFAULT_DB_ALIAS)]) Test.objects.using(db_alias2).create(name='test') self.assertListEqual(l, [ ('cachalot_test', DEFAULT_DB_ALIAS), ('cachalot_test', db_alias2)]) post_invalidation.disconnect(receiver) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/test_utils.py0000644000175100001710000000476600000000000021604 0ustar00runnerdockerfrom django import VERSION as DJANGO_VERSION from django.core.management.color import no_style from django.db import connection, transaction from .models import PostgresModel from ..utils import _get_tables class TestUtilsMixin: def setUp(self): self.is_sqlite = connection.vendor == 'sqlite' self.is_mysql = connection.vendor == 'mysql' self.is_postgresql = connection.vendor == 'postgresql' self.django_version = DJANGO_VERSION self.force_reopen_connection() # TODO: Remove this workaround when this issue is fixed: # https://code.djangoproject.com/ticket/29494 def tearDown(self): if connection.vendor == 'postgresql': flush_args = [no_style(), (PostgresModel._meta.db_table,),] if float(".".join(map(str, DJANGO_VERSION[:2]))) < 3.1: flush_args.append(()) flush_sql_list = connection.ops.sql_flush(*flush_args) with transaction.atomic(): for sql in flush_sql_list: with connection.cursor() as cursor: cursor.execute(sql) def force_reopen_connection(self): if connection.vendor in ('mysql', 'postgresql'): # We need to reopen the connection or Django # will execute an extra SQL request below. connection.cursor() def assert_tables(self, queryset, *tables): tables = {table if isinstance(table, str) else table._meta.db_table for table in tables} self.assertSetEqual(_get_tables(queryset.db, queryset.query), tables) def assert_query_cached(self, queryset, result=None, result_type=None, compare_results=True, before=1, after=0): if result_type is None: result_type = list if result is None else type(result) with self.assertNumQueries(before): data1 = queryset.all() if result_type is list: data1 = list(data1) with self.assertNumQueries(after): data2 = queryset.all() if result_type is list: data2 = list(data2) if not compare_results: return assert_functions = { list: self.assertListEqual, set: self.assertSetEqual, dict: self.assertDictEqual, } assert_function = assert_functions.get(result_type, self.assertEqual) assert_function(data2, data1) if result is not None: assert_function(data2, result) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/thread_safety.py0000644000175100001710000000720100000000000022212 0ustar00runnerdockerfrom threading import Thread from django.db import connection, transaction from django.test import TransactionTestCase, skipUnlessDBFeature from .models import Test from .test_utils import TestUtilsMixin class TestThread(Thread): def start_and_join(self): self.start() self.join() return self.t def run(self): self.t = Test.objects.first() connection.close() @skipUnlessDBFeature('test_db_allows_multiple_connections') class ThreadSafetyTestCase(TestUtilsMixin, TransactionTestCase): def test_concurrent_caching(self): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, t) def test_concurrent_caching_during_atomic(self): with self.assertNumQueries(1): with transaction.atomic(): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, None) with self.assertNumQueries(1): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_before_and_during_atomic_1(self): t1 = TestThread().start_and_join() with self.assertNumQueries(1): with transaction.atomic(): t2 = TestThread().start_and_join() t = Test.objects.create(name='test') self.assertEqual(t1, None) self.assertEqual(t2, None) with self.assertNumQueries(1): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_before_and_during_atomic_2(self): t1 = TestThread().start_and_join() with self.assertNumQueries(1): with transaction.atomic(): t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, None) with self.assertNumQueries(1): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_during_and_after_atomic_1(self): with self.assertNumQueries(1): with transaction.atomic(): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, t) with self.assertNumQueries(0): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_during_and_after_atomic_2(self): with self.assertNumQueries(1): with transaction.atomic(): t = Test.objects.create(name='test') t1 = TestThread().start_and_join() t2 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, t) with self.assertNumQueries(0): data = Test.objects.first() self.assertEqual(data, t) def test_concurrent_caching_during_and_after_atomic_3(self): with self.assertNumQueries(1): with transaction.atomic(): t1 = TestThread().start_and_join() t = Test.objects.create(name='test') t2 = TestThread().start_and_join() t3 = TestThread().start_and_join() self.assertEqual(t1, None) self.assertEqual(t2, None) self.assertEqual(t3, t) with self.assertNumQueries(0): data = Test.objects.first() self.assertEqual(data, t) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/transaction.py0000644000175100001710000001731000000000000021717 0ustar00runnerdockerfrom django.contrib.auth.models import User from django.db import transaction, connection, IntegrityError from django.test import TransactionTestCase, skipUnlessDBFeature from .models import Test from .test_utils import TestUtilsMixin class AtomicTestCase(TestUtilsMixin, TransactionTestCase): def test_successful_read_atomic(self): with self.assertNumQueries(2 if self.is_sqlite else 1): with transaction.atomic(): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(0): data2 = list(Test.objects.all()) self.assertListEqual(data2, []) def test_unsuccessful_read_atomic(self): with self.assertNumQueries(2 if self.is_sqlite else 1): try: with transaction.atomic(): data1 = list(Test.objects.all()) raise ZeroDivisionError except ZeroDivisionError: pass self.assertListEqual(data1, []) with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertListEqual(data2, []) def test_successful_write_atomic(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(2 if self.is_sqlite else 1): with transaction.atomic(): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1]) with self.assertNumQueries(2 if self.is_sqlite else 1): with transaction.atomic(): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data3 = list(Test.objects.all()) self.assertListEqual(data3, [t1, t2]) with self.assertNumQueries(4 if self.is_sqlite else 3): with transaction.atomic(): data4 = list(Test.objects.all()) t3 = Test.objects.create(name='test3') t4 = Test.objects.create(name='test4') data5 = list(Test.objects.all()) self.assertListEqual(data4, [t1, t2]) self.assertListEqual(data5, [t1, t2, t3, t4]) self.assertNotEqual(t4, t3) def test_unsuccessful_write_atomic(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(2 if self.is_sqlite else 1): try: with transaction.atomic(): Test.objects.create(name='test') raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(0): data2 = list(Test.objects.all()) self.assertListEqual(data2, []) with self.assertNumQueries(1): with self.assertRaises(Test.DoesNotExist): Test.objects.get(name='test') def test_cache_inside_atomic(self): with self.assertNumQueries(2 if self.is_sqlite else 1): with transaction.atomic(): data1 = list(Test.objects.all()) data2 = list(Test.objects.all()) self.assertListEqual(data2, data1) self.assertListEqual(data2, []) def test_invalidation_inside_atomic(self): with self.assertNumQueries(4 if self.is_sqlite else 3): with transaction.atomic(): data1 = list(Test.objects.all()) t = Test.objects.create(name='test') data2 = list(Test.objects.all()) self.assertListEqual(data1, []) self.assertListEqual(data2, [t]) def test_successful_nested_read_atomic(self): with self.assertNumQueries(7 if self.is_sqlite else 6): with transaction.atomic(): list(Test.objects.all()) with transaction.atomic(): list(User.objects.all()) with self.assertNumQueries(2): with transaction.atomic(): list(User.objects.all()) with self.assertNumQueries(0): list(User.objects.all()) with self.assertNumQueries(0): list(Test.objects.all()) list(User.objects.all()) def test_unsuccessful_nested_read_atomic(self): with self.assertNumQueries(6 if self.is_sqlite else 5): with transaction.atomic(): try: with transaction.atomic(): with self.assertNumQueries(1): list(Test.objects.all()) raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(1): list(Test.objects.all()) def test_successful_nested_write_atomic(self): with self.assertNumQueries(13 if self.is_sqlite else 12): with transaction.atomic(): t1 = Test.objects.create(name='test1') with transaction.atomic(): t2 = Test.objects.create(name='test2') data1 = list(Test.objects.all()) self.assertListEqual(data1, [t1, t2]) with transaction.atomic(): t3 = Test.objects.create(name='test3') with transaction.atomic(): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1, t2, t3]) t4 = Test.objects.create(name='test4') data3 = list(Test.objects.all()) self.assertListEqual(data3, [t1, t2, t3, t4]) def test_unsuccessful_nested_write_atomic(self): with self.assertNumQueries(16 if self.is_sqlite else 15): with transaction.atomic(): t1 = Test.objects.create(name='test1') try: with transaction.atomic(): t2 = Test.objects.create(name='test2') data1 = list(Test.objects.all()) self.assertListEqual(data1, [t1, t2]) raise ZeroDivisionError except ZeroDivisionError: pass data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1]) try: with transaction.atomic(): t3 = Test.objects.create(name='test3') with transaction.atomic(): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t1, t3]) raise ZeroDivisionError except ZeroDivisionError: pass with self.assertNumQueries(1): data3 = list(Test.objects.all()) self.assertListEqual(data3, [t1]) @skipUnlessDBFeature('can_defer_constraint_checks') def test_deferred_error(self): """ Checks that an error occurring during the end of a transaction has no impact on future queries. """ with connection.cursor() as cursor: cursor.execute( 'CREATE TABLE example (' 'id int UNIQUE DEFERRABLE INITIALLY DEFERRED);') with self.assertRaises(IntegrityError): with transaction.atomic(): with self.assertNumQueries(1): list(Test.objects.all()) cursor.execute( 'INSERT INTO example VALUES (1), (1);' '-- ' + Test._meta.db_table) # Should invalidate Test. with self.assertNumQueries(1): list(Test.objects.all()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/tests/write.py0000644000175100001710000011552400000000000020532 0ustar00runnerdockerfrom unittest import skipIf, skipUnless from django import VERSION as DJANGO_VERSION from django.contrib.auth.models import User, Permission, Group from django.contrib.contenttypes.models import ContentType from django.core.exceptions import MultipleObjectsReturned from django.core.management import call_command from django.db import ( connection, transaction, ProgrammingError, OperationalError) from django.db.models import Count from django.db.models.expressions import RawSQL from django.test import TransactionTestCase, skipUnlessDBFeature from .models import Test, TestParent, TestChild from .test_utils import TestUtilsMixin class WriteTestCase(TestUtilsMixin, TransactionTestCase): """ Tests if every SQL request writing data is not cached and invalidates the implied data. """ def test_create(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data2 = list(Test.objects.all()) with self.assertNumQueries(1): t3 = Test.objects.create(name='test3') with self.assertNumQueries(1): data3 = list(Test.objects.all()) self.assertListEqual(data2, [t1, t2]) self.assertListEqual(data3, [t1, t2, t3]) with self.assertNumQueries(1): t3_copy = Test.objects.create(name='test3') self.assertNotEqual(t3_copy, t3) with self.assertNumQueries(1): data4 = list(Test.objects.all()) self.assertListEqual(data4, [t1, t2, t3, t3_copy]) def test_get_or_create(self): """ Tests if the ``SELECT`` query of a ``QuerySet.get_or_create`` is cached, but not the ``INSERT`` one. """ with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(3 if self.is_sqlite else 2): t, created = Test.objects.get_or_create(name='test') self.assertTrue(created) with self.assertNumQueries(1): t_clone, created = Test.objects.get_or_create(name='test') self.assertFalse(created) self.assertEqual(t_clone, t) with self.assertNumQueries(0): t_clone, created = Test.objects.get_or_create(name='test') self.assertFalse(created) self.assertEqual(t_clone, t) with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertListEqual(data2, [t]) def test_update_or_create(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) with self.assertNumQueries(5 if self.is_sqlite else 4): t, created = Test.objects.update_or_create( name='test', defaults={'public': True}) self.assertTrue(created) self.assertEqual(t.name, 'test') self.assertEqual(t.public, True) with self.assertNumQueries(3 if self.is_sqlite else 2): t, created = Test.objects.update_or_create( name='test', defaults={'public': False}) self.assertFalse(created) self.assertEqual(t.name, 'test') self.assertEqual(t.public, False) # The number of SQL queries doesn’t decrease because update_or_create # always calls an UPDATE, even when data wasn’t changed. with self.assertNumQueries(3 if self.is_sqlite else 2): t, created = Test.objects.update_or_create( name='test', defaults={'public': False}) self.assertFalse(created) self.assertEqual(t.name, 'test') self.assertEqual(t.public, False) with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [t]) def test_bulk_create(self): with self.assertNumQueries(1): data1 = list(Test.objects.all()) self.assertListEqual(data1, []) with self.assertNumQueries(2 if self.is_sqlite else 1): unsaved_tests = [Test(name='test%02d' % i) for i in range(1, 11)] Test.objects.bulk_create(unsaved_tests) self.assertEqual(Test.objects.count(), 10) with self.assertNumQueries(2 if self.is_sqlite else 1): unsaved_tests = [Test(name='test%02d' % i) for i in range(1, 11)] Test.objects.bulk_create(unsaved_tests) self.assertEqual(Test.objects.count(), 20) with self.assertNumQueries(1): data2 = list(Test.objects.all()) self.assertEqual(len(data2), 20) self.assertListEqual([t.name for t in data2], ['test%02d' % (i // 2) for i in range(2, 22)]) def test_update(self): with self.assertNumQueries(1): t = Test.objects.create(name='test1') with self.assertNumQueries(1): t1 = Test.objects.get() with self.assertNumQueries(1): t.name = 'test2' t.save() with self.assertNumQueries(1): t2 = Test.objects.get() self.assertEqual(t1.name, 'test1') self.assertEqual(t2.name, 'test2') with self.assertNumQueries(1): Test.objects.update(name='test3') with self.assertNumQueries(1): t3 = Test.objects.get() self.assertEqual(t3.name, 'test3') def test_delete(self): with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data1 = list(Test.objects.values_list('name', flat=True)) with self.assertNumQueries(1): t2.delete() with self.assertNumQueries(1): data2 = list(Test.objects.values_list('name', flat=True)) self.assertListEqual(data1, [t1.name, t2.name]) self.assertListEqual(data2, [t1.name]) with self.assertNumQueries(2 if self.is_sqlite else 1): Test.objects.bulk_create([Test(name='test%s' % i) for i in range(2, 11)]) with self.assertNumQueries(1): self.assertEqual(Test.objects.count(), 10) with self.assertNumQueries(2 if self.is_sqlite else 1): Test.objects.all().delete() with self.assertNumQueries(1): self.assertEqual(Test.objects.count(), 0) def test_invalidate_exists(self): with self.assertNumQueries(1): self.assertFalse(Test.objects.exists()) Test.objects.create(name='test') with self.assertNumQueries(1): self.assertTrue(Test.objects.create()) def test_invalidate_count(self): with self.assertNumQueries(1): self.assertEqual(Test.objects.count(), 0) Test.objects.create(name='test1') with self.assertNumQueries(1): self.assertEqual(Test.objects.count(), 1) Test.objects.create(name='test2') with self.assertNumQueries(1): self.assertEqual(Test.objects.count(), 2) def test_invalidate_get(self): with self.assertNumQueries(1): with self.assertRaises(Test.DoesNotExist): Test.objects.get(name='test') Test.objects.create(name='test') with self.assertNumQueries(1): Test.objects.get(name='test') Test.objects.create(name='test') with self.assertNumQueries(1): with self.assertRaises(MultipleObjectsReturned): Test.objects.get(name='test') def test_invalidate_values(self): with self.assertNumQueries(1): data1 = list(Test.objects.values('name', 'public')) self.assertListEqual(data1, []) Test.objects.bulk_create([Test(name='test1'), Test(name='test2', public=True)]) with self.assertNumQueries(1): data2 = list(Test.objects.values('name', 'public')) self.assertEqual(len(data2), 2) self.assertDictEqual(data2[0], {'name': 'test1', 'public': False}) self.assertDictEqual(data2[1], {'name': 'test2', 'public': True}) Test.objects.all()[0].delete() with self.assertNumQueries(1): data3 = list(Test.objects.values('name', 'public')) self.assertEqual(len(data3), 1) self.assertDictEqual(data3[0], {'name': 'test2', 'public': True}) def test_invalidate_foreign_key(self): with self.assertNumQueries(1): data1 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data1, []) u1 = User.objects.create_user('user1') Test.objects.bulk_create([Test(name='test1', owner=u1), Test(name='test2')]) with self.assertNumQueries(2): data2 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data2, ['user1']) Test.objects.create(name='test3') with self.assertNumQueries(1): data3 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data3, ['user1']) t2 = Test.objects.get(name='test2') t2.owner = u1 t2.save() with self.assertNumQueries(1): data4 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data4, ['user1', 'user1']) u2 = User.objects.create_user('user2') Test.objects.filter(name='test3').update(owner=u2) with self.assertNumQueries(3): data5 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data5, ['user1', 'user1', 'user2']) User.objects.filter(username='user2').update(username='user3') with self.assertNumQueries(2): data6 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data6, ['user1', 'user1', 'user3']) u2 = User.objects.create_user('user2') Test.objects.filter(name='test2').update(owner=u2) with self.assertNumQueries(4): data7 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data7, ['user1', 'user2', 'user3']) with self.assertNumQueries(0): data8 = [t.owner.username for t in Test.objects.all() if t.owner] self.assertListEqual(data8, ['user1', 'user2', 'user3']) def test_invalidate_many_to_many(self): u = User.objects.create_user('test_user') ct = ContentType.objects.get_for_model(User) discuss = Permission.objects.create( name='Can discuss', content_type=ct, codename='discuss') touch = Permission.objects.create( name='Can touch', content_type=ct, codename='touch') cuddle = Permission.objects.create( name='Can cuddle', content_type=ct, codename='cuddle') u.user_permissions.add(discuss, touch, cuddle) with self.assertNumQueries(1): data1 = [p.codename for p in u.user_permissions.all()] self.assertListEqual(data1, ['cuddle', 'discuss', 'touch']) touch.name = 'Can lick' touch.codename = 'lick' touch.save() with self.assertNumQueries(1): data2 = [p.codename for p in u.user_permissions.all()] self.assertListEqual(data2, ['cuddle', 'discuss', 'lick']) Permission.objects.filter(pk=discuss.pk).update( name='Can finger', codename='finger') with self.assertNumQueries(1): data3 = [p.codename for p in u.user_permissions.all()] self.assertListEqual(data3, ['cuddle', 'finger', 'lick']) def test_invalidate_aggregate(self): with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 0) with self.assertNumQueries(1): u = User.objects.create_user('test') with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 0) with self.assertNumQueries(1): Test.objects.create(name='test1') with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 0) with self.assertNumQueries(1): Test.objects.create(name='test2', owner=u) with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 1) with self.assertNumQueries(1): Test.objects.create(name='test3') with self.assertNumQueries(1): self.assertEqual(User.objects.aggregate(n=Count('test'))['n'], 1) def test_invalidate_annotate(self): with self.assertNumQueries(1): data1 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data1, []) with self.assertNumQueries(1): Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data2, []) with self.assertNumQueries(2): user1 = User.objects.create_user('user1') user2 = User.objects.create_user('user2') with self.assertNumQueries(1): data3 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data3, [user1, user2]) self.assertListEqual([u.n for u in data3], [0, 0]) with self.assertNumQueries(1): Test.objects.create(name='test2', owner=user1) with self.assertNumQueries(1): data4 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data4, [user1, user2]) self.assertListEqual([u.n for u in data4], [1, 0]) with self.assertNumQueries(2 if self.is_sqlite else 1): Test.objects.bulk_create([ Test(name='test3', owner=user1), Test(name='test4', owner=user2), Test(name='test5', owner=user1), Test(name='test6', owner=user2), ]) with self.assertNumQueries(1): data5 = list(User.objects.annotate(n=Count('test')).order_by('pk')) self.assertListEqual(data5, [user1, user2]) self.assertListEqual([u.n for u in data5], [3, 2]) def test_invalidate_subquery(self): with self.assertNumQueries(1): data1 = list(Test.objects.filter(owner__in=User.objects.all())) self.assertListEqual(data1, []) u = User.objects.create_user('test') with self.assertNumQueries(1): data2 = list(Test.objects.filter(owner__in=User.objects.all())) self.assertListEqual(data2, []) t = Test.objects.create(name='test', owner=u) with self.assertNumQueries(1): data3 = list(Test.objects.filter(owner__in=User.objects.all())) self.assertListEqual(data3, [t]) with self.assertNumQueries(1): data4 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data4, []) g = Group.objects.create(name='test_group') with self.assertNumQueries(1): data5 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data5, []) p = Permission.objects.first() g.permissions.add(p) with self.assertNumQueries(1): data6 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data6, []) u.groups.add(g) with self.assertNumQueries(1): data7 = list( Test.objects.filter( owner__groups__permissions__in=Permission.objects.all() ).distinct()) self.assertListEqual(data7, [t]) with self.assertNumQueries(1): data8 = list( User.objects.filter(user_permissions__in=g.permissions.all()) ) self.assertListEqual(data8, []) u.user_permissions.add(p) with self.assertNumQueries(1): data9 = list( User.objects.filter(user_permissions__in=g.permissions.all()) ) self.assertListEqual(data9, [u]) g.permissions.remove(p) with self.assertNumQueries(1): data10 = list( User.objects.filter(user_permissions__in=g.permissions.all()) ) self.assertListEqual(data10, []) with self.assertNumQueries(1): data11 = list(User.objects.exclude(user_permissions=None)) self.assertListEqual(data11, [u]) u.user_permissions.clear() with self.assertNumQueries(1): data12 = list(User.objects.exclude(user_permissions=None)) self.assertListEqual(data12, []) def test_invalidate_nested_subqueries(self): with self.assertNumQueries(1): data1 = list( User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) self.assertListEqual(data1, []) u = User.objects.create_user('test') with self.assertNumQueries(1): data2 = list( User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) self.assertListEqual(data2, []) p = Permission.objects.first() u.user_permissions.add(p) with self.assertNumQueries(1): data3 = list( User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) self.assertListEqual(data3, [u]) with self.assertNumQueries(1): data4 = list( User.objects.filter( pk__in=User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) ) self.assertListEqual(data4, [u]) u.user_permissions.remove(p) with self.assertNumQueries(1): data5 = list( User.objects.filter( pk__in=User.objects.filter( pk__in=User.objects.filter( user_permissions__in=Permission.objects.all() ) ) ) ) self.assertListEqual(data5, []) def test_invalidate_raw_subquery(self): permission = Permission.objects.first() with self.assertNumQueries(0): raw_sql = RawSQL('SELECT id FROM auth_permission WHERE id = %s', (permission.pk,)) with self.assertNumQueries(1): data1 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data1, []) test = Test.objects.create(name='test', permission=permission) with self.assertNumQueries(1): data2 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data2, [test]) permission.save() with self.assertNumQueries(1): data3 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data3, [test]) test.delete() with self.assertNumQueries(1): data4 = list(Test.objects.filter(permission=raw_sql)) self.assertListEqual(data4, []) def test_invalidate_nested_raw_subquery(self): permission = Permission.objects.first() with self.assertNumQueries(0): raw_sql = RawSQL('SELECT id FROM auth_permission WHERE id = %s', (permission.pk,)) with self.assertNumQueries(1): data1 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data1, []) test = Test.objects.create(name='test', permission=permission) with self.assertNumQueries(1): data2 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data2, [test]) permission.save() with self.assertNumQueries(1): data3 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data3, [test]) test.delete() with self.assertNumQueries(1): data4 = list(Test.objects.filter( pk__in=Test.objects.filter(permission=raw_sql))) self.assertListEqual(data4, []) def test_invalidate_select_related(self): with self.assertNumQueries(1): data1 = list(Test.objects.select_related('owner')) self.assertListEqual(data1, []) with self.assertNumQueries(2): u1 = User.objects.create_user('test1') u2 = User.objects.create_user('test2') with self.assertNumQueries(1): data2 = list(Test.objects.select_related('owner')) self.assertListEqual(data2, []) with self.assertNumQueries(2 if self.is_sqlite else 1): Test.objects.bulk_create([ Test(name='test1', owner=u1), Test(name='test2', owner=u2), Test(name='test3', owner=u2), Test(name='test4', owner=u1), ]) with self.assertNumQueries(1): data3 = list(Test.objects.select_related('owner')) self.assertEqual(data3[0].owner, u1) self.assertEqual(data3[1].owner, u2) self.assertEqual(data3[2].owner, u2) self.assertEqual(data3[3].owner, u1) with self.assertNumQueries(2 if self.is_sqlite else 1): Test.objects.filter(name__in=['test1', 'test2']).delete() with self.assertNumQueries(1): data4 = list(Test.objects.select_related('owner')) self.assertEqual(data4[0].owner, u2) self.assertEqual(data4[1].owner, u1) def test_invalidate_prefetch_related(self): with self.assertNumQueries(1): data1 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data1, []) with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data2, [t1]) self.assertEqual(data2[0].owner, None) with self.assertNumQueries(2): u = User.objects.create_user('user') t1.owner = u t1.save() with self.assertNumQueries(2): data3 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data3, [t1]) self.assertEqual(data3[0].owner, u) self.assertListEqual(list(data3[0].owner.groups.all()), []) with self.assertNumQueries( 8 if self.is_sqlite and DJANGO_VERSION[0] == 2 and DJANGO_VERSION[1] == 2 else 4 if self.is_postgresql and DJANGO_VERSION[0] > 2 else 4 if self.is_mysql and DJANGO_VERSION[0] > 2 else 6 ): group = Group.objects.create(name='test_group') permissions = list(Permission.objects.all()[:5]) group.permissions.add(*permissions) u.groups.add(group) with self.assertNumQueries(2): data4 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data4, [t1]) owner = data4[0].owner self.assertEqual(owner, u) groups = list(owner.groups.all()) self.assertListEqual(groups, [group]) self.assertListEqual(list(groups[0].permissions.all()), permissions) with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data5 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertListEqual(data5, [t1, t2]) owners = [t.owner for t in data5 if t.owner is not None] self.assertListEqual(owners, [u]) groups = [g for o in owners for g in o.groups.all()] self.assertListEqual(groups, [group]) data5_permissions = [p for g in groups for p in g.permissions.all()] self.assertListEqual(data5_permissions, permissions) with self.assertNumQueries(1): permissions[0].save() with self.assertNumQueries(1): list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) with self.assertNumQueries(1): group.name = 'modified_test_group' group.save() with self.assertNumQueries(2): data6 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) g = list(data6[0].owner.groups.all())[0] self.assertEqual(g.name, 'modified_test_group') with self.assertNumQueries(1): User.objects.update(username='modified_user') with self.assertNumQueries(2): data7 = list(Test.objects.select_related('owner') .prefetch_related('owner__groups__permissions')) self.assertEqual(data7[0].owner.username, 'modified_user') @skipUnlessDBFeature('has_select_for_update') def test_invalidate_select_for_update(self): with self.assertNumQueries(1): Test.objects.bulk_create([Test(name='test1'), Test(name='test2')]) with self.assertNumQueries(1): with transaction.atomic(): data1 = list(Test.objects.select_for_update()) self.assertListEqual([t.name for t in data1], ['test1', 'test2']) with self.assertNumQueries(1): with transaction.atomic(): qs = Test.objects.select_for_update() qs.update(name='test3') with self.assertNumQueries(1): with transaction.atomic(): data2 = list(Test.objects.select_for_update()) self.assertListEqual([t.name for t in data2], ['test3'] * 2) def test_invalidate_extra_select(self): user = User.objects.create_user('user') t1 = Test.objects.create(name='test1', owner=user, public=True) username_length_sql = """ SELECT LENGTH(%(user_table)s.username) FROM %(user_table)s WHERE %(user_table)s.id = %(test_table)s.owner_id """ % {'user_table': User._meta.db_table, 'test_table': Test._meta.db_table} with self.assertNumQueries(1): data1 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data1, [t1]) self.assertListEqual([o.username_length for o in data1], [4]) Test.objects.update(public=False) with self.assertNumQueries(1): data2 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data2, [t1]) self.assertListEqual([o.username_length for o in data2], [4]) admin = User.objects.create_superuser('admin', 'admin@test.me', 'password') with self.assertNumQueries(1): data3 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data3, [t1]) self.assertListEqual([o.username_length for o in data3], [4]) t2 = Test.objects.create(name='test2', owner=admin) with self.assertNumQueries(1): data4 = list(Test.objects.extra( select={'username_length': username_length_sql})) self.assertListEqual(data4, [t1, t2]) self.assertListEqual([o.username_length for o in data4], [4, 5]) def test_invalidate_having(self): def _query(): return User.objects.annotate(n=Count('user_permissions')).filter(n__gte=1) with self.assertNumQueries(1): data1 = list(_query()) self.assertListEqual(data1, []) u = User.objects.create_user('user') with self.assertNumQueries(1): data2 = list(_query()) self.assertListEqual(data2, []) p = Permission.objects.first() p.save() with self.assertNumQueries(1): data3 = list(_query()) self.assertListEqual(data3, []) u.user_permissions.add(p) with self.assertNumQueries(1): data3 = list(_query()) self.assertListEqual(data3, [u]) with self.assertNumQueries(1): self.assertEqual(_query().count(), 1) u.user_permissions.clear() with self.assertNumQueries(1): self.assertEqual(_query().count(), 0) def test_invalidate_extra_where(self): sql_condition = ("owner_id IN " "(SELECT id FROM auth_user WHERE username = 'admin')") with self.assertNumQueries(1): data1 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data1, []) admin = User.objects.create_superuser('admin', 'admin@test.me', 'password') with self.assertNumQueries(1): data2 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data2, []) t = Test.objects.create(name='test', owner=admin) with self.assertNumQueries(1): data3 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data3, [t]) admin.username = 'modified' admin.save() with self.assertNumQueries(1): data4 = list(Test.objects.extra(where=[sql_condition])) self.assertListEqual(data4, []) def test_invalidate_extra_tables(self): with self.assertNumQueries(1): User.objects.create_user('user1') with self.assertNumQueries(1): data1 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data1, []) with self.assertNumQueries(1): t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data2, [t1]) with self.assertNumQueries(1): t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data3 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data3, [t1, t2]) with self.assertNumQueries(1): User.objects.create_user('user2') with self.assertNumQueries(1): data4 = list(Test.objects.all().extra(tables=['auth_user'])) self.assertListEqual(data4, [t1, t1, t2, t2]) def test_invalidate_extra_order_by(self): with self.assertNumQueries(1): data1 = list(Test.objects.extra(order_by=['-cachalot_test.name'])) self.assertListEqual(data1, []) t1 = Test.objects.create(name='test1') with self.assertNumQueries(1): data2 = list(Test.objects.extra(order_by=['-cachalot_test.name'])) self.assertListEqual(data2, [t1]) t2 = Test.objects.create(name='test2') with self.assertNumQueries(1): data2 = list(Test.objects.extra(order_by=['-cachalot_test.name'])) self.assertListEqual(data2, [t2, t1]) def test_invalidate_table_inheritance(self): with self.assertNumQueries(1): with self.assertRaises(TestChild.DoesNotExist): TestChild.objects.get() with self.assertNumQueries(3 if self.is_sqlite else 2): t_child = TestChild.objects.create(name='test_child') with self.assertNumQueries(1): self.assertEqual(TestChild.objects.get(), t_child) with self.assertNumQueries(1): TestParent.objects.filter(pk=t_child.pk).update(name='modified') with self.assertNumQueries(1): modified_t_child = TestChild.objects.get() self.assertEqual(modified_t_child.pk, t_child.pk) self.assertEqual(modified_t_child.name, 'modified') with self.assertNumQueries(2): TestChild.objects.filter(pk=t_child.pk).update(name='modified2') with self.assertNumQueries(1): modified2_t_child = TestChild.objects.get() self.assertEqual(modified2_t_child.pk, t_child.pk) self.assertEqual(modified2_t_child.name, 'modified2') def test_raw_insert(self): with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), []) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test1', %s)", [True]) with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test1']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( "INSERT INTO cachalot_test (name, public) " "VALUES ('test2', %s)", [True]) with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test1', 'test2']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.executemany( "INSERT INTO cachalot_test (name, public) " "VALUES ('test3', %s)", [[True]]) with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test1', 'test2', 'test3']) def test_raw_update(self): with self.assertNumQueries(1): Test.objects.create(name='test') with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute("UPDATE cachalot_test SET name = 'new name';") with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['new name']) def test_raw_delete(self): with self.assertNumQueries(1): Test.objects.create(name='test') with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), ['test']) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute('DELETE FROM cachalot_test;') with self.assertNumQueries(1): self.assertListEqual( list(Test.objects.values_list('name', flat=True)), []) def test_raw_create(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) try: with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( 'CREATE INDEX tmp_index ON cachalot_test(name);') with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) finally: with connection.cursor() as cursor: cursor.execute('DROP INDEX tmp_index ON cachalot_test;' if self.is_mysql else 'DROP INDEX tmp_index;') @skipIf(connection.vendor == 'sqlite', 'SQLite does not support column drop, ' 'making it hard to test this.') def test_raw_alter(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) try: with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute( 'ALTER TABLE cachalot_test ADD COLUMN tmp INTEGER;') with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) finally: with connection.cursor() as cursor: cursor.execute('ALTER TABLE cachalot_test DROP COLUMN tmp;') @skipUnless( connection.vendor == 'postgresql', 'SQLite & MySQL do not revert schema changes in a transaction, ' 'making it hard to test this.') @transaction.atomic def test_raw_drop(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) with self.assertNumQueries(1): with connection.cursor() as cursor: cursor.execute('DROP TABLE cachalot_test;') # The table no longer exists, so an error should be raised # after querying it. with self.assertRaises((ProgrammingError, OperationalError)): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) class DatabaseCommandTestCase(TestUtilsMixin, TransactionTestCase): def setUp(self): self.t = Test.objects.create(name='test1') def test_flush(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t]) call_command('flush', verbosity=0, interactive=False) self.force_reopen_connection() with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), []) def test_loaddata(self): with self.assertNumQueries(1): self.assertListEqual(list(Test.objects.all()), [self.t]) call_command('loaddata', 'cachalot/tests/loaddata_fixture.json', verbosity=0) self.force_reopen_connection() with self.assertNumQueries(1): self.assertListEqual([t.name for t in Test.objects.all()], ['test1', 'test2']) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/transaction.py0000644000175100001710000000220300000000000020550 0ustar00runnerdockerfrom .settings import cachalot_settings class AtomicCache(dict): def __init__(self, parent_cache, db_alias): super(AtomicCache, self).__init__() self.parent_cache = parent_cache self.db_alias = db_alias self.to_be_invalidated = set() def set(self, k, v, timeout): self[k] = v def get_many(self, keys): data = {k: self[k] for k in keys if k in self} missing_keys = set(keys) missing_keys.difference_update(data) data.update(self.parent_cache.get_many(missing_keys)) return data def set_many(self, data, timeout): self.update(data) def commit(self): # We import this here to avoid a circular import issue. from .utils import _invalidate_tables if self: self.parent_cache.set_many( self, cachalot_settings.CACHALOT_TIMEOUT) # The previous `set_many` is not enough. The parent cache needs to be # invalidated in case another transaction occurred in the meantime. _invalidate_tables(self.parent_cache, self.db_alias, self.to_be_invalidated) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/cachalot/utils.py0000644000175100001710000002024600000000000017372 0ustar00runnerdockerimport datetime from decimal import Decimal from hashlib import sha1 from time import time from typing import TYPE_CHECKING from uuid import UUID from django.contrib.postgres.functions import TransactionNow from django.db import connections from django.db.models import Exists, QuerySet, Subquery from django.db.models.expressions import RawSQL from django.db.models.functions import Now from django.db.models.sql import Query, AggregateQuery from django.db.models.sql.where import ExtraWhere, WhereNode, NothingNode from .settings import ITERABLES, cachalot_settings from .transaction import AtomicCache if TYPE_CHECKING: from django.db.models.expressions import BaseExpression class UncachableQuery(Exception): pass class IsRawQuery(Exception): pass CACHABLE_PARAM_TYPES = { bool, int, float, Decimal, bytearray, bytes, str, type(None), datetime.date, datetime.time, datetime.datetime, datetime.timedelta, UUID, } UNCACHABLE_FUNCS = {Now, TransactionNow} try: # TODO Drop after Dj30 drop from django.contrib.postgres.fields.jsonb import JsonAdapter CACHABLE_PARAM_TYPES.update((JsonAdapter,)) except ImportError: pass try: from psycopg2 import Binary from psycopg2.extras import ( NumericRange, DateRange, DateTimeRange, DateTimeTZRange, Inet, Json) except ImportError: pass else: CACHABLE_PARAM_TYPES.update(( Binary, NumericRange, DateRange, DateTimeRange, DateTimeTZRange, Inet, Json,)) def check_parameter_types(params): for p in params: cl = p.__class__ if cl not in CACHABLE_PARAM_TYPES: if cl in ITERABLES: check_parameter_types(p) elif cl is dict: check_parameter_types(p.items()) else: raise UncachableQuery def get_query_cache_key(compiler): """ Generates a cache key from a SQLCompiler. This cache key is specific to the SQL query and its context (which database is used). The same query in the same context (= the same database) must generate the same cache key. :arg compiler: A SQLCompiler that will generate the SQL query :type compiler: django.db.models.sql.compiler.SQLCompiler :return: A cache key :rtype: int """ sql, params = compiler.as_sql() check_parameter_types(params) cache_key = '%s:%s:%s' % (compiler.using, sql, [str(p) for p in params]) return sha1(cache_key.encode('utf-8')).hexdigest() def get_table_cache_key(db_alias, table): """ Generates a cache key from a SQL table. :arg db_alias: Alias of the used database :type db_alias: str or unicode :arg table: Name of the SQL table :type table: str or unicode :return: A cache key :rtype: int """ cache_key = '%s:%s' % (db_alias, table) return sha1(cache_key.encode('utf-8')).hexdigest() def _get_tables_from_sql(connection, lowercased_sql): return {t for t in connection.introspection.django_table_names() + cachalot_settings.CACHALOT_ADDITIONAL_TABLES if t in lowercased_sql} def _find_rhs_lhs_subquery(side): h_class = side.__class__ if h_class is Query: return side elif h_class is QuerySet: return side.query elif h_class in (Subquery, Exists): # Subquery allows QuerySet & Query try: return side.query.query if side.query.__class__ is QuerySet else side.query except AttributeError: # TODO Remove try/except closure after drop Django 2.2 try: return side.queryset.query except AttributeError: return None elif h_class in UNCACHABLE_FUNCS: raise UncachableQuery def _find_subqueries_in_where(children): for child in children: child_class = child.__class__ if child_class is WhereNode: for grand_child in _find_subqueries_in_where(child.children): yield grand_child elif child_class is ExtraWhere: raise IsRawQuery elif child_class is NothingNode: pass else: rhs = _find_rhs_lhs_subquery(child.rhs) if rhs is not None: yield rhs lhs = _find_rhs_lhs_subquery(child.lhs) if lhs is not None: yield lhs def is_cachable(table): whitelist = cachalot_settings.CACHALOT_ONLY_CACHABLE_TABLES if whitelist and table not in whitelist: return False return table not in cachalot_settings.CACHALOT_UNCACHABLE_TABLES def are_all_cachable(tables): whitelist = cachalot_settings.CACHALOT_ONLY_CACHABLE_TABLES if whitelist and not tables.issubset(whitelist): return False return tables.isdisjoint(cachalot_settings.CACHALOT_UNCACHABLE_TABLES) def filter_cachable(tables): whitelist = cachalot_settings.CACHALOT_ONLY_CACHABLE_TABLES tables = tables.difference(cachalot_settings.CACHALOT_UNCACHABLE_TABLES) if whitelist: return tables.intersection(whitelist) return tables def _flatten(expression: "BaseExpression"): """ Recursively yield this expression and all subexpressions, in depth-first order. Taken from Django 3.2 as the previous Django versions don’t check for existence of flatten. """ yield expression for expr in expression.get_source_expressions(): if expr: if hasattr(expr, 'flatten'): yield from _flatten(expr) else: yield expr def _get_tables(db_alias, query): if query.select_for_update or ( not cachalot_settings.CACHALOT_CACHE_RANDOM and '?' in query.order_by): raise UncachableQuery try: if query.extra_select: raise IsRawQuery # Gets all tables already found by the ORM. tables = set(query.table_map) tables.add(query.get_meta().db_table) # Gets tables in subquery annotations. for annotation in query.annotations.values(): if type(annotation) in UNCACHABLE_FUNCS: raise UncachableQuery for expression in _flatten(annotation): if isinstance(expression, Subquery): if hasattr(expression, "queryset"): tables.update(_get_tables(db_alias, expression.queryset.query)) else: tables.update(_get_tables(db_alias, expression.query)) elif isinstance(expression, RawSQL): sql = expression.as_sql(None, None)[0].lower() tables.update(_get_tables_from_sql(connections[db_alias], sql)) # Gets tables in WHERE subqueries. for subquery in _find_subqueries_in_where(query.where.children): tables.update(_get_tables(db_alias, subquery)) # Gets tables in HAVING subqueries. if isinstance(query, AggregateQuery): try: tables.update(_get_tables_from_sql(connections[db_alias], query.subquery)) except TypeError: # For Django 3.2+ tables.update(_get_tables(db_alias, query.inner_query)) # Gets tables in combined queries # using `.union`, `.intersection`, or `difference`. if query.combined_queries: for combined_query in query.combined_queries: tables.update(_get_tables(db_alias, combined_query)) except IsRawQuery: sql = query.get_compiler(db_alias).as_sql()[0].lower() tables = _get_tables_from_sql(connections[db_alias], sql) if not are_all_cachable(tables): raise UncachableQuery return tables def _get_table_cache_keys(compiler): db_alias = compiler.using get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN return [get_table_cache_key(db_alias, t) for t in _get_tables(db_alias, compiler.query)] def _invalidate_tables(cache, db_alias, tables): tables = filter_cachable(set(tables)) if not tables: return now = time() get_table_cache_key = cachalot_settings.CACHALOT_TABLE_KEYGEN cache.set_many( {get_table_cache_key(db_alias, t): now for t in tables}, cachalot_settings.CACHALOT_TIMEOUT) if isinstance(cache, AtomicCache): cache.to_be_invalidated.update(tables) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/django_cachalot.egg-info/0000755000175100001710000000000000000000000020670 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730621.0 django-cachalot-2.4.3/django_cachalot.egg-info/PKG-INFO0000644000175100001710000001765700000000000022005 0ustar00runnerdockerMetadata-Version: 1.1 Name: django-cachalot Version: 2.4.3 Summary: Caches your Django ORM queries and automatically invalidates them. Home-page: https://github.com/noripyt/django-cachalot Author: Bertrand Bordage, Andrew Chen Wang Author-email: acwangpython@gmail.com License: BSD Description: Django Cachalot =============== Caches your Django ORM queries and automatically invalidates them. Documentation: http://django-cachalot.readthedocs.io ---- .. image:: http://img.shields.io/pypi/v/django-cachalot.svg?style=flat-square&maxAge=3600 :target: https://pypi.python.org/pypi/django-cachalot .. image:: https://img.shields.io/pypi/pyversions/django-cachalot :target: https://django-cachalot.readthedocs.io/en/latest/ .. image:: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml/badge.svg :target: https://github.com/noripyt/django-cachalot/actions/workflows/ci.yml .. image:: http://img.shields.io/coveralls/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://coveralls.io/r/noripyt/django-cachalot?branch=master .. image:: http://img.shields.io/scrutinizer/g/noripyt/django-cachalot/master.svg?style=flat-square&maxAge=3600 :target: https://scrutinizer-ci.com/g/noripyt/django-cachalot/ .. image:: https://img.shields.io/discord/773656139207802881 :target: https://discord.gg/WFGFBk8rSU ---- Table of Contents: - Quickstart - Usage - Hacking - Benchmark - Third-Party Cache Comparison - Discussion Quickstart ---------- Cachalot officially supports Python 3.6-3.9 and Django 2.2 and 3.1-3.2 with the databases PostgreSQL, SQLite, and MySQL. Note: an upper limit on Django version is set for your safety. Please do not ignore it. Usage ----- #. ``pip install django-cachalot`` #. Add ``'cachalot',`` to your ``INSTALLED_APPS`` #. If you use multiple servers with a common cache server, `double check their clock synchronisation `_ #. If you modify data outside Django – typically after restoring a SQL database –, use the `manage.py command `_ #. Be aware of `the few other limits `_ #. If you use `django-debug-toolbar `_, you can add ``'cachalot.panels.CachalotPanel',`` to your ``DEBUG_TOOLBAR_PANELS`` #. Enjoy! Hacking ------- To start developing, install the requirements and run the tests via tox. Make sure you have the following services: * Memcached * Redis * PostgreSQL * MySQL For setup: #. Install: ``pip install -r requirements/hacking.txt`` #. For PostgreSQL: ``CREATE ROLE cachalot LOGIN SUPERUSER;`` #. Run: ``tox --current-env`` to run the test suite on your current Python version. #. You can also run specific databases and Django versions: ``tox -e py38-django3.1-postgresql-redis`` Benchmark --------- Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called "cachalot" on MySQL and PostgreSQL. Additionally, on PostgreSQL, you will need to create a role called "cachalot." You can also run the benchmark, and it'll raise errors with specific instructions for how to fix it. #. Install: ``pip install -r requirements/benchmark.txt`` #. Run: ``python benchmark.py`` The output will be in benchmark/TODAY'S_DATE/ TODO Create Docker-compose file to allow for easier running of data. Third-Party Cache Comparison ---------------------------- There are three main third party caches: cachalot, cache-machine, and cache-ops. Which do you use? We suggest a mix: TL;DR Use cachalot for cold or modified <50 times per minutes (Most people should stick with only cachalot since you most likely won't need to scale to the point of needing cache-machine added to the bowl). If you're an enterprise that already has huge statistics, then mixing cold caches for cachalot and your hot caches with cache-machine is the best mix. However, when performing joins with ``select_related`` and ``prefetch_related``, you can get a nearly 100x speed up for your initial deployment. Recall, cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, then the cachalot constantly invalidates the table and re-caches. Luckily caching is very efficient, it's just the cache invalidation part that kills all our systems. Look at Note 1 below to see how Reddit deals with it. Cachalot is more-or-less intended for cold caches or "just-right" conditions. If you find a partition library for Django (also authored but work-in-progress by `Andrew Chen Wang`_), then the caching will work better since sharding the cold/accessed-the-least records aren't invalidated as much. Cachalot is good when there are <50 modifications per minute on a hot cached table. This is mostly due to cache invalidation. It's the same with any cache, which is why we suggest you use cache-machine for hot caches. Cache-machine caches individual objects, taking up more in the memory store but invalidates those individual objects instead of the entire table like cachalot. Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot suck when stuck with a huge table that's modified rapidly? Since you've mixed your cold (90% of) with your hot (10% of) records, you're caching and invalidating an entire table. It's like trying to boil 1 ton of noodles inside ONE pot instead of 100 pots boiling 1 ton of noodles. Which is more efficient? The splitting up of them. Note 1: My personal experience with caches stems from Reddit's: https://redditblog.com/2017/01/17/caching-at-reddit/ Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools Discussion ---------- Help? Technical chat? `It's here on Discord `_. Legacy chats: - https://gitter.im/django-cachalot/Lobby - https://join.slack.com/t/cachalotdjango/shared_invite/zt-dd0tj27b-cIH6VlaSOjAWnTG~II5~qw .. _Andrew Chen Wang: https://github.com/Andrew-Chen-Wang .. image:: https://raw.github.com/noripyt/django-cachalot/master/django-cachalot.jpg Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Framework :: Django Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Framework :: Django :: 2.2 Classifier: Framework :: Django :: 3.1 Classifier: Framework :: Django :: 3.2 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Topic :: Internet :: WWW/HTTP ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730621.0 django-cachalot-2.4.3/django_cachalot.egg-info/SOURCES.txt0000644000175100001710000000236600000000000022563 0ustar00runnerdockerCHANGELOG.rst LICENSE MANIFEST.in README.rst requirements.txt setup.py cachalot/__init__.py cachalot/api.py cachalot/apps.py cachalot/cache.py cachalot/jinja2ext.py cachalot/models.py cachalot/monkey_patch.py cachalot/panels.py cachalot/settings.py cachalot/signals.py cachalot/transaction.py cachalot/utils.py cachalot/management/__init__.py cachalot/management/commands/__init__.py cachalot/management/commands/invalidate_cachalot.py cachalot/templates/cachalot/panel.html cachalot/templatetags/__init__.py cachalot/templatetags/cachalot.py cachalot/tests/__init__.py cachalot/tests/api.py cachalot/tests/db_router.py cachalot/tests/debug_toolbar.py cachalot/tests/loaddata_fixture.json cachalot/tests/models.py cachalot/tests/multi_db.py cachalot/tests/postgres.py cachalot/tests/read.py cachalot/tests/settings.py cachalot/tests/signals.py cachalot/tests/test_utils.py cachalot/tests/thread_safety.py cachalot/tests/transaction.py cachalot/tests/write.py cachalot/tests/migrations/0001_initial.py cachalot/tests/migrations/__init__.py django_cachalot.egg-info/PKG-INFO django_cachalot.egg-info/SOURCES.txt django_cachalot.egg-info/dependency_links.txt django_cachalot.egg-info/not-zip-safe django_cachalot.egg-info/requires.txt django_cachalot.egg-info/top_level.txt././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730621.0 django-cachalot-2.4.3/django_cachalot.egg-info/dependency_links.txt0000644000175100001710000000000100000000000024736 0ustar00runnerdocker ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730621.0 django-cachalot-2.4.3/django_cachalot.egg-info/not-zip-safe0000644000175100001710000000000100000000000023116 0ustar00runnerdocker ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730621.0 django-cachalot-2.4.3/django_cachalot.egg-info/requires.txt0000644000175100001710000000002100000000000023261 0ustar00runnerdockerDjango<3.3,>=2.2 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730621.0 django-cachalot-2.4.3/django_cachalot.egg-info/top_level.txt0000644000175100001710000000001100000000000023412 0ustar00runnerdockercachalot ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/requirements.txt0000644000175100001710000000002100000000000017353 0ustar00runnerdockerDjango>=2.2,<3.3 ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1629730621.836695 django-cachalot-2.4.3/setup.cfg0000644000175100001710000000004600000000000015717 0ustar00runnerdocker[egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1629730609.0 django-cachalot-2.4.3/setup.py0000755000175100001710000000256300000000000015621 0ustar00runnerdocker#!/usr/bin/env python import os from setuptools import setup, find_packages from cachalot import __version__ CURRENT_PATH = os.path.abspath(os.path.dirname(__file__)) with open(os.path.join(CURRENT_PATH, 'requirements.txt')) as f: required = f.read().splitlines() setup( name='django-cachalot', version=__version__, author='Bertrand Bordage, Andrew Chen Wang', author_email='acwangpython@gmail.com', url='https://github.com/noripyt/django-cachalot', description='Caches your Django ORM queries and automatically invalidates them.', long_description=open('README.rst').read(), classifiers=[ 'Development Status :: 5 - Production/Stable', 'Framework :: Django', 'Intended Audience :: Developers', 'License :: OSI Approved :: BSD License', 'Operating System :: OS Independent', 'Framework :: Django :: 2.2', 'Framework :: Django :: 3.1', 'Framework :: Django :: 3.2', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.6', 'Programming Language :: Python :: 3.7', 'Programming Language :: Python :: 3.8', 'Programming Language :: Python :: 3.9', 'Topic :: Internet :: WWW/HTTP', ], license='BSD', packages=find_packages(), install_requires=required, include_package_data=True, zip_safe=False, )