===
API
===

Region
======

.. automodule:: dogpile.cache.region
    :members:

.. autofunction:: dogpile.cache.util.function_key_generator

Backend API
===========

See the section :ref:`creating_backends` for details on how to
register new backends or :ref:`changing_backend_behavior` for details on
how to alter the behavior of existing backends.

.. automodule:: dogpile.cache.api
    :members:

Backends
========

.. automodule:: dogpile.cache.backends.memory
    :members:

.. automodule:: dogpile.cache.backends.memcached
    :members:

.. automodule:: dogpile.cache.backends.redis
    :members:

.. automodule:: dogpile.cache.backends.file
    :members:

.. automodule:: dogpile.cache.proxy
    :members:

Plugins
=======

.. automodule:: dogpile.cache.plugins.mako_cache
    :members:

Utilities
=========

.. currentmodule:: dogpile.cache.util

.. autofunction:: function_key_generator

.. autofunction:: sha1_mangle_key

.. autofunction:: length_conditional_mangler

=========
Changelog
=========

.. changelog::
    :version: 0.5.1
    :released: Thu Oct 10 2013

    .. change::
        :tags: feature
        :tickets: 38

        The :meth:`.CacheRegion.invalidate` method now supports an option
        ``hard=True|False``.  A "hard" invalidation, equivalent to the
        existing functionality of :meth:`.CacheRegion.invalidate`, means
        :meth:`.CacheRegion.get_or_create` will not return the "old" value
        at all, forcing all getters to regenerate or wait for a
        regeneration.
        A "soft" invalidation means that getters can continue to return
        the old value until a new one is generated.

    .. change::
        :tags: feature
        :tickets: 40

        New dogpile-specific exception classes have been added, so that
        issues like "region already configured" and "region unconfigured"
        raise dogpile-specific exceptions.  Other exception classes have
        been made more specific.  Also added new accessor
        :attr:`.CacheRegion.is_configured`.  Pullreq courtesy Morgan
        Fainberg.

    .. change::
        :tags: bug

        Erroneously missed when the same change was made for ``set()``
        in 0.5.0, the Redis backend now uses ``pickle.HIGHEST_PROTOCOL``
        for the ``set_multi()`` method as well when producing pickles.
        Courtesy Łukasz Fidosz.

    .. change::
        :tags: bug, redis, py3k
        :tickets: 39

        Fixed an errant ``u''`` causing incompatibility in Python 3.2
        in the Redis backend, courtesy Jimmey Mabey.

    .. change::
        :tags: bug

        The :func:`.util.coerce_string_conf` method now correctly coerces
        negative integers and those with a leading + sign.  This
        previously prevented configuring a :class:`.CacheRegion` with an
        ``expiration_time`` of ``'-1'``.  Courtesy David Beitey.

    .. change::
        :tags: bug

        The ``refresh()`` method on
        :meth:`.CacheRegion.cache_multi_on_arguments` now supports the
        ``asdict`` flag.

.. changelog::
    :version: 0.5.0
    :released: Fri Jun 21 2013

    .. change::
        :tags: misc

        Source repository has been moved to git.

    .. change::
        :tags: bug

        The Redis backend now uses ``pickle.HIGHEST_PROTOCOL`` when
        producing pickles.  Courtesy Lx Yu.

    .. change::
        :tags: bug

        :meth:`.CacheRegion.cache_on_arguments` now has a new argument
        ``to_str``, which defaults to ``str()``.  It can be replaced with
        ``unicode()`` or other functions to support caching of functions
        that accept non-unicode arguments.  Initial patch courtesy Lx Yu.

    .. change::
        :tags: feature

        Now using the ``Lock`` included with the Python ``redis``
        backend, which adds ``lock_timeout`` and ``lock_sleep`` arguments
        to the :class:`.RedisBackend`.

    .. change::
        :tags: feature
        :tickets: 33, 35

        Added new methods :meth:`.CacheRegion.get_or_create_multi` and
        :meth:`.CacheRegion.cache_multi_on_arguments`, which make use of
        the :meth:`.CacheRegion.get_multi` and similar functions to store
        and retrieve multiple keys at once while maintaining dogpile
        semantics for each.

    .. change::
        :tags: feature
        :tickets: 36

        Added a method ``refresh()`` to functions decorated by
        :meth:`.CacheRegion.cache_on_arguments` and
        :meth:`.CacheRegion.cache_multi_on_arguments`, to complement
        ``invalidate()`` and ``set()``.

    .. change::
        :tags: feature
        :tickets: 13

        :meth:`.CacheRegion.configure` accepts an optional
        ``datetime.timedelta`` object for the ``expiration_time``
        argument as well as an integer, courtesy Jack Lutz.

    .. change::
        :tags: feature
        :tickets: 20

        The ``expiration_time`` argument passed to
        :meth:`.CacheRegion.cache_on_arguments` may be a callable, to
        return a dynamic timeout value.  Courtesy David Beitey.

    .. change::
        :tags: feature
        :tickets: 26

        Added support for simple augmentation of existing backends using
        the :class:`.ProxyBackend` class.  Thanks to Tim Hanus for the
        great effort with development, testing, and documentation.

    .. change::
        :tags: feature
        :pullreq: 14

        Full support for multivalue get/set/delete added, using
        :meth:`.CacheRegion.get_multi`, :meth:`.CacheRegion.set_multi`,
        :meth:`.CacheRegion.delete_multi`, courtesy Marcos Araujo
        Sobrinho.

    .. change::
        :tags: bug
        :tickets: 27

        Fixed bug where the "name" parameter for :class:`.CacheRegion`
        was ignored entirely.  Courtesy Wichert Akkerman.

.. changelog::
    :version: 0.4.3
    :released: Thu Apr 4 2013

    .. change::
        :tags: bug

        Added support for the ``cache_timeout`` Mako argument to the
        Mako plugin, which will pass the value to the
        ``expiration_time`` argument of
        :meth:`.CacheRegion.get_or_create`.
    .. change::
        :tags: feature
        :pullreq: 13

        :meth:`.CacheRegion.get_or_create` and
        :meth:`.CacheRegion.cache_on_arguments` now accept a new argument
        ``should_cache_fn``, which receives the value returned by the
        "creator" and then returns True or False, where True means
        "cache plus return" and False means "return the value but don't
        cache it."

.. changelog::
    :version: 0.4.2
    :released: Sat Jan 19 2013

    .. change::
        :tags: feature
        :pullreq: 10

        An "async creator" function can be specified to
        :class:`.CacheRegion` which allows the "creation" function to be
        called asynchronously or be substituted for another asynchronous
        creation scheme.  Courtesy Ralph Bean.

.. changelog::
    :version: 0.4.1
    :released: Sat Dec 15 2012

    .. change::
        :tags: feature
        :pullreq: 9

        The function decorated by
        :meth:`.CacheRegion.cache_on_arguments` now includes a ``set()``
        method, in addition to the existing ``invalidate()`` method.
        Like ``invalidate()``, it accepts a set of function arguments,
        but additionally accepts as the first positional argument a new
        value to place in the cache, to take the place of that key.
        Courtesy Antoine Bertin.

    .. change::
        :tags: bug
        :tickets: 15

        Fixed bug in DBM backend whereby if an error occurred during the
        "write" operation, the file lock, if enabled, would not be
        released, thereby deadlocking the app.

    .. change::
        :tags: bug
        :tickets: 12

        The :func:`.util.function_key_generator` used by the function
        decorator no longer coerces non-unicode arguments into a Python
        unicode object on Python 2.x; this causes failures on backends
        such as DBM which on Python 2.x apparently require bytestrings.
        The key_mangler is still needed if actual unicode arguments are
        being used by the decorated function, however.

    .. change::
        :tags: feature

        Redis backend now accepts an optional "url" argument, which will
        be passed to the new ``StrictRedis.from_url()`` method to
        determine connection info.  Courtesy Jon Rosebaugh.

    .. change::
        :tags: feature

        Redis backend now accepts an optional "password" argument.
        Courtesy Jon Rosebaugh.

    .. change::
        :tags: feature

        DBM backend has a "fallback" when calling ``dbm.get()`` to
        instead use dictionary access + KeyError, in the case that the
        "gdbm" backend is used which does not include ``.get()``.
        Courtesy Jon Rosebaugh.

.. changelog::
    :version: 0.4.0
    :released: Tue Oct 30 2012

    .. change::
        :tags: bug
        :tickets: 1

        Using dogpile.core 0.4.0 now; fixes a critical bug whereby
        dogpile pileup could occur on first value get across multiple
        processes, due to reliance upon a non-shared creation time.  This
        is a dogpile.core issue.

    .. change::
        :tags: bug

        Fixed missing ``__future__`` with_statement directive in
        region.py.

.. changelog::
    :version: 0.3.1
    :released: Tue Sep 25 2012

    .. change::
        :tags: bug

        Fixed the mako_cache plugin, which was not yet covered and
        wasn't implementing the mako plugin API correctly; fixed docs as
        well.  Courtesy Ben Hayden.

    .. change::
        :tags: bug

        Fixed setup so that the tests/* directory isn't yanked into the
        install.  Courtesy Ben Hayden.

.. changelog::
    :version: 0.3.0
    :released: Thu Jun 14 2012

    .. change::
        :tags: feature

        The ``get()`` method now checks expiration time by default.
        Use ``ignore_expiration=True`` to bypass this.

    .. change::
        :tags: feature
        :tickets: 7

        Added new ``invalidate()`` method.  Sets the current timestamp
        as a minimum value that all retrieved values must be created
        after.  Is honored by the ``get_or_create()`` and ``get()``
        methods.

    .. change::
        :tags: bug
        :tickets: 8

        Fixed bug whereby ``region.get()`` didn't work if the value
        wasn't present.

.. changelog::
    :version: 0.2.4
    :released:

    .. change::

        Fixed py3k issue with config string coerce, courtesy Alexander
        Fedorov.

.. changelog::
    :version: 0.2.3
    :released: Wed May 16 2012

    .. change::
        :tickets: 3

        Support "min_compress_len" and "memcached_expire_time" with
        the python-memcached backend.  Tests courtesy Justin Azoff.
    .. change::
        :tickets: 4

        Add support for coercion of string config values to Python
        objects - ints, "false", "true", "None".

    .. change::
        :tickets: 5

        Added support to the DBM file lock to allow reentrant access per
        key within a single thread, so that even though the DBM backend
        locks for the whole file, a creation function that calls upon a
        different key in the cache can still proceed.

    .. change::

        Fixed DBM glitch where multiple readers could be serialized.

    .. change::

        Adjust bmemcached backend to work with newly-repaired bmemcached
        calling API (see bmemcached ef206ed4473fec3b639e).

.. changelog::
    :version: 0.2.2
    :released: Thu Apr 19 2012

    .. change::

        Add Redis backend, courtesy Ollie Rutherfurd.

.. changelog::
    :version: 0.2.1
    :released: Sun Apr 15 2012

    .. change::

        Move tests into the tests/cache namespace.

    .. change::

        py3k compatibility is in place now; no 2to3 needed.

.. changelog::
    :version: 0.2.0
    :released: Sat Apr 14 2012

    .. change::

        Based on dogpile.core now, to get the package namespace thing
        worked out.

.. changelog::
    :version: 0.1.1
    :released: Tue Apr 10 2012

    .. change::

        Fixed the ``configure_from_config()`` method of region and
        backend which wasn't working.  Courtesy Christian Klinger.

.. changelog::
    :version: 0.1.0
    :released: Sun Apr 08 2012

    .. change::

        Initial release.

    .. change::

        Includes a pylibmc backend and a plain dictionary backend.

============
Front Matter
============

Information about the dogpile.cache project.

Project Homepage
================

dogpile.cache is hosted on `Bitbucket
<https://bitbucket.org/zzzeek/dogpile.cache>`_ - the lead project page
is at https://bitbucket.org/zzzeek/dogpile.cache.  Source code is
tracked here using Git.

.. versionchanged:: 0.5.0
    Moved source repository to git.
Releases and project status are available on Pypi at
http://pypi.python.org/pypi/dogpile.cache.

The most recent published version of this documentation should be at
http://dogpilecache.readthedocs.org.

Installation
============

Install released versions of dogpile.cache from the Python package index
with pip or a similar tool::

    pip install dogpile.cache

Installation via source distribution is via the ``setup.py`` script::

    python setup.py install

Community
=========

dogpile.cache is developed by Mike Bayer, and is loosely associated with
the Pylons Project.  As dogpile.cache's usage increases, it is
anticipated that the Pylons mailing list and IRC channel will become the
primary channels for support.

Bugs
====

Bugs and feature enhancements to dogpile.cache should be reported on the
Bitbucket issue tracker.  If you're not sure that a particular issue is
specific to either dogpile.cache or dogpile.core, posting to the
dogpile.cache tracker is likely the better place to post first.

* dogpile.cache issue tracker (post here if unsure)
* dogpile.core issue tracker

=========================================
Welcome to dogpile.cache's documentation!
=========================================

dogpile.cache provides a simple caching pattern based on the
dogpile.core locking system, including rudimentary backends.  It
effectively completes the replacement of Beaker as far as caching
(though **not** HTTP sessions) is concerned, providing an open-ended,
simple, and higher-performing pattern to configure and use cache
backends.  New backends are very easy to create and use; users are
encouraged to adapt the provided backends for their own needs, as high
volume caching requires lots of tweaks and adjustments specific to an
application and its environment.
.. toctree::
    :maxdepth: 2

    front
    usage
    api
    changelog

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

===========
Usage Guide
===========

Overview
========

At the time of this writing, popular key/value servers include
Memcached, Redis, and Riak.  While these tools all have different usage
focuses, they all have in common that the storage model is based on the
retrieval of a value based on a key; as such, they are all potentially
suitable for caching, particularly Memcached, which is first and
foremost designed for caching.

With a caching system in mind, dogpile.cache provides an interface to a
particular Python API targeted at that system.

A dogpile.cache configuration consists of the following components:

* A *region*, which is an instance of :class:`.CacheRegion`, and
  defines the configuration details for a particular cache backend.
  The :class:`.CacheRegion` can be considered the "front end" used by
  applications.

* A *backend*, which is an instance of :class:`.CacheBackend`,
  describing how values are stored and retrieved from a backend.  This
  interface specifies only :meth:`~.CacheBackend.get`,
  :meth:`~.CacheBackend.set` and :meth:`~.CacheBackend.delete`.  The
  actual kind of :class:`.CacheBackend` in use for a particular
  :class:`.CacheRegion` is determined by the underlying Python API
  being used to talk to the cache, such as Pylibmc.  The
  :class:`.CacheBackend` is instantiated behind the scenes and not
  directly accessed by applications under normal circumstances.

* Value generation functions.  These are user-defined functions that
  generate new values to be placed in the cache.
  While dogpile.cache offers the usual "set" approach of placing data
  into the cache, the usual mode of usage is to only instruct it to
  "get" a value, passing it a *creation function* which will be used to
  generate a new value if and only if one is needed.  This
  "get-or-create" pattern is the entire key to the "Dogpile" system,
  which coordinates a single value creation operation among many
  concurrent get operations for a particular key, eliminating the issue
  of an expired value being redundantly re-generated by many workers
  simultaneously.

Rudimentary Usage
=================

dogpile.cache includes a Pylibmc backend.  A basic configuration looks
like::

    from dogpile.cache import make_region

    region = make_region().configure(
        'dogpile.cache.pylibmc',
        expiration_time = 3600,
        arguments = {
            'url': ["127.0.0.1"],
        }
    )

    @region.cache_on_arguments()
    def load_user_info(user_id):
        return some_database.lookup_user_by_id(user_id)

.. sidebar:: pylibmc

    In this section, we're illustrating Memcached usage using the
    pylibmc backend, which is a high performing Python library for
    Memcached.  It can be compared to the python-memcached client,
    which is also an excellent product.  Pylibmc is written against
    Memcached's native API so is markedly faster, though might be
    considered to have rougher edges.  The API is actually a bit more
    verbose to allow for correct multithreaded usage.

Above, we create a :class:`.CacheRegion` using the :func:`.make_region`
function, then apply the backend configuration via the
:meth:`.CacheRegion.configure` method, which returns the region.  The
name of the backend is the only argument required by
:meth:`.CacheRegion.configure` itself, in this case
``dogpile.cache.pylibmc``.  However, in this specific case, the
``pylibmc`` backend also requires that the URL of the memcached server
be passed within the ``arguments`` dictionary.

The configuration is separated into two sections.
Upon construction via :func:`.make_region`, the :class:`.CacheRegion`
object is available, typically at module import time, for usage in
decorating functions.  Additional configuration details passed to
:meth:`.CacheRegion.configure` are typically loaded from a configuration
file and therefore not necessarily available until runtime, hence the
two-step configuration process.

Key arguments passed to :meth:`.CacheRegion.configure` include
*expiration_time*, which is the expiration time passed to the Dogpile
lock, and *arguments*, which are arguments used directly by the
backend - in this case we are using arguments that are passed directly
to the pylibmc module.

Region Configuration
====================

The :func:`.make_region` function currently calls the
:class:`.CacheRegion` constructor directly.

.. autoclass:: dogpile.cache.region.CacheRegion
    :noindex:

Once you have a :class:`.CacheRegion`, the
:meth:`.CacheRegion.cache_on_arguments` method can be used to decorate
functions, but the cache itself can't be used until
:meth:`.CacheRegion.configure` is called.  The interface for that
method is as follows:

.. automethod:: dogpile.cache.region.CacheRegion.configure
    :noindex:

The :class:`.CacheRegion` can also be configured from a dictionary,
using the :meth:`.CacheRegion.configure_from_config` method:

.. automethod:: dogpile.cache.region.CacheRegion.configure_from_config
    :noindex:

Using a Region
==============

The :class:`.CacheRegion` object is our front-end interface to a cache.
It includes the following methods:

.. automethod:: dogpile.cache.region.CacheRegion.get
    :noindex:

.. automethod:: dogpile.cache.region.CacheRegion.get_or_create
    :noindex:

.. automethod:: dogpile.cache.region.CacheRegion.set
    :noindex:

.. automethod:: dogpile.cache.region.CacheRegion.delete
    :noindex:

.. automethod:: dogpile.cache.region.CacheRegion.cache_on_arguments
    :noindex:

.. _creating_backends:

Creating Backends
=================

Backends are located using the setuptools entrypoint system.
To make life easier for writers of ad-hoc backends, a helper function
is included which registers any backend in the same way as if it were
part of the existing sys.path.

For example, to create a backend called ``DictionaryBackend``, we
subclass :class:`.CacheBackend`::

    from dogpile.cache.api import CacheBackend, NO_VALUE

    class DictionaryBackend(CacheBackend):
        def __init__(self, arguments):
            self.cache = {}

        def get(self, key):
            return self.cache.get(key, NO_VALUE)

        def set(self, key, value):
            self.cache[key] = value

        def delete(self, key):
            self.cache.pop(key)

Then make sure the class is available underneath the entrypoint
``dogpile.cache``.  If we did this in a ``setup.py`` file, it would be
in ``setup()`` as::

    entry_points="""
    [dogpile.cache]
    dictionary = mypackage.mybackend:DictionaryBackend
    """

Alternatively, if we want to register the plugin in the same process
space without bothering to install anything, we can use
``register_backend``::

    from dogpile.cache import register_backend

    register_backend("dictionary", "mypackage.mybackend", "DictionaryBackend")

Our new backend would be usable in a region like this::

    from dogpile.cache import make_region

    region = make_region().configure("dictionary")

    data = region.set("somekey", "somevalue")

The values we receive for the backend here are instances of
``CachedValue``.  This is a tuple subclass of length two, of the form::

    (payload, metadata)

Where "payload" is the thing being cached, and "metadata" is
information we store in the cache - a dictionary which currently has
just the "creation time" and a "version identifier" as key/values.  If
the cache backend requires serialization, pickle or similar can be used
on the tuple - the "metadata" portion will always be a small and easily
serializable Python structure.

.. _changing_backend_behavior:

Changing Backend Behavior
=========================

The :class:`.ProxyBackend` is a decorator class provided to easily
augment existing backend behavior without having to extend the original
class.
Using a decorator class is also advantageous as it allows us to share
the altered behavior between different backends.

Proxies are added to the :class:`.CacheRegion` object using the
:meth:`.CacheRegion.configure` method.  Only the overridden methods
need to be specified, and the real backend can be accessed with the
``self.proxied`` object from inside the :class:`.ProxyBackend`.

For example, a simple class to log all calls to ``.set()`` would look
like this::

    from dogpile.cache.proxy import ProxyBackend

    import logging
    log = logging.getLogger(__name__)

    class LoggingProxy(ProxyBackend):
        def set(self, key, value):
            log.debug('Setting Cache Key: %s' % key)
            self.proxied.set(key, value)

:class:`.ProxyBackend` can be configured to optionally take arguments
(as long as the :meth:`.ProxyBackend.__init__` method is called
properly, either directly or via ``super()``).  In the example below,
the ``RetryDeleteProxy`` class accepts a ``retry_count`` parameter on
initialization.  In the event of an exception on ``delete()``, it will
retry this many times before returning::

    from dogpile.cache.proxy import ProxyBackend

    class RetryDeleteProxy(ProxyBackend):
        def __init__(self, retry_count=5):
            super(RetryDeleteProxy, self).__init__()
            self.retry_count = retry_count

        def delete(self, key):
            retries = self.retry_count
            while retries > 0:
                retries -= 1
                try:
                    self.proxied.delete(key)
                    return
                except Exception:
                    pass

The ``wrap`` parameter of :meth:`.CacheRegion.configure` accepts a list
which can contain any combination of instantiated proxy objects as well
as uninstantiated proxy classes.
Putting the two examples above together would look like this::

    from dogpile.cache import make_region

    retry_proxy = RetryDeleteProxy(5)

    region = make_region().configure(
        'dogpile.cache.pylibmc',
        expiration_time = 3600,
        arguments = {
            'url': ["127.0.0.1"],
        },
        wrap = [LoggingProxy, retry_proxy]
    )

In the above example, the ``LoggingProxy`` object would be instantiated
by the :class:`.CacheRegion` and applied to wrap requests on behalf of
the ``retry_proxy`` instance; that proxy in turn wraps requests on
behalf of the original ``dogpile.cache.pylibmc`` backend.

.. versionadded:: 0.4.4
    Added support for the :class:`.ProxyBackend` class.

Recipes
=======

Invalidating a group of related keys
------------------------------------

This recipe presents a way to track the cache keys related to a
particular region, for the purposes of invalidating a series of keys
that relate to a particular id.

Three cached functions, ``user_fn_one()``, ``user_fn_two()``, and
``user_fn_three()`` each perform a different function based on a
``user_id`` integer value.  The region applied to cache them uses a
custom key generator which tracks each cache key generated, pulling out
the integer "id" and replacing it with a template.

When all three functions have been called, the key generator is aware
of these three keys:  ``user_fn_one_%d``, ``user_fn_two_%d``, and
``user_fn_three_%d``.  The ``invalidate_user_id()`` function then knows
that for a particular ``user_id``, it needs to hit all three of those
keys in order to invalidate everything having to do with that id.

::

    from dogpile.cache import make_region
    from itertools import count

    user_keys = set()

    def my_key_generator(namespace, fn):
        fname = fn.__name__

        def generate_key(*arg):
            # generate a key template:
            # "fname_%d_arg1_arg2_arg3..."
            key_template = fname + "_" + \
                "%d" + \
                "_".join(str(s) for s in arg[1:])

            # store key template
            user_keys.add(key_template)

            # return cache key
            user_id = arg[0]
            return key_template % user_id

        return generate_key

    def invalidate_user_id(region, user_id):
        for key in user_keys:
            region.delete(key % user_id)

    region = make_region(
        function_key_generator=my_key_generator
    ).configure(
        "dogpile.cache.memory"
    )

    counter = count()

    @region.cache_on_arguments()
    def user_fn_one(user_id):
        return "user fn one: %d, %d" % (next(counter), user_id)

    @region.cache_on_arguments()
    def user_fn_two(user_id):
        return "user fn two: %d, %d" % (next(counter), user_id)

    @region.cache_on_arguments()
    def user_fn_three(user_id):
        return "user fn three: %d, %d" % (next(counter), user_id)

    print user_fn_one(5)
    print user_fn_two(5)
    print user_fn_three(7)
    print user_fn_two(7)

    invalidate_user_id(region, 5)
    print "invalidated:"
    print user_fn_one(5)
    print user_fn_two(5)
    print user_fn_three(7)
    print user_fn_two(7)
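The tracking idea behind this recipe can also be sketched without dogpile.cache at all.  In this stand-alone illustration a plain dictionary plays the role of the region, and the names ``make_key``, ``cached``, and ``invalidate_user`` are hypothetical helpers rather than dogpile.cache API:

```python
# Each generated key records a template such as "user_fn_one_%d";
# invalidation expands every known template for one id and deletes
# those keys.
cache = {}
key_templates = set()

def make_key(fname, user_id):
    template = "%s_%%d" % fname
    key_templates.add(template)
    return template % user_id

def cached(fname, user_id, creator):
    # get-or-create: call the creator only when the key is absent
    key = make_key(fname, user_id)
    if key not in cache:
        cache[key] = creator()
    return cache[key]

def invalidate_user(user_id):
    for template in key_templates:
        cache.pop(template % user_id, None)

cached("user_fn_one", 5, lambda: "one for 5")
cached("user_fn_two", 5, lambda: "two for 5")
invalidate_user(5)
assert "user_fn_one_5" not in cache
```

The dogpile.cache version above does the same bookkeeping inside the custom key generator, so the decorated functions never deal with keys directly.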
* */ /* -- main layout ----------------------------------------------------------- */ div.clearer { clear: both; } /* -- relbar ---------------------------------------------------------------- */ div.related { width: 100%; font-size: 90%; } div.related h3 { display: none; } div.related ul { margin: 0; padding: 0 0 0 10px; list-style: none; } div.related li { display: inline; } div.related li.right { float: right; margin-right: 5px; } /* -- sidebar --------------------------------------------------------------- */ div.sphinxsidebarwrapper { padding: 10px 5px 0 10px; } div.sphinxsidebar { float: left; width: 230px; margin-left: -100%; font-size: 90%; } div.sphinxsidebar ul { list-style: none; } div.sphinxsidebar ul ul, div.sphinxsidebar ul.want-points { margin-left: 20px; list-style: square; } div.sphinxsidebar ul ul { margin-top: 0; margin-bottom: 0; } div.sphinxsidebar form { margin-top: 10px; } div.sphinxsidebar input { border: 1px solid #98dbcc; font-family: sans-serif; font-size: 1em; } div.sphinxsidebar #searchbox input[type="text"] { width: 170px; } div.sphinxsidebar #searchbox input[type="submit"] { width: 30px; } img { border: 0; } /* -- search page ----------------------------------------------------------- */ ul.search { margin: 10px 0 0 20px; padding: 0; } ul.search li { padding: 5px 0 5px 20px; background-image: url(file.png); background-repeat: no-repeat; background-position: 0 7px; } ul.search li a { font-weight: bold; } ul.search li div.context { color: #888; margin: 2px 0 0 30px; text-align: left; } ul.keywordmatches li.goodmatch a { font-weight: bold; } /* -- index page ------------------------------------------------------------ */ table.contentstable { width: 90%; } table.contentstable p.biglink { line-height: 150%; } a.biglink { font-size: 1.3em; } span.linkdescr { font-style: italic; padding-top: 5px; font-size: 90%; } /* -- general index --------------------------------------------------------- */ table.indextable { width: 100%; } 
table.indextable td { text-align: left; vertical-align: top; } table.indextable dl, table.indextable dd { margin-top: 0; margin-bottom: 0; } table.indextable tr.pcap { height: 10px; } table.indextable tr.cap { margin-top: 10px; background-color: #f2f2f2; } img.toggler { margin-right: 3px; margin-top: 3px; cursor: pointer; } div.modindex-jumpbox { border-top: 1px solid #ddd; border-bottom: 1px solid #ddd; margin: 1em 0 1em 0; padding: 0.4em; } div.genindex-jumpbox { border-top: 1px solid #ddd; border-bottom: 1px solid #ddd; margin: 1em 0 1em 0; padding: 0.4em; } /* -- general body styles --------------------------------------------------- */ a.headerlink { visibility: hidden; } h1:hover > a.headerlink, h2:hover > a.headerlink, h3:hover > a.headerlink, h4:hover > a.headerlink, h5:hover > a.headerlink, h6:hover > a.headerlink, dt:hover > a.headerlink { visibility: visible; } div.body p.caption { text-align: inherit; } div.body td { text-align: left; } .field-list ul { padding-left: 1em; } .first { margin-top: 0 !important; } p.rubric { margin-top: 30px; font-weight: bold; } img.align-left, .figure.align-left, object.align-left { clear: left; float: left; margin-right: 1em; } img.align-right, .figure.align-right, object.align-right { clear: right; float: right; margin-left: 1em; } img.align-center, .figure.align-center, object.align-center { display: block; margin-left: auto; margin-right: auto; } .align-left { text-align: left; } .align-center { text-align: center; } .align-right { text-align: right; } /* -- sidebars -------------------------------------------------------------- */ div.sidebar { margin: 0 0 0.5em 1em; border: 1px solid #ddb; padding: 7px 7px 0 7px; background-color: #ffe; width: 40%; float: right; } p.sidebar-title { font-weight: bold; } /* -- topics ---------------------------------------------------------------- */ div.topic { border: 1px solid #ccc; padding: 7px 7px 0 7px; margin: 10px 0 10px 0; } p.topic-title { font-size: 1.1em; font-weight: 
bold; margin-top: 10px; } /* -- admonitions ----------------------------------------------------------- */ div.admonition { margin-top: 10px; margin-bottom: 10px; padding: 7px; } div.admonition dt { font-weight: bold; } div.admonition dl { margin-bottom: 0; } p.admonition-title { margin: 0px 10px 5px 0px; font-weight: bold; } div.body p.centered { text-align: center; margin-top: 25px; } /* -- tables ---------------------------------------------------------------- */ table.docutils { border: 0; border-collapse: collapse; } table.docutils td, table.docutils th { padding: 1px 8px 1px 5px; border-top: 0; border-left: 0; border-right: 0; border-bottom: 1px solid #aaa; } table.field-list td, table.field-list th { border: 0 !important; } table.footnote td, table.footnote th { border: 0 !important; } th { text-align: left; padding-right: 5px; } table.citation { border-left: solid 1px gray; margin-left: 1px; } table.citation td { border-bottom: none; } /* -- other body styles ----------------------------------------------------- */ ol.arabic { list-style: decimal; } ol.loweralpha { list-style: lower-alpha; } ol.upperalpha { list-style: upper-alpha; } ol.lowerroman { list-style: lower-roman; } ol.upperroman { list-style: upper-roman; } dl { margin-bottom: 15px; } dd p { margin-top: 0px; } dd ul, dd table { margin-bottom: 10px; } dd { margin-top: 3px; margin-bottom: 10px; margin-left: 30px; } dt:target, .highlighted { background-color: #fbe54e; } dl.glossary dt { font-weight: bold; font-size: 1.1em; } .field-list ul { margin: 0; padding-left: 1em; } .field-list p { margin: 0; } .refcount { color: #060; } .optional { font-size: 1.3em; } .versionmodified { font-style: italic; } .system-message { background-color: #fda; padding: 5px; border: 3px solid red; } .footnote:target { background-color: #ffa; } .line-block { display: block; margin-top: 1em; margin-bottom: 1em; } .line-block .line-block { margin-top: 0; margin-bottom: 0; margin-left: 1.5em; } .guilabel, .menuselection { 
font-family: sans-serif; } .accelerator { text-decoration: underline; } .classifier { font-style: oblique; } abbr, acronym { border-bottom: dotted 1px; cursor: help; } /* -- code displays --------------------------------------------------------- */ pre { overflow: auto; overflow-y: hidden; /* fixes display issues on Chrome browsers */ } td.linenos pre { padding: 5px 0px; border: 0; background-color: transparent; color: #aaa; } table.highlighttable { margin-left: 0.5em; } table.highlighttable td { padding: 0 0.5em 0 0.5em; } tt.descname { background-color: transparent; font-weight: bold; font-size: 1.2em; } tt.descclassname { background-color: transparent; } tt.xref, a tt { background-color: transparent; font-weight: bold; } h1 tt, h2 tt, h3 tt, h4 tt, h5 tt, h6 tt { background-color: transparent; } .viewcode-link { float: right; } .viewcode-back { float: right; font-family: sans-serif; } div.viewcode-block:target { margin: -1px -10px; padding: 0 10px; } /* -- math display ---------------------------------------------------------- */ img.math { vertical-align: middle; } div.body div.math p { text-align: center; } span.eqno { float: right; } /* -- printout stylesheet --------------------------------------------------- */ @media print { div.document, div.documentwrapper, div.bodywrapper { margin: 0 !important; width: 100%; } div.sphinxsidebar, div.related, div.footer, #top-link { display: none; } }dogpile.cache-0.5.1/docs/_static/comment-bright.png0000644000076500000240000000665412167630573022776 0ustar classicstaff00000000000000PNG  IHDRa OiCCPPhotoshop ICC profilexڝSgTS=BKKoR RB&*! J!QEEȠQ, !{kּ> H3Q5 B.@ $pd!s#~<<+"x M0B\t8K@zB@F&S`cbP-`'{[! 
[binary PNG image data omitted]
[binary PNG image data omitted]dogpile.cache-0.5.1/docs/_static/doctools.js0000644000076500000240000001473412167630573021523 0ustar classicstaff00000000000000/* * doctools.js * ~~~~~~~~~~~ * * Sphinx JavaScript utilities for all documentation. * * :copyright: Copyright 2007-2013 by the Sphinx team, see AUTHORS. * :license: BSD, see LICENSE for details.
* */ /** * select a different prefix for underscore */ $u = _.noConflict(); /** * make the code below compatible with browsers without * an installed firebug like debugger if (!window.console || !console.firebug) { var names = ["log", "debug", "info", "warn", "error", "assert", "dir", "dirxml", "group", "groupEnd", "time", "timeEnd", "count", "trace", "profile", "profileEnd"]; window.console = {}; for (var i = 0; i < names.length; ++i) window.console[names[i]] = function() {}; } */ /** * small helper function to urldecode strings */ jQuery.urldecode = function(x) { return decodeURIComponent(x).replace(/\+/g, ' '); }; /** * small helper function to urlencode strings */ jQuery.urlencode = encodeURIComponent; /** * This function returns the parsed url parameters of the * current request. Multiple values per key are supported, * it will always return arrays of strings for the value parts. */ jQuery.getQueryParameters = function(s) { if (typeof s == 'undefined') s = document.location.search; var parts = s.substr(s.indexOf('?') + 1).split('&'); var result = {}; for (var i = 0; i < parts.length; i++) { var tmp = parts[i].split('=', 2); var key = jQuery.urldecode(tmp[0]); var value = jQuery.urldecode(tmp[1]); if (key in result) result[key].push(value); else result[key] = [value]; } return result; }; /** * highlight a given string on a jquery object by wrapping it in * span elements with the given class name. 
*/ jQuery.fn.highlightText = function(text, className) { function highlight(node) { if (node.nodeType == 3) { var val = node.nodeValue; var pos = val.toLowerCase().indexOf(text); if (pos >= 0 && !jQuery(node.parentNode).hasClass(className)) { var span = document.createElement("span"); span.className = className; span.appendChild(document.createTextNode(val.substr(pos, text.length))); node.parentNode.insertBefore(span, node.parentNode.insertBefore( document.createTextNode(val.substr(pos + text.length)), node.nextSibling)); node.nodeValue = val.substr(0, pos); } } else if (!jQuery(node).is("button, select, textarea")) { jQuery.each(node.childNodes, function() { highlight(this); }); } } return this.each(function() { highlight(this); }); }; /** * Small JavaScript module for the documentation. */ var Documentation = { init : function() { this.fixFirefoxAnchorBug(); this.highlightSearchWords(); this.initIndexTable(); }, /** * i18n support */ TRANSLATIONS : {}, PLURAL_EXPR : function(n) { return n == 1 ? 0 : 1; }, LOCALE : 'unknown', /* gettext and ngettext don't access this so that the functions can safely be bound to a different name (_ = Documentation.gettext) */ gettext : function(string) { var translated = Documentation.TRANSLATIONS[string]; if (typeof translated == 'undefined') return string; return (typeof translated == 'string') ? translated : translated[0]; }, ngettext : function(singular, plural, n) { var translated = Documentation.TRANSLATIONS[singular]; if (typeof translated == 'undefined') return (n == 1) ? singular : plural; return translated[Documentation.PLURAL_EXPR(n)]; }, addTranslations : function(catalog) { for (var key in catalog.messages) this.TRANSLATIONS[key] = catalog.messages[key]; this.PLURAL_EXPR = new Function('n', 'return +(' + catalog.plural_expr + ')'); this.LOCALE = catalog.locale; }, /** * add context elements like header anchor links */ addContextElements : function() { $('div[id] > :header:first').each(function() { $('\u00B6').
attr('href', '#' + this.id). attr('title', _('Permalink to this headline')). appendTo(this); }); $('dt[id]').each(function() { $('\u00B6'). attr('href', '#' + this.id). attr('title', _('Permalink to this definition')). appendTo(this); }); }, /** * workaround a firefox stupidity */ fixFirefoxAnchorBug : function() { if (document.location.hash && $.browser.mozilla) window.setTimeout(function() { document.location.href += ''; }, 10); }, /** * highlight the search words provided in the url in the text */ highlightSearchWords : function() { var params = $.getQueryParameters(); var terms = (params.highlight) ? params.highlight[0].split(/\s+/) : []; if (terms.length) { var body = $('div.body'); window.setTimeout(function() { $.each(terms, function() { body.highlightText(this.toLowerCase(), 'highlighted'); }); }, 10); $('') .appendTo($('#searchbox')); } }, /** * init the domain index toggle buttons */ initIndexTable : function() { var togglers = $('img.toggler').click(function() { var src = $(this).attr('src'); var idnum = $(this).attr('id').substr(7); $('tr.cg-' + idnum).toggle(); if (src.substr(-9) == 'minus.png') $(this).attr('src', src.substr(0, src.length-9) + 'plus.png'); else $(this).attr('src', src.substr(0, src.length-8) + 'minus.png'); }).css('display', ''); if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) { togglers.click(); } }, /** * helper function to hide the search marks again */ hideSearchWords : function() { $('#searchbox .highlight-link').fadeOut(300); $('span.highlighted').removeClass('highlighted'); }, /** * make the url absolute */ makeURL : function(relativeURL) { return DOCUMENTATION_OPTIONS.URL_ROOT + '/' + relativeURL; }, /** * get the current relative url */ getCurrentURL : function() { var path = document.location.pathname; var parts = path.split(/\//); $.each(DOCUMENTATION_OPTIONS.URL_ROOT.split(/\//), function() { if (this == '..') parts.pop(); }); var url = parts.join('/'); return path.substring(url.lastIndexOf('/') + 1, path.length - 1); } }; // 
/* quick alias for translations */ _ = Documentation.gettext; $(document).ready(function() { Documentation.init(); }); dogpile.cache-0.5.1/docs/_static/down-pressed.png0000644000076500000240000000056012167630573022457 0ustar classicstaff00000000000000[binary PNG image data omitted]dogpile.cache-0.5.1/docs/_static/down.png0000644000076500000240000000055312167630573021016 0ustar classicstaff00000000000000[binary PNG image data omitted]dogpile.cache-0.5.1/docs/_static/file.png0000644000076500000240000000061012167630573020760 0ustar classicstaff00000000000000[binary PNG image data omitted]dogpile.cache-0.5.1/docs/_static/jquery.js0000644000076500000240000026725412167630573021223 0ustar classicstaff00000000000000/*!
jQuery v1.7.1 jquery.com | jquery.org/license */ (function(a,b){function cy(a){return f.isWindow(a)?a:a.nodeType===9?a.defaultView||a.parentWindow:!1}function cv(a){if(!ck[a]){var b=c.body,d=f("<"+a+">").appendTo(b),e=d.css("display");d.remove();if(e==="none"||e===""){cl||(cl=c.createElement("iframe"),cl.frameBorder=cl.width=cl.height=0),b.appendChild(cl);if(!cm||!cl.createElement)cm=(cl.contentWindow||cl.contentDocument).document,cm.write((c.compatMode==="CSS1Compat"?"":"")+""),cm.close();d=cm.createElement(a),cm.body.appendChild(d),e=f.css(d,"display"),b.removeChild(cl)}ck[a]=e}return ck[a]}function cu(a,b){var c={};f.each(cq.concat.apply([],cq.slice(0,b)),function(){c[this]=a});return c}function ct(){cr=b}function cs(){setTimeout(ct,0);return cr=f.now()}function cj(){try{return new a.ActiveXObject("Microsoft.XMLHTTP")}catch(b){}}function ci(){try{return new a.XMLHttpRequest}catch(b){}}function cc(a,c){a.dataFilter&&(c=a.dataFilter(c,a.dataType));var d=a.dataTypes,e={},g,h,i=d.length,j,k=d[0],l,m,n,o,p;for(g=1;g0){if(c!=="border")for(;g=0===c})}function S(a){return!a||!a.parentNode||a.parentNode.nodeType===11}function K(){return!0}function J(){return!1}function n(a,b,c){var d=b+"defer",e=b+"queue",g=b+"mark",h=f._data(a,d);h&&(c==="queue"||!f._data(a,e))&&(c==="mark"||!f._data(a,g))&&setTimeout(function(){!f._data(a,e)&&!f._data(a,g)&&(f.removeData(a,d,!0),h.fire())},0)}function m(a){for(var b in a){if(b==="data"&&f.isEmptyObject(a[b]))continue;if(b!=="toJSON")return!1}return!0}function l(a,c,d){if(d===b&&a.nodeType===1){var e="data-"+c.replace(k,"-$1").toLowerCase();d=a.getAttribute(e);if(typeof d=="string"){try{d=d==="true"?!0:d==="false"?!1:d==="null"?null:f.isNumeric(d)?parseFloat(d):j.test(d)?f.parseJSON(d):d}catch(g){}f.data(a,c,d)}else d=b}return d}function h(a){var 
b=g[a]={},c,d;a=a.split(/\s+/);for(c=0,d=a.length;c)[^>]*$|#([\w\-]*)$)/,j=/\S/,k=/^\s+/,l=/\s+$/,m=/^<(\w+)\s*\/?>(?:<\/\1>)?$/,n=/^[\],:{}\s]*$/,o=/\\(?:["\\\/bfnrt]|u[0-9a-fA-F]{4})/g,p=/"[^"\\\n\r]*"|true|false|null|-?\d+(?:\.\d*)?(?:[eE][+\-]?\d+)?/g,q=/(?:^|:|,)(?:\s*\[)+/g,r=/(webkit)[ \/]([\w.]+)/,s=/(opera)(?:.*version)?[ \/]([\w.]+)/,t=/(msie) ([\w.]+)/,u=/(mozilla)(?:.*? rv:([\w.]+))?/,v=/-([a-z]|[0-9])/ig,w=/^-ms-/,x=function(a,b){return(b+"").toUpperCase()},y=d.userAgent,z,A,B,C=Object.prototype.toString,D=Object.prototype.hasOwnProperty,E=Array.prototype.push,F=Array.prototype.slice,G=String.prototype.trim,H=Array.prototype.indexOf,I={};e.fn=e.prototype={constructor:e,init:function(a,d,f){var g,h,j,k;if(!a)return this;if(a.nodeType){this.context=this[0]=a,this.length=1;return this}if(a==="body"&&!d&&c.body){this.context=c,this[0]=c.body,this.selector=a,this.length=1;return this}if(typeof a=="string"){a.charAt(0)!=="<"||a.charAt(a.length-1)!==">"||a.length<3?g=i.exec(a):g=[null,a,null];if(g&&(g[1]||!d)){if(g[1]){d=d instanceof e?d[0]:d,k=d?d.ownerDocument||d:c,j=m.exec(a),j?e.isPlainObject(d)?(a=[c.createElement(j[1])],e.fn.attr.call(a,d,!0)):a=[k.createElement(j[1])]:(j=e.buildFragment([g[1]],[k]),a=(j.cacheable?e.clone(j.fragment):j.fragment).childNodes);return e.merge(this,a)}h=c.getElementById(g[2]);if(h&&h.parentNode){if(h.id!==g[2])return f.find(a);this.length=1,this[0]=h}this.context=c,this.selector=a;return this}return!d||d.jquery?(d||f).find(a):this.constructor(d).find(a)}if(e.isFunction(a))return f.ready(a);a.selector!==b&&(this.selector=a.selector,this.context=a.context);return e.makeArray(a,this)},selector:"",jquery:"1.7.1",length:0,size:function(){return this.length},toArray:function(){return F.call(this,0)},get:function(a){return a==null?this.toArray():a<0?this[this.length+a]:this[a]},pushStack:function(a,b,c){var 
d=this.constructor();e.isArray(a)?E.apply(d,a):e.merge(d,a),d.prevObject=this,d.context=this.context,b==="find"?d.selector=this.selector+(this.selector?" ":"")+c:b&&(d.selector=this.selector+"."+b+"("+c+")");return d},each:function(a,b){return e.each(this,a,b)},ready:function(a){e.bindReady(),A.add(a);return this},eq:function(a){a=+a;return a===-1?this.slice(a):this.slice(a,a+1)},first:function(){return this.eq(0)},last:function(){return this.eq(-1)},slice:function(){return this.pushStack(F.apply(this,arguments),"slice",F.call(arguments).join(","))},map:function(a){return this.pushStack(e.map(this,function(b,c){return a.call(b,c,b)}))},end:function(){return this.prevObject||this.constructor(null)},push:E,sort:[].sort,splice:[].splice},e.fn.init.prototype=e.fn,e.extend=e.fn.extend=function(){var a,c,d,f,g,h,i=arguments[0]||{},j=1,k=arguments.length,l=!1;typeof i=="boolean"&&(l=i,i=arguments[1]||{},j=2),typeof i!="object"&&!e.isFunction(i)&&(i={}),k===j&&(i=this,--j);for(;j0)return;A.fireWith(c,[e]),e.fn.trigger&&e(c).trigger("ready").off("ready")}},bindReady:function(){if(!A){A=e.Callbacks("once memory");if(c.readyState==="complete")return setTimeout(e.ready,1);if(c.addEventListener)c.addEventListener("DOMContentLoaded",B,!1),a.addEventListener("load",e.ready,!1);else if(c.attachEvent){c.attachEvent("onreadystatechange",B),a.attachEvent("onload",e.ready);var b=!1;try{b=a.frameElement==null}catch(d){}c.documentElement.doScroll&&b&&J()}}},isFunction:function(a){return e.type(a)==="function"},isArray:Array.isArray||function(a){return e.type(a)==="array"},isWindow:function(a){return a&&typeof a=="object"&&"setInterval"in a},isNumeric:function(a){return!isNaN(parseFloat(a))&&isFinite(a)},type:function(a){return a==null?String(a):I[C.call(a)]||"object"},isPlainObject:function(a){if(!a||e.type(a)!=="object"||a.nodeType||e.isWindow(a))return!1;try{if(a.constructor&&!D.call(a,"constructor")&&!D.call(a.constructor.prototype,"isPrototypeOf"))return!1}catch(c){return!1}var 
d;for(d in a);return d===b||D.call(a,d)},isEmptyObject:function(a){for(var b in a)return!1;return!0},error:function(a){throw new Error(a)},parseJSON:function(b){if(typeof b!="string"||!b)return null;b=e.trim(b);if(a.JSON&&a.JSON.parse)return a.JSON.parse(b);if(n.test(b.replace(o,"@").replace(p,"]").replace(q,"")))return(new Function("return "+b))();e.error("Invalid JSON: "+b)},parseXML:function(c){var d,f;try{a.DOMParser?(f=new DOMParser,d=f.parseFromString(c,"text/xml")):(d=new ActiveXObject("Microsoft.XMLDOM"),d.async="false",d.loadXML(c))}catch(g){d=b}(!d||!d.documentElement||d.getElementsByTagName("parsererror").length)&&e.error("Invalid XML: "+c);return d},noop:function(){},globalEval:function(b){b&&j.test(b)&&(a.execScript||function(b){a.eval.call(a,b)})(b)},camelCase:function(a){return a.replace(w,"ms-").replace(v,x)},nodeName:function(a,b){return a.nodeName&&a.nodeName.toUpperCase()===b.toUpperCase()},each:function(a,c,d){var f,g=0,h=a.length,i=h===b||e.isFunction(a);if(d){if(i){for(f in a)if(c.apply(a[f],d)===!1)break}else for(;g0&&a[0]&&a[j-1]||j===0||e.isArray(a));if(k)for(;i1?i.call(arguments,0):b,j.notifyWith(k,e)}}function l(a){return function(c){b[a]=arguments.length>1?i.call(arguments,0):c,--g||j.resolveWith(j,b)}}var b=i.call(arguments,0),c=0,d=b.length,e=Array(d),g=d,h=d,j=d<=1&&a&&f.isFunction(a.promise)?a:f.Deferred(),k=j.promise();if(d>1){for(;c
a",d=q.getElementsByTagName("*"),e=q.getElementsByTagName("a")[0];if(!d||!d.length||!e)return{};g=c.createElement("select"),h=g.appendChild(c.createElement("option")),i=q.getElementsByTagName("input")[0],b={leadingWhitespace:q.firstChild.nodeType===3,tbody:!q.getElementsByTagName("tbody").length,htmlSerialize:!!q.getElementsByTagName("link").length,style:/top/.test(e.getAttribute("style")),hrefNormalized:e.getAttribute("href")==="/a",opacity:/^0.55/.test(e.style.opacity),cssFloat:!!e.style.cssFloat,checkOn:i.value==="on",optSelected:h.selected,getSetAttribute:q.className!=="t",enctype:!!c.createElement("form").enctype,html5Clone:c.createElement("nav").cloneNode(!0).outerHTML!=="<:nav>",submitBubbles:!0,changeBubbles:!0,focusinBubbles:!1,deleteExpando:!0,noCloneEvent:!0,inlineBlockNeedsLayout:!1,shrinkWrapBlocks:!1,reliableMarginRight:!0},i.checked=!0,b.noCloneChecked=i.cloneNode(!0).checked,g.disabled=!0,b.optDisabled=!h.disabled;try{delete q.test}catch(s){b.deleteExpando=!1}!q.addEventListener&&q.attachEvent&&q.fireEvent&&(q.attachEvent("onclick",function(){b.noCloneEvent=!1}),q.cloneNode(!0).fireEvent("onclick")),i=c.createElement("input"),i.value="t",i.setAttribute("type","radio"),b.radioValue=i.value==="t",i.setAttribute("checked","checked"),q.appendChild(i),k=c.createDocumentFragment(),k.appendChild(q.lastChild),b.checkClone=k.cloneNode(!0).cloneNode(!0).lastChild.checked,b.appendChecked=i.checked,k.removeChild(i),k.appendChild(q),q.innerHTML="",a.getComputedStyle&&(j=c.createElement("div"),j.style.width="0",j.style.marginRight="0",q.style.width="2px",q.appendChild(j),b.reliableMarginRight=(parseInt((a.getComputedStyle(j,null)||{marginRight:0}).marginRight,10)||0)===0);if(q.attachEvent)for(o in{submit:1,change:1,focusin:1})n="on"+o,p=n in q,p||(q.setAttribute(n,"return;"),p=typeof q[n]=="function"),b[o+"Bubbles"]=p;k.removeChild(q),k=g=h=j=q=i=null,f(function(){var 
a,d,e,g,h,i,j,k,m,n,o,r=c.getElementsByTagName("body")[0];!r||(j=1,k="position:absolute;top:0;left:0;width:1px;height:1px;margin:0;",m="visibility:hidden;border:0;",n="style='"+k+"border:5px solid #000;padding:0;'",o="
"+""+"
",a=c.createElement("div"),a.style.cssText=m+"width:0;height:0;position:static;top:0;margin-top:"+j+"px",r.insertBefore(a,r.firstChild),q=c.createElement("div"),a.appendChild(q),q.innerHTML="
t
",l=q.getElementsByTagName("td"),p=l[0].offsetHeight===0,l[0].style.display="",l[1].style.display="none",b.reliableHiddenOffsets=p&&l[0].offsetHeight===0,q.innerHTML="",q.style.width=q.style.paddingLeft="1px",f.boxModel=b.boxModel=q.offsetWidth===2,typeof q.style.zoom!="undefined"&&(q.style.display="inline",q.style.zoom=1,b.inlineBlockNeedsLayout=q.offsetWidth===2,q.style.display="",q.innerHTML="
",b.shrinkWrapBlocks=q.offsetWidth!==2),q.style.cssText=k+m,q.innerHTML=o,d=q.firstChild,e=d.firstChild,h=d.nextSibling.firstChild.firstChild,i={doesNotAddBorder:e.offsetTop!==5,doesAddBorderForTableAndCells:h.offsetTop===5},e.style.position="fixed",e.style.top="20px",i.fixedPosition=e.offsetTop===20||e.offsetTop===15,e.style.position=e.style.top="",d.style.overflow="hidden",d.style.position="relative",i.subtractsBorderForOverflowNotVisible=e.offsetTop===-5,i.doesNotIncludeMarginInBodyOffset=r.offsetTop!==j,r.removeChild(a),q=a=null,f.extend(b,i))});return b}();var j=/^(?:\{.*\}|\[.*\])$/,k=/([A-Z])/g;f.extend({cache:{},uuid:0,expando:"jQuery"+(f.fn.jquery+Math.random()).replace(/\D/g,""),noData:{embed:!0,object:"clsid:D27CDB6E-AE6D-11cf-96B8-444553540000",applet:!0},hasData:function(a){a=a.nodeType?f.cache[a[f.expando]]:a[f.expando];return!!a&&!m(a)},data:function(a,c,d,e){if(!!f.acceptData(a)){var g,h,i,j=f.expando,k=typeof c=="string",l=a.nodeType,m=l?f.cache:a,n=l?a[j]:a[j]&&j,o=c==="events";if((!n||!m[n]||!o&&!e&&!m[n].data)&&k&&d===b)return;n||(l?a[j]=n=++f.uuid:n=j),m[n]||(m[n]={},l||(m[n].toJSON=f.noop));if(typeof c=="object"||typeof c=="function")e?m[n]=f.extend(m[n],c):m[n].data=f.extend(m[n].data,c);g=h=m[n],e||(h.data||(h.data={}),h=h.data),d!==b&&(h[f.camelCase(c)]=d);if(o&&!h[c])return g.events;k?(i=h[c],i==null&&(i=h[f.camelCase(c)])):i=h;return i}},removeData:function(a,b,c){if(!!f.acceptData(a)){var d,e,g,h=f.expando,i=a.nodeType,j=i?f.cache:a,k=i?a[h]:h;if(!j[k])return;if(b){d=c?j[k]:j[k].data;if(d){f.isArray(b)||(b in d?b=[b]:(b=f.camelCase(b),b in d?b=[b]:b=b.split(" ")));for(e=0,g=b.length;e-1)return!0;return!1},val:function(a){var c,d,e,g=this[0];{if(!!arguments.length){e=f.isFunction(a);return this.each(function(d){var g=f(this),h;if(this.nodeType===1){e?h=a.call(this,d,g.val()):h=a,h==null?h="":typeof h=="number"?h+="":f.isArray(h)&&(h=f.map(h,function(a){return 
a==null?"":a+""})),c=f.valHooks[this.nodeName.toLowerCase()]||f.valHooks[this.type];if(!c||!("set"in c)||c.set(this,h,"value")===b)this.value=h}})}if(g){c=f.valHooks[g.nodeName.toLowerCase()]||f.valHooks[g.type];if(c&&"get"in c&&(d=c.get(g,"value"))!==b)return d;d=g.value;return typeof d=="string"?d.replace(q,""):d==null?"":d}}}}),f.extend({valHooks:{option:{get:function(a){var b=a.attributes.value;return!b||b.specified?a.value:a.text}},select:{get:function(a){var b,c,d,e,g=a.selectedIndex,h=[],i=a.options,j=a.type==="select-one";if(g<0)return null;c=j?g:0,d=j?g+1:i.length;for(;c=0}),c.length||(a.selectedIndex=-1);return c}}},attrFn:{val:!0,css:!0,html:!0,text:!0,data:!0,width:!0,height:!0,offset:!0},attr:function(a,c,d,e){var g,h,i,j=a.nodeType;if(!!a&&j!==3&&j!==8&&j!==2){if(e&&c in f.attrFn)return f(a)[c](d);if(typeof a.getAttribute=="undefined")return f.prop(a,c,d);i=j!==1||!f.isXMLDoc(a),i&&(c=c.toLowerCase(),h=f.attrHooks[c]||(u.test(c)?x:w));if(d!==b){if(d===null){f.removeAttr(a,c);return}if(h&&"set"in h&&i&&(g=h.set(a,d,c))!==b)return g;a.setAttribute(c,""+d);return d}if(h&&"get"in h&&i&&(g=h.get(a,c))!==null)return g;g=a.getAttribute(c);return g===null?b:g}},removeAttr:function(a,b){var c,d,e,g,h=0;if(b&&a.nodeType===1){d=b.toLowerCase().split(p),g=d.length;for(;h=0}})});var z=/^(?:textarea|input|select)$/i,A=/^([^\.]*)?(?:\.(.+))?$/,B=/\bhover(\.\S+)?\b/,C=/^key/,D=/^(?:mouse|contextmenu)|click/,E=/^(?:focusinfocus|focusoutblur)$/,F=/^(\w*)(?:#([\w\-]+))?(?:\.([\w\-]+))?$/,G=function(a){var b=F.exec(a);b&&(b[1]=(b[1]||"").toLowerCase(),b[3]=b[3]&&new RegExp("(?:^|\\s)"+b[3]+"(?:\\s|$)"));return b},H=function(a,b){var c=a.attributes||{};return(!b[1]||a.nodeName.toLowerCase()===b[1])&&(!b[2]||(c.id||{}).value===b[2])&&(!b[3]||b[3].test((c["class"]||{}).value))},I=function(a){return f.event.special.hover?a:a.replace(B,"mouseenter$1 mouseleave$1")}; f.event={add:function(a,c,d,e,g){var 
h,i,j,k,l,m,n,o,p,q,r,s;if(!(a.nodeType===3||a.nodeType===8||!c||!d||!(h=f._data(a)))){d.handler&&(p=d,d=p.handler),d.guid||(d.guid=f.guid++),j=h.events,j||(h.events=j={}),i=h.handle,i||(h.handle=i=function(a){return typeof f!="undefined"&&(!a||f.event.triggered!==a.type)?f.event.dispatch.apply(i.elem,arguments):b},i.elem=a),c=f.trim(I(c)).split(" ");for(k=0;k=0&&(h=h.slice(0,-1),k=!0),h.indexOf(".")>=0&&(i=h.split("."),h=i.shift(),i.sort());if((!e||f.event.customEvent[h])&&!f.event.global[h])return;c=typeof c=="object"?c[f.expando]?c:new f.Event(h,c):new f.Event(h),c.type=h,c.isTrigger=!0,c.exclusive=k,c.namespace=i.join("."),c.namespace_re=c.namespace?new RegExp("(^|\\.)"+i.join("\\.(?:.*\\.)?")+"(\\.|$)"):null,o=h.indexOf(":")<0?"on"+h:"";if(!e){j=f.cache;for(l in j)j[l].events&&j[l].events[h]&&f.event.trigger(c,d,j[l].handle.elem,!0);return}c.result=b,c.target||(c.target=e),d=d!=null?f.makeArray(d):[],d.unshift(c),p=f.event.special[h]||{};if(p.trigger&&p.trigger.apply(e,d)===!1)return;r=[[e,p.bindType||h]];if(!g&&!p.noBubble&&!f.isWindow(e)){s=p.delegateType||h,m=E.test(s+h)?e:e.parentNode,n=null;for(;m;m=m.parentNode)r.push([m,s]),n=m;n&&n===e.ownerDocument&&r.push([n.defaultView||n.parentWindow||a,s])}for(l=0;le&&i.push({elem:this,matches:d.slice(e)});for(j=0;j0?this.on(b,null,a,c):this.trigger(b)},f.attrFn&&(f.attrFn[b]=!0),C.test(b)&&(f.event.fixHooks[b]=f.event.keyHooks),D.test(b)&&(f.event.fixHooks[b]=f.event.mouseHooks)}),function(){function x(a,b,c,e,f,g){for(var h=0,i=e.length;h0){k=j;break}}j=j[a]}e[h]=k}}}function w(a,b,c,e,f,g){for(var h=0,i=e.length;h+~,(\[\\]+)+|[>+~])(\s*,\s*)?((?:.|\r|\n)*)/g,d="sizcache"+(Math.random()+"").replace(".",""),e=0,g=Object.prototype.toString,h=!1,i=!0,j=/\\/g,k=/\r\n/g,l=/\W/;[0,0].sort(function(){i=!1;return 0});var m=function(b,d,e,f){e=e||[],d=d||c;var h=d;if(d.nodeType!==1&&d.nodeType!==9)return[];if(!b||typeof b!="string")return e;var 
i,j,k,l,n,q,r,t,u=!0,v=m.isXML(d),w=[],x=b;do{a.exec(""),i=a.exec(x);if(i){x=i[3],w.push(i[1]);if(i[2]){l=i[3];break}}}while(i);if(w.length>1&&p.exec(b))if(w.length===2&&o.relative[w[0]])j=y(w[0]+w[1],d,f);else{j=o.relative[w[0]]?[d]:m(w.shift(),d);while(w.length)b=w.shift(),o.relative[b]&&(b+=w.shift()),j=y(b,j,f)}else{!f&&w.length>1&&d.nodeType===9&&!v&&o.match.ID.test(w[0])&&!o.match.ID.test(w[w.length-1])&&(n=m.find(w.shift(),d,v),d=n.expr?m.filter(n.expr,n.set)[0]:n.set[0]);if(d){n=f?{expr:w.pop(),set:s(f)}:m.find(w.pop(),w.length===1&&(w[0]==="~"||w[0]==="+")&&d.parentNode?d.parentNode:d,v),j=n.expr?m.filter(n.expr,n.set):n.set,w.length>0?k=s(j):u=!1;while(w.length)q=w.pop(),r=q,o.relative[q]?r=w.pop():q="",r==null&&(r=d),o.relative[q](k,r,v)}else k=w=[]}k||(k=j),k||m.error(q||b);if(g.call(k)==="[object Array]")if(!u)e.push.apply(e,k);else if(d&&d.nodeType===1)for(t=0;k[t]!=null;t++)k[t]&&(k[t]===!0||k[t].nodeType===1&&m.contains(d,k[t]))&&e.push(j[t]);else for(t=0;k[t]!=null;t++)k[t]&&k[t].nodeType===1&&e.push(j[t]);else s(k,e);l&&(m(l,h,e,f),m.uniqueSort(e));return e};m.uniqueSort=function(a){if(u){h=i,a.sort(u);if(h)for(var b=1;b0},m.find=function(a,b,c){var d,e,f,g,h,i;if(!a)return[];for(e=0,f=o.order.length;e":function(a,b){var c,d=typeof b=="string",e=0,f=a.length;if(d&&!l.test(b)){b=b.toLowerCase();for(;e=0)?c||d.push(h):c&&(b[g]=!1));return!1},ID:function(a){return a[1].replace(j,"")},TAG:function(a,b){return a[1].replace(j,"").toLowerCase()},CHILD:function(a){if(a[1]==="nth"){a[2]||m.error(a[0]),a[2]=a[2].replace(/^\+|\s*/g,"");var b=/(-?)(\d*)(?:n([+\-]?\d*))?/.exec(a[2]==="even"&&"2n"||a[2]==="odd"&&"2n+1"||!/\D/.test(a[2])&&"0n+"+a[2]||a[2]);a[2]=b[1]+(b[2]||1)-0,a[3]=b[3]-0}else a[2]&&m.error(a[0]);a[0]=e++;return a},ATTR:function(a,b,c,d,e,f){var g=a[1]=a[1].replace(j,"");!f&&o.attrMap[g]&&(a[1]=o.attrMap[g]),a[4]=(a[4]||a[5]||"").replace(j,""),a[2]==="~="&&(a[4]=" "+a[4]+" ");return 
a},PSEUDO:function(b,c,d,e,f){if(b[1]==="not")if((a.exec(b[3])||"").length>1||/^\w/.test(b[3]))b[3]=m(b[3],null,null,c);else{var g=m.filter(b[3],c,d,!0^f);d||e.push.apply(e,g);return!1}else if(o.match.POS.test(b[0])||o.match.CHILD.test(b[0]))return!0;return b},POS:function(a){a.unshift(!0);return a}},filters:{enabled:function(a){return a.disabled===!1&&a.type!=="hidden"},disabled:function(a){return a.disabled===!0},checked:function(a){return a.checked===!0},selected:function(a){a.parentNode&&a.parentNode.selectedIndex;return a.selected===!0},parent:function(a){return!!a.firstChild},empty:function(a){return!a.firstChild},has:function(a,b,c){return!!m(c[3],a).length},header:function(a){return/h\d/i.test(a.nodeName)},text:function(a){var b=a.getAttribute("type"),c=a.type;return a.nodeName.toLowerCase()==="input"&&"text"===c&&(b===c||b===null)},radio:function(a){return a.nodeName.toLowerCase()==="input"&&"radio"===a.type},checkbox:function(a){return a.nodeName.toLowerCase()==="input"&&"checkbox"===a.type},file:function(a){return a.nodeName.toLowerCase()==="input"&&"file"===a.type},password:function(a){return a.nodeName.toLowerCase()==="input"&&"password"===a.type},submit:function(a){var b=a.nodeName.toLowerCase();return(b==="input"||b==="button")&&"submit"===a.type},image:function(a){return a.nodeName.toLowerCase()==="input"&&"image"===a.type},reset:function(a){var b=a.nodeName.toLowerCase();return(b==="input"||b==="button")&&"reset"===a.type},button:function(a){var b=a.nodeName.toLowerCase();return b==="input"&&"button"===a.type||b==="button"},input:function(a){return/input|select|textarea|button/i.test(a.nodeName)},focus:function(a){return a===a.ownerDocument.activeElement}},setFilters:{first:function(a,b){return b===0},last:function(a,b,c,d){return b===d.length-1},even:function(a,b){return b%2===0},odd:function(a,b){return b%2===1},lt:function(a,b,c){return bc[3]-0},nth:function(a,b,c){return c[3]-0===b},eq:function(a,b,c){return 
c[3]-0===b}},filter:{PSEUDO:function(a,b,c,d){var e=b[1],f=o.filters[e];if(f)return f(a,c,b,d);if(e==="contains")return(a.textContent||a.innerText||n([a])||"").indexOf(b[3])>=0;if(e==="not"){var g=b[3];for(var h=0,i=g.length;h=0}},ID:function(a,b){return a.nodeType===1&&a.getAttribute("id")===b},TAG:function(a,b){return b==="*"&&a.nodeType===1||!!a.nodeName&&a.nodeName.toLowerCase()===b},CLASS:function(a,b){return(" "+(a.className||a.getAttribute("class"))+" ").indexOf(b)>-1},ATTR:function(a,b){var c=b[1],d=m.attr?m.attr(a,c):o.attrHandle[c]?o.attrHandle[c](a):a[c]!=null?a[c]:a.getAttribute(c),e=d+"",f=b[2],g=b[4];return d==null?f==="!=":!f&&m.attr?d!=null:f==="="?e===g:f==="*="?e.indexOf(g)>=0:f==="~="?(" "+e+" ").indexOf(g)>=0:g?f==="!="?e!==g:f==="^="?e.indexOf(g)===0:f==="$="?e.substr(e.length-g.length)===g:f==="|="?e===g||e.substr(0,g.length+1)===g+"-":!1:e&&d!==!1},POS:function(a,b,c,d){var e=b[2],f=o.setFilters[e];if(f)return f(a,c,b,d)}}},p=o.match.POS,q=function(a,b){return"\\"+(b-0+1)};for(var r in o.match)o.match[r]=new RegExp(o.match[r].source+/(?![^\[]*\])(?![^\(]*\))/.source),o.leftMatch[r]=new RegExp(/(^(?:.|\r|\n)*?)/.source+o.match[r].source.replace(/\\(\d+)/g,q));var s=function(a,b){a=Array.prototype.slice.call(a,0);if(b){b.push.apply(b,a);return b}return a};try{Array.prototype.slice.call(c.documentElement.childNodes,0)[0].nodeType}catch(t){s=function(a,b){var c=0,d=b||[];if(g.call(a)==="[object Array]")Array.prototype.push.apply(d,a);else if(typeof a.length=="number")for(var e=a.length;c",e.insertBefore(a,e.firstChild),c.getElementById(d)&&(o.find.ID=function(a,c,d){if(typeof c.getElementById!="undefined"&&!d){var e=c.getElementById(a[1]);return e?e.id===a[1]||typeof e.getAttributeNode!="undefined"&&e.getAttributeNode("id").nodeValue===a[1]?[e]:b:[]}},o.filter.ID=function(a,b){var c=typeof a.getAttributeNode!="undefined"&&a.getAttributeNode("id");return a.nodeType===1&&c&&c.nodeValue===b}),e.removeChild(a),e=a=null}(),function(){var 
a=c.createElement("div");a.appendChild(c.createComment("")),a.getElementsByTagName("*").length>0&&(o.find.TAG=function(a,b){var c=b.getElementsByTagName(a[1]);if(a[1]==="*"){var d=[];for(var e=0;c[e];e++)c[e].nodeType===1&&d.push(c[e]);c=d}return c}),a.innerHTML="",a.firstChild&&typeof a.firstChild.getAttribute!="undefined"&&a.firstChild.getAttribute("href")!=="#"&&(o.attrHandle.href=function(a){return a.getAttribute("href",2)}),a=null}(),c.querySelectorAll&&function(){var a=m,b=c.createElement("div"),d="__sizzle__";b.innerHTML="

";if(!b.querySelectorAll||b.querySelectorAll(".TEST").length!==0){m=function(b,e,f,g){e=e||c;if(!g&&!m.isXML(e)){var h=/^(\w+$)|^\.([\w\-]+$)|^#([\w\-]+$)/.exec(b);if(h&&(e.nodeType===1||e.nodeType===9)){if(h[1])return s(e.getElementsByTagName(b),f);if(h[2]&&o.find.CLASS&&e.getElementsByClassName)return s(e.getElementsByClassName(h[2]),f)}if(e.nodeType===9){if(b==="body"&&e.body)return s([e.body],f);if(h&&h[3]){var i=e.getElementById(h[3]);if(!i||!i.parentNode)return s([],f);if(i.id===h[3])return s([i],f)}try{return s(e.querySelectorAll(b),f)}catch(j){}}else if(e.nodeType===1&&e.nodeName.toLowerCase()!=="object"){var k=e,l=e.getAttribute("id"),n=l||d,p=e.parentNode,q=/^\s*[+~]/.test(b);l?n=n.replace(/'/g,"\\$&"):e.setAttribute("id",n),q&&p&&(e=e.parentNode);try{if(!q||p)return s(e.querySelectorAll("[id='"+n+"'] "+b),f)}catch(r){}finally{l||k.removeAttribute("id")}}}return a(b,e,f,g)};for(var e in a)m[e]=a[e];b=null}}(),function(){var a=c.documentElement,b=a.matchesSelector||a.mozMatchesSelector||a.webkitMatchesSelector||a.msMatchesSelector;if(b){var d=!b.call(c.createElement("div"),"div"),e=!1;try{b.call(c.documentElement,"[test!='']:sizzle")}catch(f){e=!0}m.matchesSelector=function(a,c){c=c.replace(/\=\s*([^'"\]]*)\s*\]/g,"='$1']");if(!m.isXML(a))try{if(e||!o.match.PSEUDO.test(c)&&!/!=/.test(c)){var f=b.call(a,c);if(f||!d||a.document&&a.document.nodeType!==11)return f}}catch(g){}return m(c,null,null,[a]).length>0}}}(),function(){var a=c.createElement("div");a.innerHTML="
";if(!!a.getElementsByClassName&&a.getElementsByClassName("e").length!==0){a.lastChild.className="e";if(a.getElementsByClassName("e").length===1)return;o.order.splice(1,0,"CLASS"),o.find.CLASS=function(a,b,c){if(typeof b.getElementsByClassName!="undefined"&&!c)return b.getElementsByClassName(a[1])},a=null}}(),c.documentElement.contains?m.contains=function(a,b){return a!==b&&(a.contains?a.contains(b):!0)}:c.documentElement.compareDocumentPosition?m.contains=function(a,b){return!!(a.compareDocumentPosition(b)&16)}:m.contains=function(){return!1},m.isXML=function(a){var b=(a?a.ownerDocument||a:0).documentElement;return b?b.nodeName!=="HTML":!1};var y=function(a,b,c){var d,e=[],f="",g=b.nodeType?[b]:b;while(d=o.match.PSEUDO.exec(a))f+=d[0],a=a.replace(o.match.PSEUDO,"");a=o.relative[a]?a+"*":a;for(var h=0,i=g.length;h0)for(h=g;h=0:f.filter(a,this).length>0:this.filter(a).length>0)},closest:function(a,b){var c=[],d,e,g=this[0];if(f.isArray(a)){var h=1;while(g&&g.ownerDocument&&g!==b){for(d=0;d-1:f.find.matchesSelector(g,a)){c.push(g);break}g=g.parentNode;if(!g||!g.ownerDocument||g===b||g.nodeType===11)break}}c=c.length>1?f.unique(c):c;return this.pushStack(c,"closest",a)},index:function(a){if(!a)return this[0]&&this[0].parentNode?this.prevAll().length:-1;if(typeof a=="string")return f.inArray(this[0],f(a));return f.inArray(a.jquery?a[0]:a,this)},add:function(a,b){var c=typeof a=="string"?f(a,b):f.makeArray(a&&a.nodeType?[a]:a),d=f.merge(this.get(),c);return this.pushStack(S(c[0])||S(d[0])?d:f.unique(d))},andSelf:function(){return this.add(this.prevObject)}}),f.each({parent:function(a){var b=a.parentNode;return b&&b.nodeType!==11?b:null},parents:function(a){return f.dir(a,"parentNode")},parentsUntil:function(a,b,c){return f.dir(a,"parentNode",c)},next:function(a){return f.nth(a,2,"nextSibling")},prev:function(a){return f.nth(a,2,"previousSibling")},nextAll:function(a){return f.dir(a,"nextSibling")},prevAll:function(a){return 
f.dir(a,"previousSibling")},nextUntil:function(a,b,c){return f.dir(a,"nextSibling",c)},prevUntil:function(a,b,c){return f.dir(a,"previousSibling",c)},siblings:function(a){return f.sibling(a.parentNode.firstChild,a)},children:function(a){return f.sibling(a.firstChild)},contents:function(a){return f.nodeName(a,"iframe")?a.contentDocument||a.contentWindow.document:f.makeArray(a.childNodes)}},function(a,b){f.fn[a]=function(c,d){var e=f.map(this,b,c);L.test(a)||(d=c),d&&typeof d=="string"&&(e=f.filter(d,e)),e=this.length>1&&!R[a]?f.unique(e):e,(this.length>1||N.test(d))&&M.test(a)&&(e=e.reverse());return this.pushStack(e,a,P.call(arguments).join(","))}}),f.extend({filter:function(a,b,c){c&&(a=":not("+a+")");return b.length===1?f.find.matchesSelector(b[0],a)?[b[0]]:[]:f.find.matches(a,b)},dir:function(a,c,d){var e=[],g=a[c];while(g&&g.nodeType!==9&&(d===b||g.nodeType!==1||!f(g).is(d)))g.nodeType===1&&e.push(g),g=g[c];return e},nth:function(a,b,c,d){b=b||1;var e=0;for(;a;a=a[c])if(a.nodeType===1&&++e===b)break;return a},sibling:function(a,b){var c=[];for(;a;a=a.nextSibling)a.nodeType===1&&a!==b&&c.push(a);return c}});var V="abbr|article|aside|audio|canvas|datalist|details|figcaption|figure|footer|header|hgroup|mark|meter|nav|output|progress|section|summary|time|video",W=/ jQuery\d+="(?:\d+|null)"/g,X=/^\s+/,Y=/<(?!area|br|col|embed|hr|img|input|link|meta|param)(([\w:]+)[^>]*)\/>/ig,Z=/<([\w:]+)/,$=/",""],legend:[1,"
","
"],thead:[1,"","
"],tr:[2,"","
"],td:[3,"","
"],col:[2,"","
"],area:[1,"",""],_default:[0,"",""]},bh=U(c);bg.optgroup=bg.option,bg.tbody=bg.tfoot=bg.colgroup=bg.caption=bg.thead,bg.th=bg.td,f.support.htmlSerialize||(bg._default=[1,"div
","
"]),f.fn.extend({text:function(a){if(f.isFunction(a))return this.each(function(b){var c=f(this);c.text(a.call(this,b,c.text()))});if(typeof a!="object"&&a!==b)return this.empty().append((this[0]&&this[0].ownerDocument||c).createTextNode(a));return f.text(this)},wrapAll:function(a){if(f.isFunction(a))return this.each(function(b){f(this).wrapAll(a.call(this,b))});if(this[0]){var b=f(a,this[0].ownerDocument).eq(0).clone(!0);this[0].parentNode&&b.insertBefore(this[0]),b.map(function(){var a=this;while(a.firstChild&&a.firstChild.nodeType===1)a=a.firstChild;return a}).append(this)}return this},wrapInner:function(a){if(f.isFunction(a))return this.each(function(b){f(this).wrapInner(a.call(this,b))});return this.each(function(){var b=f(this),c=b.contents();c.length?c.wrapAll(a):b.append(a)})},wrap:function(a){var b=f.isFunction(a);return this.each(function(c){f(this).wrapAll(b?a.call(this,c):a)})},unwrap:function(){return this.parent().each(function(){f.nodeName(this,"body")||f(this).replaceWith(this.childNodes)}).end()},append:function(){return this.domManip(arguments,!0,function(a){this.nodeType===1&&this.appendChild(a)})},prepend:function(){return this.domManip(arguments,!0,function(a){this.nodeType===1&&this.insertBefore(a,this.firstChild)})},before:function(){if(this[0]&&this[0].parentNode)return this.domManip(arguments,!1,function(a){this.parentNode.insertBefore(a,this)});if(arguments.length){var a=f.clean(arguments);a.push.apply(a,this.toArray());return this.pushStack(a,"before",arguments)}},after:function(){if(this[0]&&this[0].parentNode)return this.domManip(arguments,!1,function(a){this.parentNode.insertBefore(a,this.nextSibling)});if(arguments.length){var a=this.pushStack(this,"after",arguments);a.push.apply(a,f.clean(arguments));return a}},remove:function(a,b){for(var c=0,d;(d=this[c])!=null;c++)if(!a||f.filter(a,[d]).length)!b&&d.nodeType===1&&(f.cleanData(d.getElementsByTagName("*")),f.cleanData([d])),d.parentNode&&d.parentNode.removeChild(d);return 
this},empty:function() {for(var a=0,b;(b=this[a])!=null;a++){b.nodeType===1&&f.cleanData(b.getElementsByTagName("*"));while(b.firstChild)b.removeChild(b.firstChild)}return this},clone:function(a,b){a=a==null?!1:a,b=b==null?a:b;return this.map(function(){return f.clone(this,a,b)})},html:function(a){if(a===b)return this[0]&&this[0].nodeType===1?this[0].innerHTML.replace(W,""):null;if(typeof a=="string"&&!ba.test(a)&&(f.support.leadingWhitespace||!X.test(a))&&!bg[(Z.exec(a)||["",""])[1].toLowerCase()]){a=a.replace(Y,"<$1>");try{for(var c=0,d=this.length;c1&&l0?this.clone(!0):this).get();f(e[h])[b](j),d=d.concat(j)}return this.pushStack(d,a,e.selector)}}),f.extend({clone:function(a,b,c){var d,e,g,h=f.support.html5Clone||!bc.test("<"+a.nodeName)?a.cloneNode(!0):bo(a);if((!f.support.noCloneEvent||!f.support.noCloneChecked)&&(a.nodeType===1||a.nodeType===11)&&!f.isXMLDoc(a)){bk(a,h),d=bl(a),e=bl(h);for(g=0;d[g];++g)e[g]&&bk(d[g],e[g])}if(b){bj(a,h);if(c){d=bl(a),e=bl(h);for(g=0;d[g];++g)bj(d[g],e[g])}}d=e=null;return h},clean:function(a,b,d,e){var g;b=b||c,typeof b.createElement=="undefined"&&(b=b.ownerDocument||b[0]&&b[0].ownerDocument||c);var h=[],i;for(var j=0,k;(k=a[j])!=null;j++){typeof k=="number"&&(k+="");if(!k)continue;if(typeof k=="string")if(!_.test(k))k=b.createTextNode(k);else{k=k.replace(Y,"<$1>");var l=(Z.exec(k)||["",""])[1].toLowerCase(),m=bg[l]||bg._default,n=m[0],o=b.createElement("div");b===c?bh.appendChild(o):U(b).appendChild(o),o.innerHTML=m[1]+k+m[2];while(n--)o=o.lastChild;if(!f.support.tbody){var p=$.test(k),q=l==="table"&&!p?o.firstChild&&o.firstChild.childNodes:m[1]===""&&!p?o.childNodes:[];for(i=q.length-1;i>=0;--i)f.nodeName(q[i],"tbody")&&!q[i].childNodes.length&&q[i].parentNode.removeChild(q[i])}!f.support.leadingWhitespace&&X.test(k)&&o.insertBefore(b.createTextNode(X.exec(k)[0]),o.firstChild),k=o.childNodes}var r;if(!f.support.appendChecked)if(k[0]&&typeof (r=k.length)=="number")for(i=0;i=0)return 
b+"px"}}}),f.support.opacity||(f.cssHooks.opacity={get:function(a,b){return br.test((b&&a.currentStyle?a.currentStyle.filter:a.style.filter)||"")?parseFloat(RegExp.$1)/100+"":b?"1":""},set:function(a,b){var c=a.style,d=a.currentStyle,e=f.isNumeric(b)?"alpha(opacity="+b*100+")":"",g=d&&d.filter||c.filter||"";c.zoom=1;if(b>=1&&f.trim(g.replace(bq,""))===""){c.removeAttribute("filter");if(d&&!d.filter)return}c.filter=bq.test(g)?g.replace(bq,e):g+" "+e}}),f(function(){f.support.reliableMarginRight||(f.cssHooks.marginRight={get:function(a,b){var c;f.swap(a,{display:"inline-block"},function(){b?c=bz(a,"margin-right","marginRight"):c=a.style.marginRight});return c}})}),c.defaultView&&c.defaultView.getComputedStyle&&(bA=function(a,b){var c,d,e;b=b.replace(bs,"-$1").toLowerCase(),(d=a.ownerDocument.defaultView)&&(e=d.getComputedStyle(a,null))&&(c=e.getPropertyValue(b),c===""&&!f.contains(a.ownerDocument.documentElement,a)&&(c=f.style(a,b)));return c}),c.documentElement.currentStyle&&(bB=function(a,b){var c,d,e,f=a.currentStyle&&a.currentStyle[b],g=a.style;f===null&&g&&(e=g[b])&&(f=e),!bt.test(f)&&bu.test(f)&&(c=g.left,d=a.runtimeStyle&&a.runtimeStyle.left,d&&(a.runtimeStyle.left=a.currentStyle.left),g.left=b==="fontSize"?"1em":f||0,f=g.pixelLeft+"px",g.left=c,d&&(a.runtimeStyle.left=d));return f===""?"auto":f}),bz=bA||bB,f.expr&&f.expr.filters&&(f.expr.filters.hidden=function(a){var b=a.offsetWidth,c=a.offsetHeight;return b===0&&c===0||!f.support.reliableHiddenOffsets&&(a.style&&a.style.display||f.css(a,"display"))==="none"},f.expr.filters.visible=function(a){return!f.expr.filters.hidden(a)});var bD=/%20/g,bE=/\[\]$/,bF=/\r?\n/g,bG=/#.*$/,bH=/^(.*?):[ 
\t]*([^\r\n]*)\r?$/mg,bI=/^(?:color|date|datetime|datetime-local|email|hidden|month|number|password|range|search|tel|text|time|url|week)$/i,bJ=/^(?:about|app|app\-storage|.+\-extension|file|res|widget):$/,bK=/^(?:GET|HEAD)$/,bL=/^\/\//,bM=/\?/,bN=/)<[^<]*)*<\/script>/gi,bO=/^(?:select|textarea)/i,bP=/\s+/,bQ=/([?&])_=[^&]*/,bR=/^([\w\+\.\-]+:)(?:\/\/([^\/?#:]*)(?::(\d+))?)?/,bS=f.fn.load,bT={},bU={},bV,bW,bX=["*/"]+["*"];try{bV=e.href}catch(bY){bV=c.createElement("a"),bV.href="",bV=bV.href}bW=bR.exec(bV.toLowerCase())||[],f.fn.extend({load:function(a,c,d){if(typeof a!="string"&&bS)return bS.apply(this,arguments);if(!this.length)return this;var e=a.indexOf(" ");if(e>=0){var g=a.slice(e,a.length);a=a.slice(0,e)}var h="GET";c&&(f.isFunction(c)?(d=c,c=b):typeof c=="object"&&(c=f.param(c,f.ajaxSettings.traditional),h="POST"));var i=this;f.ajax({url:a,type:h,dataType:"html",data:c,complete:function(a,b,c){c=a.responseText,a.isResolved()&&(a.done(function(a){c=a}),i.html(g?f("
<div>
").append(c.replace(bN,"")).find(g):c)),d&&i.each(d,[c,b,a])}});return this},serialize:function(){return f.param(this.serializeArray())},serializeArray:function(){return this.map(function(){return this.elements?f.makeArray(this.elements):this}).filter(function(){return this.name&&!this.disabled&&(this.checked||bO.test(this.nodeName)||bI.test(this.type))}).map(function(a,b){var c=f(this).val();return c==null?null:f.isArray(c)?f.map(c,function(a,c){return{name:b.name,value:a.replace(bF,"\r\n")}}):{name:b.name,value:c.replace(bF,"\r\n")}}).get()}}),f.each("ajaxStart ajaxStop ajaxComplete ajaxError ajaxSuccess ajaxSend".split(" "),function(a,b){f.fn[b]=function(a){return this.on(b,a)}}),f.each(["get","post"],function(a,c){f[c]=function(a,d,e,g){f.isFunction(d)&&(g=g||e,e=d,d=b);return f.ajax({type:c,url:a,data:d,success:e,dataType:g})}}),f.extend({getScript:function(a,c){return f.get(a,b,c,"script")},getJSON:function(a,b,c){return f.get(a,b,c,"json")},ajaxSetup:function(a,b){b?b_(a,f.ajaxSettings):(b=a,a=f.ajaxSettings),b_(a,b);return a},ajaxSettings:{url:bV,isLocal:bJ.test(bW[1]),global:!0,type:"GET",contentType:"application/x-www-form-urlencoded",processData:!0,async:!0,accepts:{xml:"application/xml, text/xml",html:"text/html",text:"text/plain",json:"application/json, text/javascript","*":bX},contents:{xml:/xml/,html:/html/,json:/json/},responseFields:{xml:"responseXML",text:"responseText"},converters:{"* text":a.String,"text html":!0,"text json":f.parseJSON,"text xml":f.parseXML},flatOptions:{context:!0,url:!0}},ajaxPrefilter:bZ(bT),ajaxTransport:bZ(bU),ajax:function(a,c){function w(a,c,l,m){if(s!==2){s=2,q&&clearTimeout(q),p=b,n=m||"",v.readyState=a>0?4:0;var o,r,u,w=c,x=l?cb(d,v,l):b,y,z;if(a>=200&&a<300||a===304){if(d.ifModified){if(y=v.getResponseHeader("Last-Modified"))f.lastModified[k]=y;if(z=v.getResponseHeader("Etag"))f.etag[k]=z}if(a===304)w="notmodified",o=!0;else 
try{r=cc(d,x),w="success",o=!0}catch(A){w="parsererror",u=A}}else{u=w;if(!w||a)w="error",a<0&&(a=0)}v.status=a,v.statusText=""+(c||w),o?h.resolveWith(e,[r,w,v]):h.rejectWith(e,[v,w,u]),v.statusCode(j),j=b,t&&g.trigger("ajax"+(o?"Success":"Error"),[v,d,o?r:u]),i.fireWith(e,[v,w]),t&&(g.trigger("ajaxComplete",[v,d]),--f.active||f.event.trigger("ajaxStop"))}}typeof a=="object"&&(c=a,a=b),c=c||{};var d=f.ajaxSetup({},c),e=d.context||d,g=e!==d&&(e.nodeType||e instanceof f)?f(e):f.event,h=f.Deferred(),i=f.Callbacks("once memory"),j=d.statusCode||{},k,l={},m={},n,o,p,q,r,s=0,t,u,v={readyState:0,setRequestHeader:function(a,b){if(!s){var c=a.toLowerCase();a=m[c]=m[c]||a,l[a]=b}return this},getAllResponseHeaders:function(){return s===2?n:null},getResponseHeader:function(a){var c;if(s===2){if(!o){o={};while(c=bH.exec(n))o[c[1].toLowerCase()]=c[2]}c=o[a.toLowerCase()]}return c===b?null:c},overrideMimeType:function(a){s||(d.mimeType=a);return this},abort:function(a){a=a||"abort",p&&p.abort(a),w(0,a);return this}};h.promise(v),v.success=v.done,v.error=v.fail,v.complete=i.add,v.statusCode=function(a){if(a){var b;if(s<2)for(b in a)j[b]=[j[b],a[b]];else b=a[v.status],v.then(b,b)}return this},d.url=((a||d.url)+"").replace(bG,"").replace(bL,bW[1]+"//"),d.dataTypes=f.trim(d.dataType||"*").toLowerCase().split(bP),d.crossDomain==null&&(r=bR.exec(d.url.toLowerCase()),d.crossDomain=!(!r||r[1]==bW[1]&&r[2]==bW[2]&&(r[3]||(r[1]==="http:"?80:443))==(bW[3]||(bW[1]==="http:"?80:443)))),d.data&&d.processData&&typeof d.data!="string"&&(d.data=f.param(d.data,d.traditional)),b$(bT,d,c,v);if(s===2)return!1;t=d.global,d.type=d.type.toUpperCase(),d.hasContent=!bK.test(d.type),t&&f.active++===0&&f.event.trigger("ajaxStart");if(!d.hasContent){d.data&&(d.url+=(bM.test(d.url)?"&":"?")+d.data,delete d.data),k=d.url;if(d.cache===!1){var 
x=f.now(),y=d.url.replace(bQ,"$1_="+x);d.url=y+(y===d.url?(bM.test(d.url)?"&":"?")+"_="+x:"")}}(d.data&&d.hasContent&&d.contentType!==!1||c.contentType)&&v.setRequestHeader("Content-Type",d.contentType),d.ifModified&&(k=k||d.url,f.lastModified[k]&&v.setRequestHeader("If-Modified-Since",f.lastModified[k]),f.etag[k]&&v.setRequestHeader("If-None-Match",f.etag[k])),v.setRequestHeader("Accept",d.dataTypes[0]&&d.accepts[d.dataTypes[0]]?d.accepts[d.dataTypes[0]]+(d.dataTypes[0]!=="*"?", "+bX+"; q=0.01":""):d.accepts["*"]);for(u in d.headers)v.setRequestHeader(u,d.headers[u]);if(d.beforeSend&&(d.beforeSend.call(e,v,d)===!1||s===2)){v.abort();return!1}for(u in{success:1,error:1,complete:1})v[u](d[u]);p=b$(bU,d,c,v);if(!p)w(-1,"No Transport");else{v.readyState=1,t&&g.trigger("ajaxSend",[v,d]),d.async&&d.timeout>0&&(q=setTimeout(function(){v.abort("timeout")},d.timeout));try{s=1,p.send(l,w)}catch(z){if(s<2)w(-1,z);else throw z}}return v},param:function(a,c){var d=[],e=function(a,b){b=f.isFunction(b)?b():b,d[d.length]=encodeURIComponent(a)+"="+encodeURIComponent(b)};c===b&&(c=f.ajaxSettings.traditional);if(f.isArray(a)||a.jquery&&!f.isPlainObject(a))f.each(a,function(){e(this.name,this.value)});else for(var g in a)ca(g,a[g],c,e);return d.join("&").replace(bD,"+")}}),f.extend({active:0,lastModified:{},etag:{}});var cd=f.now(),ce=/(\=)\?(&|$)|\?\?/i;f.ajaxSetup({jsonp:"callback",jsonpCallback:function(){return f.expando+"_"+cd++}}),f.ajaxPrefilter("json jsonp",function(b,c,d){var e=b.contentType==="application/x-www-form-urlencoded"&&typeof b.data=="string";if(b.dataTypes[0]==="jsonp"||b.jsonp!==!1&&(ce.test(b.url)||e&&ce.test(b.data))){var 
g,h=b.jsonpCallback=f.isFunction(b.jsonpCallback)?b.jsonpCallback():b.jsonpCallback,i=a[h],j=b.url,k=b.data,l="$1"+h+"$2";b.jsonp!==!1&&(j=j.replace(ce,l),b.url===j&&(e&&(k=k.replace(ce,l)),b.data===k&&(j+=(/\?/.test(j)?"&":"?")+b.jsonp+"="+h))),b.url=j,b.data=k,a[h]=function(a){g=[a]},d.always(function(){a[h]=i,g&&f.isFunction(i)&&a[h](g[0])}),b.converters["script json"]=function(){g||f.error(h+" was not called");return g[0]},b.dataTypes[0]="json";return"script"}}),f.ajaxSetup({accepts:{script:"text/javascript, application/javascript, application/ecmascript, application/x-ecmascript"},contents:{script:/javascript|ecmascript/},converters:{"text script":function(a){f.globalEval(a);return a}}}),f.ajaxPrefilter("script",function(a){a.cache===b&&(a.cache=!1),a.crossDomain&&(a.type="GET",a.global=!1)}),f.ajaxTransport("script",function(a){if(a.crossDomain){var d,e=c.head||c.getElementsByTagName("head")[0]||c.documentElement;return{send:function(f,g){d=c.createElement("script"),d.async="async",a.scriptCharset&&(d.charset=a.scriptCharset),d.src=a.url,d.onload=d.onreadystatechange=function(a,c){if(c||!d.readyState||/loaded|complete/.test(d.readyState))d.onload=d.onreadystatechange=null,e&&d.parentNode&&e.removeChild(d),d=b,c||g(200,"success")},e.insertBefore(d,e.firstChild)},abort:function(){d&&d.onload(0,1)}}}});var cf=a.ActiveXObject?function(){for(var a in ch)ch[a](0,1)}:!1,cg=0,ch;f.ajaxSettings.xhr=a.ActiveXObject?function(){return!this.isLocal&&ci()||cj()}:ci,function(a){f.extend(f.support,{ajax:!!a,cors:!!a&&"withCredentials"in a})}(f.ajaxSettings.xhr()),f.support.ajax&&f.ajaxTransport(function(c){if(!c.crossDomain||f.support.cors){var d;return{send:function(e,g){var h=c.xhr(),i,j;c.username?h.open(c.type,c.url,c.async,c.username,c.password):h.open(c.type,c.url,c.async);if(c.xhrFields)for(j in 
c.xhrFields)h[j]=c.xhrFields[j];c.mimeType&&h.overrideMimeType&&h.overrideMimeType(c.mimeType),!c.crossDomain&&!e["X-Requested-With"]&&(e["X-Requested-With"]="XMLHttpRequest");try{for(j in e)h.setRequestHeader(j,e[j])}catch(k){}h.send(c.hasContent&&c.data||null),d=function(a,e){var j,k,l,m,n;try{if(d&&(e||h.readyState===4)){d=b,i&&(h.onreadystatechange=f.noop,cf&&delete ch[i]);if(e)h.readyState!==4&&h.abort();else{j=h.status,l=h.getAllResponseHeaders(),m={},n=h.responseXML,n&&n.documentElement&&(m.xml=n),m.text=h.responseText;try{k=h.statusText}catch(o){k=""}!j&&c.isLocal&&!c.crossDomain?j=m.text?200:404:j===1223&&(j=204)}}}catch(p){e||g(-1,p)}m&&g(j,k,m,l)},!c.async||h.readyState===4?d():(i=++cg,cf&&(ch||(ch={},f(a).unload(cf)),ch[i]=d),h.onreadystatechange=d)},abort:function(){d&&d(0,1)}}}});var ck={},cl,cm,cn=/^(?:toggle|show|hide)$/,co=/^([+\-]=)?([\d+.\-]+)([a-z%]*)$/i,cp,cq=[["height","marginTop","marginBottom","paddingTop","paddingBottom"],["width","marginLeft","marginRight","paddingLeft","paddingRight"],["opacity"]],cr;f.fn.extend({show:function(a,b,c){var d,e;if(a||a===0)return this.animate(cu("show",3),a,b,c);for(var g=0,h=this.length;g=i.duration+this.startTime){this.now=this.end,this.pos=this.state=1,this.update(),i.animatedProperties[this.prop]=!0;for(b in i.animatedProperties)i.animatedProperties[b]!==!0&&(g=!1);if(g){i.overflow!=null&&!f.support.shrinkWrapBlocks&&f.each(["","X","Y"],function(a,b){h.style["overflow"+b]=i.overflow[a]}),i.hide&&f(h).hide();if(i.hide||i.show)for(b in i.animatedProperties)f.style(h,b,i.orig[b]),f.removeData(h,"fxshow"+b,!0),f.removeData(h,"toggle"+b,!0);d=i.complete,d&&(i.complete=!1,d.call(h))}return!1}i.duration==Infinity?this.now=e:(c=e-this.startTime,this.state=c/i.duration,this.pos=f.easing[i.animatedProperties[this.prop]](this.state,c,0,1,i.duration),this.now=this.start+(this.end-this.start)*this.pos),this.update();return!0}},f.extend(f.fx,{tick:function(){var 
a,b=f.timers,c=0;for(;c-1,k={},l={},m,n;j?(l=e.position(),m=l.top,n=l.left):(m=parseFloat(h)||0,n=parseFloat(i)||0),f.isFunction(b)&&(b=b.call(a,c,g)),b.top!=null&&(k.top=b.top-g.top+m),b.left!=null&&(k.left=b.left-g.left+n),"using"in b?b.using.call(a,k):e.css(k)}},f.fn.extend({position:function(){if(!this[0])return null;var a=this[0],b=this.offsetParent(),c=this.offset(),d=cx.test(b[0].nodeName)?{top:0,left:0}:b.offset();c.top-=parseFloat(f.css(a,"marginTop"))||0,c.left-=parseFloat(f.css(a,"marginLeft"))||0,d.top+=parseFloat(f.css(b[0],"borderTopWidth"))||0,d.left+=parseFloat(f.css(b[0],"borderLeftWidth"))||0;return{top:c.top-d.top,left:c.left-d.left}},offsetParent:function(){return this.map(function(){var a=this.offsetParent||c.body;while(a&&!cx.test(a.nodeName)&&f.css(a,"position")==="static")a=a.offsetParent;return a})}}),f.each(["Left","Top"],function(a,c){var d="scroll"+c;f.fn[d]=function(c){var e,g;if(c===b){e=this[0];if(!e)return null;g=cy(e);return g?"pageXOffset"in g?g[a?"pageYOffset":"pageXOffset"]:f.support.boxModel&&g.document.documentElement[d]||g.document.body[d]:e[d]}return this.each(function(){g=cy(this),g?g.scrollTo(a?f(g).scrollLeft():c,a?c:f(g).scrollTop()):this[d]=c})}}),f.each(["Height","Width"],function(a,c){var d=c.toLowerCase();f.fn["inner"+c]=function(){var a=this[0];return a?a.style?parseFloat(f.css(a,d,"padding")):this[d]():null},f.fn["outer"+c]=function(a){var b=this[0];return b?b.style?parseFloat(f.css(b,d,a?"margin":"border")):this[d]():null},f.fn[d]=function(a){var e=this[0];if(!e)return a==null?null:this;if(f.isFunction(a))return this.each(function(b){var c=f(this);c[d](a.call(this,b,c[d]()))});if(f.isWindow(e)){var g=e.document.documentElement["client"+c],h=e.document.body;return e.document.compatMode==="CSS1Compat"&&g||h&&h["client"+c]||g}if(e.nodeType===9)return Math.max(e.documentElement["client"+c],e.body["scroll"+c],e.documentElement["scroll"+c],e.body["offset"+c],e.documentElement["offset"+c]);if(a===b){var 
i=f.css(e,d),j=parseFloat(i);return f.isNumeric(j)?j:i}return this.css(d,typeof a=="string"?a:a+"px")}}),a.jQuery=a.$=f,typeof define=="function"&&define.amd&&define.amd.jQuery&&define("jquery",[],function(){return f})})(window);dogpile.cache-0.5.1/docs/_static/minus.png0000644000076500000240000000030712167630573021177 0ustar classicstaff00000000000000dogpile.cache-0.5.1/docs/_static/pygments.css0000644000076500000240000000753412225643512021713 0ustar classicstaff00000000000000.highlight .hll { background-color: #ffffcc } .highlight { background: #eeffcc; } .highlight .c { color: #408090; font-style: italic } /* Comment */ .highlight .err { border: 1px solid #FF0000 } /* Error */ .highlight .k { color: #007020; font-weight: bold } /* Keyword */ .highlight .o { color: #666666 } /* Operator */ .highlight .cm { color: #408090; font-style: italic } /* Comment.Multiline */ .highlight .cp { color: #007020 } /* Comment.Preproc */ .highlight .c1 { color: #408090; font-style: italic } /* Comment.Single */ .highlight .cs { color: #408090; background-color: #fff0f0 } /* Comment.Special */ .highlight .gd { color: #A00000 } /* Generic.Deleted */ .highlight .ge { font-style: italic } /* Generic.Emph */ .highlight .gr { color: #FF0000 } /* Generic.Error */ .highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ .highlight .gi { color: #00A000 } /* Generic.Inserted */ .highlight .go { color: #333333 } /* Generic.Output */ .highlight .gp { color: #c65d09; font-weight: bold } /* Generic.Prompt */ .highlight .gs { font-weight: bold } /* Generic.Strong */ .highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ .highlight .gt { color: #0044DD } /* Generic.Traceback */ .highlight .kc { color: #007020; font-weight: bold } /* Keyword.Constant */ .highlight .kd { color: #007020; font-weight: bold } /* Keyword.Declaration */ .highlight .kn { color: #007020; font-weight: bold } /* 
Keyword.Namespace */ .highlight .kp { color: #007020 } /* Keyword.Pseudo */ .highlight .kr { color: #007020; font-weight: bold } /* Keyword.Reserved */ .highlight .kt { color: #902000 } /* Keyword.Type */ .highlight .m { color: #208050 } /* Literal.Number */ .highlight .s { color: #4070a0 } /* Literal.String */ .highlight .na { color: #4070a0 } /* Name.Attribute */ .highlight .nb { color: #007020 } /* Name.Builtin */ .highlight .nc { color: #0e84b5; font-weight: bold } /* Name.Class */ .highlight .no { color: #60add5 } /* Name.Constant */ .highlight .nd { color: #555555; font-weight: bold } /* Name.Decorator */ .highlight .ni { color: #d55537; font-weight: bold } /* Name.Entity */ .highlight .ne { color: #007020 } /* Name.Exception */ .highlight .nf { color: #06287e } /* Name.Function */ .highlight .nl { color: #002070; font-weight: bold } /* Name.Label */ .highlight .nn { color: #0e84b5; font-weight: bold } /* Name.Namespace */ .highlight .nt { color: #062873; font-weight: bold } /* Name.Tag */ .highlight .nv { color: #bb60d5 } /* Name.Variable */ .highlight .ow { color: #007020; font-weight: bold } /* Operator.Word */ .highlight .w { color: #bbbbbb } /* Text.Whitespace */ .highlight .mf { color: #208050 } /* Literal.Number.Float */ .highlight .mh { color: #208050 } /* Literal.Number.Hex */ .highlight .mi { color: #208050 } /* Literal.Number.Integer */ .highlight .mo { color: #208050 } /* Literal.Number.Oct */ .highlight .sb { color: #4070a0 } /* Literal.String.Backtick */ .highlight .sc { color: #4070a0 } /* Literal.String.Char */ .highlight .sd { color: #4070a0; font-style: italic } /* Literal.String.Doc */ .highlight .s2 { color: #4070a0 } /* Literal.String.Double */ .highlight .se { color: #4070a0; font-weight: bold } /* Literal.String.Escape */ .highlight .sh { color: #4070a0 } /* Literal.String.Heredoc */ .highlight .si { color: #70a0d0; font-style: italic } /* Literal.String.Interpol */ .highlight .sx { color: #c65d09 } /* Literal.String.Other */ .highlight 
.sr { color: #235388 } /* Literal.String.Regex */ .highlight .s1 { color: #4070a0 } /* Literal.String.Single */ .highlight .ss { color: #517918 } /* Literal.String.Symbol */ .highlight .bp { color: #007020 } /* Name.Builtin.Pseudo */ .highlight .vc { color: #bb60d5 } /* Name.Variable.Class */ .highlight .vg { color: #bb60d5 } /* Name.Variable.Global */ .highlight .vi { color: #bb60d5 } /* Name.Variable.Instance */ .highlight .il { color: #208050 } /* Literal.Number.Integer.Long */dogpile.cache-0.5.1/docs/_static/searchtools.js0000644000076500000240000004264712225643512022227 0ustar classicstaff00000000000000/* * searchtools.js_t * ~~~~~~~~~~~~~~~~ * * Sphinx JavaScript utilities for the full-text search. * * :copyright: Copyright 2007-2013 by the Sphinx team, see AUTHORS. * :license: BSD, see LICENSE for details. * */ /** * Porter Stemmer */ var Stemmer = function() { var step2list = { ational: 'ate', tional: 'tion', enci: 'ence', anci: 'ance', izer: 'ize', bli: 'ble', alli: 'al', entli: 'ent', eli: 'e', ousli: 'ous', ization: 'ize', ation: 'ate', ator: 'ate', alism: 'al', iveness: 'ive', fulness: 'ful', ousness: 'ous', aliti: 'al', iviti: 'ive', biliti: 'ble', logi: 'log' }; var step3list = { icate: 'ic', ative: '', alize: 'al', iciti: 'ic', ical: 'ic', ful: '', ness: '' }; var c = "[^aeiou]"; // consonant var v = "[aeiouy]"; // vowel var C = c + "[^aeiouy]*"; // consonant sequence var V = v + "[aeiou]*"; // vowel sequence var mgr0 = "^(" + C + ")?" + V + C; // [C]VC... is m>0 var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 var s_v = "^(" + C + ")?"
+ v; // vowel in stem this.stemWord = function (w) { var stem; var suffix; var firstch; var origword = w; if (w.length < 3) return w; var re; var re2; var re3; var re4; firstch = w.substr(0,1); if (firstch == "y") w = firstch.toUpperCase() + w.substr(1); // Step 1a re = /^(.+?)(ss|i)es$/; re2 = /^(.+?)([^s])s$/; if (re.test(w)) w = w.replace(re,"$1$2"); else if (re2.test(w)) w = w.replace(re2,"$1$2"); // Step 1b re = /^(.+?)eed$/; re2 = /^(.+?)(ed|ing)$/; if (re.test(w)) { var fp = re.exec(w); re = new RegExp(mgr0); if (re.test(fp[1])) { re = /.$/; w = w.replace(re,""); } } else if (re2.test(w)) { var fp = re2.exec(w); stem = fp[1]; re2 = new RegExp(s_v); if (re2.test(stem)) { w = stem; re2 = /(at|bl|iz)$/; re3 = new RegExp("([^aeiouylsz])\\1$"); re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); if (re2.test(w)) w = w + "e"; else if (re3.test(w)) { re = /.$/; w = w.replace(re,""); } else if (re4.test(w)) w = w + "e"; } } // Step 1c re = /^(.+?)y$/; if (re.test(w)) { var fp = re.exec(w); stem = fp[1]; re = new RegExp(s_v); if (re.test(stem)) w = stem + "i"; } // Step 2 re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; if (re.test(w)) { var fp = re.exec(w); stem = fp[1]; suffix = fp[2]; re = new RegExp(mgr0); if (re.test(stem)) w = stem + step2list[suffix]; } // Step 3 re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; if (re.test(w)) { var fp = re.exec(w); stem = fp[1]; suffix = fp[2]; re = new RegExp(mgr0); if (re.test(stem)) w = stem + step3list[suffix]; } // Step 4 re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; re2 = /^(.+?)(s|t)(ion)$/; if (re.test(w)) { var fp = re.exec(w); stem = fp[1]; re = new RegExp(mgr1); if (re.test(stem)) w = stem; } else if (re2.test(w)) { var fp = re2.exec(w); stem = fp[1] + fp[2]; re2 = new RegExp(mgr1); if (re2.test(stem)) w = stem; } // Step 5 re = /^(.+?)e$/; if (re.test(w)) { var fp = 
re.exec(w); stem = fp[1]; re = new RegExp(mgr1); re2 = new RegExp(meq1); re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) w = stem; } re = /ll$/; re2 = new RegExp(mgr1); if (re.test(w) && re2.test(w)) { re = /.$/; w = w.replace(re,""); } // and turn initial Y back to y if (firstch == "y") w = firstch.toLowerCase() + w.substr(1); return w; } } /** * Simple result scoring code. */ var Scorer = { // Implement the following function to further tweak the score for each result // The function takes a result array [filename, title, anchor, descr, score] // and returns the new score. /* score: function(result) { return result[4]; }, */ // query matches the full name of an object objNameMatch: 11, // or matches in the last dotted part of the object name objPartialMatch: 6, // Additive scores depending on the priority of the object objPrio: {0: 15, // used to be importantResults 1: 5, // used to be objectResults 2: -5}, // used to be unimportantResults // Used when the priority is not in the mapping. 
objPrioDefault: 0, // query found in title title: 15, // query found in terms term: 5 }; /** * Search Module */ var Search = { _index : null, _queued_query : null, _pulse_status : -1, init : function() { var params = $.getQueryParameters(); if (params.q) { var query = params.q[0]; $('input[name="q"]')[0].value = query; this.performSearch(query); } }, loadIndex : function(url) { $.ajax({type: "GET", url: url, data: null, dataType: "script", cache: true, complete: function(jqxhr, textstatus) { if (textstatus != "success") { document.getElementById("searchindexloader").src = url; } }}); }, setIndex : function(index) { var q; this._index = index; if ((q = this._queued_query) !== null) { this._queued_query = null; Search.query(q); } }, hasIndex : function() { return this._index !== null; }, deferQuery : function(query) { this._queued_query = query; }, stopPulse : function() { this._pulse_status = 0; }, startPulse : function() { if (this._pulse_status >= 0) return; function pulse() { var i; Search._pulse_status = (Search._pulse_status + 1) % 4; var dotString = ''; for (i = 0; i < Search._pulse_status; i++) dotString += '.'; Search.dots.text(dotString); if (Search._pulse_status > -1) window.setTimeout(pulse, 500); } pulse(); }, /** * perform a search for something (or wait until index is loaded) */ performSearch : function(query) { // create the required interface elements this.out = $('#search-results'); this.title = $('
<h2>' + _('Searching') + '</h2>').appendTo(this.out); this.dots = $('<span></span>').appendTo(this.title); this.status = $('<p style="display: none"></p>').appendTo(this.out); this.output = $('<ul class="search"/>
'); } // Prettify the comment rating. comment.pretty_rating = comment.rating + ' point' + (comment.rating == 1 ? '' : 's'); // Make a class (for displaying not yet moderated comments differently) comment.css_class = comment.displayed ? '' : ' moderate'; // Create a div for this comment. var context = $.extend({}, opts, comment); var div = $(renderTemplate(commentTemplate, context)); // If the user has voted on this comment, highlight the correct arrow. if (comment.vote) { var direction = (comment.vote == 1) ? 'u' : 'd'; div.find('#' + direction + 'v' + comment.id).hide(); div.find('#' + direction + 'u' + comment.id).show(); } if (opts.moderator || comment.text != '[deleted]') { div.find('a.reply').show(); if (comment.proposal_diff) div.find('#sp' + comment.id).show(); if (opts.moderator && !comment.displayed) div.find('#cm' + comment.id).show(); if (opts.moderator || (opts.username == comment.username)) div.find('#dc' + comment.id).show(); } return div; } /** * A simple template renderer. Placeholders such as <%id%> are replaced * by context['id'] with items being escaped. Placeholders such as <#id#> * are not escaped. */ function renderTemplate(template, context) { var esc = $(document.createElement('div')); function handle(ph, escape) { var cur = context; $.each(ph.split('.'), function() { cur = cur[this]; }); return escape ? esc.text(cur || "").html() : cur; } return template.replace(/<([%#])([\w\.]*)\1>/g, function() { return handle(arguments[2], arguments[1] == '%' ? true : false); }); } /** Flash an error message briefly. */ function showError(message) { $(document.createElement('div')).attr({'class': 'popup-error'}) .append($(document.createElement('div')) .attr({'class': 'error-message'}).text(message)) .appendTo('body') .fadeIn("slow") .delay(2000) .fadeOut("slow"); } /** Add a link the user uses to open the comments popup. 
*/ $.fn.comment = function() { return this.each(function() { var id = $(this).attr('id').substring(1); var count = COMMENT_METADATA[id]; var title = count + ' comment' + (count == 1 ? '' : 's'); var image = count > 0 ? opts.commentBrightImage : opts.commentImage; var addcls = count == 0 ? ' nocomment' : ''; $(this) .append( $(document.createElement('a')).attr({ href: '#', 'class': 'sphinx-comment-open' + addcls, id: 'ao' + id }) .append($(document.createElement('img')).attr({ src: image, alt: 'comment', title: title })) .click(function(event) { event.preventDefault(); show($(this).attr('id').substring(2)); }) ) .append( $(document.createElement('a')).attr({ href: '#', 'class': 'sphinx-comment-close hidden', id: 'ah' + id }) .append($(document.createElement('img')).attr({ src: opts.closeCommentImage, alt: 'close', title: 'close' })) .click(function(event) { event.preventDefault(); hide($(this).attr('id').substring(2)); }) ); }); }; var opts = { processVoteURL: '/_process_vote', addCommentURL: '/_add_comment', getCommentsURL: '/_get_comments', acceptCommentURL: '/_accept_comment', deleteCommentURL: '/_delete_comment', commentImage: '/static/_static/comment.png', closeCommentImage: '/static/_static/comment-close.png', loadingImage: '/static/_static/ajax-loader.gif', commentBrightImage: '/static/_static/comment-bright.png', upArrow: '/static/_static/up.png', downArrow: '/static/_static/down.png', upArrowPressed: '/static/_static/up-pressed.png', downArrowPressed: '/static/_static/down-pressed.png', voting: false, moderator: false }; if (typeof COMMENT_OPTIONS != "undefined") { opts = jQuery.extend(opts, COMMENT_OPTIONS); } var popupTemplate = '\
\

\ Sort by:\ best rated\ newest\ oldest\

\
Comments
\
\ loading comments...
\
    \
    \

    Add a comment\ (markup):

    \
    \ reStructured text markup: *emph*, **strong**, \ ``code``, \ code blocks: :: and an indented block after blank line
    \
    \ \

    \ \ Propose a change ▹\ \ \ Propose a change ▿\ \

    \ \ \ \ \ \
    \
    '; var commentTemplate = '\
    \
    \
    \ \ \ \ \ \ \
    \
    \ \ \ \ \ \ \
    \
    \
    \

    \ <%username%>\ <%pretty_rating%>\ <%time.delta%>\

    \
    <#text#>
    \

    \ \ reply ▿\ proposal ▹\ proposal ▿\ \ \

    \
    \
    <#proposal_diff#>\
            
    \
      \
      \
      \
      \ '; var replyTemplate = '\
    • \
      \
      \ \ \ \ \ \ \
      \
    • '; $(document).ready(function() { init(); }); })(jQuery); $(document).ready(function() { // add comment anchors for all paragraphs that are commentable $('.sphinx-has-comment').comment(); // highlight search words in search results $("div.context").each(function() { var params = $.getQueryParameters(); var terms = (params.q) ? params.q[0].split(/\s+/) : []; var result = $(this); $.each(terms, function() { result.highlightText(this.toLowerCase(), 'highlighted'); }); }); // directly open comment window if requested var anchor = document.location.hash; if (anchor.substring(0, 9) == '#comment-') { $('#ao' + anchor.substring(9)).click(); document.location.hash = '#s' + anchor.substring(9); } }); dogpile.cache-0.5.1/docs/api.html0000644000076500000240000035377212225642567017371 0ustar classicstaff00000000000000 API — dogpile.cache 0.5.1 documentation

      API

      Region

class dogpile.cache.region.CacheRegion(name=None, function_key_generator=<function function_key_generator>, function_multi_key_generator=<function function_multi_key_generator>, key_mangler=None, async_creation_runner=None)

      A front end to a particular cache backend.

      Parameters:
      • name – Optional, a string name for the region. This isn’t used internally but can be accessed via the .name parameter, helpful for configuring a region from a config file.
      • function_key_generator

        Optional. A function that will produce a “cache key” given a data creation function and arguments, when using the CacheRegion.cache_on_arguments() method. The structure of this function should be two levels: given the data creation function, return a new function that generates the key based on the given arguments. Such as:

        def my_key_generator(namespace, fn, **kw):
            fname = fn.__name__
            def generate_key(*arg):
        return namespace + "_" + fname + "_" + "_".join(str(s) for s in arg)
            return generate_key
        
        
        region = make_region(
            function_key_generator = my_key_generator
        ).configure(
            "dogpile.cache.dbm",
            expiration_time=300,
            arguments={
                "filename":"file.dbm"
            }
        )
        

        The namespace is that passed to CacheRegion.cache_on_arguments(). It’s not consulted outside this function, so in fact can be of any form. For example, it can be passed as a tuple, used to specify arguments to pluck from **kw:

        def my_key_generator(namespace, fn):
            def generate_key(*arg, **kw):
                return ":".join(
                        [kw[k] for k in namespace] +
                        [str(x) for x in arg]
                    )
            return generate_key
        

        Where the decorator might be used as:

        @my_region.cache_on_arguments(namespace=('x', 'y'))
        def my_function(a, b, **kw):
            return my_data()
        
      • function_multi_key_generator

        Optional. Similar to function_key_generator parameter, but it’s used in CacheRegion.cache_multi_on_arguments(). Generated function should return list of keys. For example:

        def my_multi_key_generator(namespace, fn, **kw):
            namespace = fn.__name__ + (namespace or '')
        
            def generate_keys(*args):
                return [namespace + ':' + str(a) for a in args]
        
            return generate_keys
        
      • key_mangler – Function which will be used on all incoming keys before passing to the backend. Defaults to None, in which case the key mangling function recommended by the cache backend will be used. A typical mangler is the SHA1 mangler found at sha1_mangle_key() which coerces keys into a SHA1 hash, so that the string length is fixed. To disable all key mangling, set to False. Another typical mangler is the built-in Python function str, which can be used to convert non-string or Unicode keys to bytestrings, which is needed when using a backend such as bsddb or dbm under Python 2.x in conjunction with Unicode keys.
      • async_creation_runner

        A callable that, when specified, will be passed to and called by dogpile.lock when there is a stale value present in the cache. It will be passed the mutex and is responsible for releasing that mutex when finished. This can be used to defer the computation of expensive creator functions to later points in the future by way of, for example, a background thread, a long-running queue, or a task manager system like Celery.

        For a specific example using async_creation_runner, new values can be created in a background thread like so:

        import threading
        
        def async_creation_runner(cache, somekey, creator, mutex):
            ''' Used by dogpile.core:Lock when appropriate  '''
            def runner():
                try:
                    value = creator()
                    cache.set(somekey, value)
                finally:
                    mutex.release()
        
            thread = threading.Thread(target=runner)
            thread.start()
        
        
        region = make_region(
            async_creation_runner=async_creation_runner,
        ).configure(
            'dogpile.cache.memcached',
            expiration_time=5,
            arguments={
                'url': '127.0.0.1:11211',
                'distributed_lock': True,
            }
        )
        

        Remember that the first request for a key with no associated value will always block; the async_creation_runner will not be invoked in that case. However, subsequent requests for cached-but-expired values will still return promptly. They will be refreshed by whatever asynchronous means the provided async_creation_runner callable implements.

        By default the async_creation_runner is disabled and is set to None.

        New in version 0.4.2: added the async_creation_runner feature.
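To make the key_mangler contract concrete, below is a minimal sketch of a SHA1-based mangler in the spirit of sha1_mangle_key(); the make_region() wiring is shown as a comment and is an assumed configuration, not part of this API:

```python
import hashlib

def sha1_mangler(key):
    # coerce any string key into a fixed-length SHA1 hex digest,
    # mirroring what dogpile.cache.util.sha1_mangle_key does
    return hashlib.sha1(key.encode("utf-8")).hexdigest()

# the mangler is passed at region creation time, e.g.:
# region = make_region(key_mangler=sha1_mangler).configure("dogpile.cache.memory")
```

Because the digest length is fixed, arbitrarily long generated keys stay within backend key-length limits.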

      cache_multi_on_arguments(namespace=None, expiration_time=None, should_cache_fn=None, asdict=False, to_str=<type 'str'>)

      A function decorator that will cache multiple return values from the function using a sequence of keys derived from the function itself and the arguments passed to it.

      This method is the “multiple key” analogue to the CacheRegion.cache_on_arguments() method.

      Example:

      @someregion.cache_multi_on_arguments()
      def generate_something(*keys):
          return [
              somedatabase.query(key)
              for key in keys
          ]
      

      The decorated function can be called normally. The decorator will produce a list of cache keys using a mechanism similar to that of CacheRegion.cache_on_arguments(), combining the name of the function with the optional namespace and with the string form of each key. It will then consult the cache using the same mechanism as that of CacheRegion.get_multi() to retrieve all current values; the originally passed keys corresponding to those values which aren’t generated or need regeneration will be assembled into a new argument list, and the decorated function is then called with that subset of arguments.

      The returned result is a list:

      result = generate_something("key1", "key2", "key3")
      

      The decorator internally makes use of the CacheRegion.get_or_create_multi() method to access the cache and conditionally call the function. See that method for additional behavioral details.

      Unlike the CacheRegion.cache_on_arguments() method, CacheRegion.cache_multi_on_arguments() works only with a single function signature, one which takes a simple list of keys as arguments.

      Like CacheRegion.cache_on_arguments(), the decorated function is also provided with a set() method, which here accepts a mapping of keys and values to set in the cache:

      generate_something.set({"k1": "value1",
                              "k2": "value2", "k3": "value3"})
      

      an invalidate() method, which has the effect of deleting the given sequence of keys using the same mechanism as that of CacheRegion.delete_multi():

      generate_something.invalidate("k1", "k2", "k3")
      

      and finally a refresh() method, which will call the creation function, cache the new values, and return them:

      values = generate_something.refresh("k1", "k2", "k3")
      

      Parameters passed to CacheRegion.cache_multi_on_arguments() have the same meaning as those passed to CacheRegion.cache_on_arguments().

      Parameters:
      • namespace – optional string argument which will be established as part of each cache key.
      • expiration_time – if not None, will override the normal expiration time. May be passed as an integer or a callable.
      • should_cache_fn – passed to CacheRegion.get_or_create_multi(). This function is given a value as returned by the creator, and only if it returns True will that value be placed in the cache.
      • asdict

        if True, the decorated function should return its result as a dictionary of keys->values, and the final result of calling the decorated function will also be a dictionary. If left at its default value of False, the decorated function should return its result as a list of values, and the final result of calling the decorated function will also be a list.

        When asdict==True, if the dictionary returned by the decorated function is missing keys, those keys will not be cached.

      • to_str – callable, will be called on each function argument in order to convert to a string. Defaults to str(). If the function accepts non-ascii unicode arguments on Python 2.x, the unicode() builtin can be substituted, but note this will produce unicode cache keys which may require key mangling before reaching the cache.

      New in version 0.5.0.
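As a sketch of the asdict=True contract: the function returns a mapping keyed by the arguments it was called with, and any key omitted from the mapping is simply not cached. The decorator line is shown as a comment, and someregion is an assumed, already-configured region:

```python
# @someregion.cache_multi_on_arguments(asdict=True)
def lookup_labels(*keys):
    # return a dict keyed by the requested keys; omitting "missing"
    # means no value will be cached for that key
    return {k: k.upper() for k in keys if k != "missing"}

result = lookup_labels("alice", "bob", "missing")
```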

      cache_on_arguments(namespace=None, expiration_time=None, should_cache_fn=None, to_str=<type 'str'>)

      A function decorator that will cache the return value of the function using a key derived from the function itself and its arguments.

      The decorator internally makes use of the CacheRegion.get_or_create() method to access the cache and conditionally call the function. See that method for additional behavioral details.

      E.g.:

      @someregion.cache_on_arguments()
      def generate_something(x, y):
          return somedatabase.query(x, y)
      

      The decorated function can then be called normally, where data will be pulled from the cache region unless a new value is needed:

      result = generate_something(5, 6)
      

      The function is also given an attribute invalidate(), which provides for invalidation of the value. Pass to invalidate() the same arguments you’d pass to the function itself to represent a particular value:

      generate_something.invalidate(5, 6)
      

      Another attribute set() is added to provide extra caching possibilities relative to the function. This is a convenience method for CacheRegion.set() which will store a given value directly without calling the decorated function. The value to be cached is passed as the first argument, and the arguments which would normally be passed to the function should follow:

      generate_something.set(3, 5, 6)
      

      The above example is equivalent to calling generate_something(5, 6), if the function were to produce the value 3 as the value to be cached.

      New in version 0.4.1: Added set() method to decorated function.

      Similar to set() is refresh(). This attribute will invoke the decorated function, populate the cache with the new value, and return that value:

      newvalue = generate_something.refresh(5, 6)
      

      New in version 0.5.0: Added refresh() method to decorated function.

      The default key generation will use the name of the function, the module name for the function, the arguments passed, as well as an optional “namespace” parameter in order to generate a cache key.

      Given a function one inside the module myapp.tools:

      @region.cache_on_arguments(namespace="foo")
      def one(a, b):
          return a + b
      

      Above, calling one(3, 4) will produce a cache key as follows:

      myapp.tools:one|foo|3 4

      The key generator will ignore an initial argument of self or cls, making the decorator suitable (with caveats) for use with instance or class methods. Given the example:

      class MyClass(object):
          @region.cache_on_arguments(namespace="foo")
          def one(self, a, b):
              return a + b
      

      The cache key above for MyClass().one(3, 4) will again produce the same cache key of myapp.tools:one|foo|3 4 - the name self is skipped.

      The namespace parameter is optional, and is used normally to disambiguate two functions of the same name within the same module, as can occur when decorating instance or class methods as below:

      class MyClass(object):
          @region.cache_on_arguments(namespace='MC')
          def somemethod(self, x, y):
              ""
      
      class MyOtherClass(object):
          @region.cache_on_arguments(namespace='MOC')
          def somemethod(self, x, y):
              ""
      

      Above, the namespace parameter disambiguates between somemethod on MyClass and MyOtherClass. Python class declaration mechanics otherwise prevent the decorator from having awareness of the MyClass and MyOtherClass names, as the function is received by the decorator before it becomes an instance method.

      The function key generation can be entirely replaced on a per-region basis using the function_key_generator argument present on make_region() and CacheRegion. It defaults to function_key_generator().

      Parameters:
      • namespace – optional string argument which will be established as part of the cache key. This may be needed to disambiguate functions of the same name within the same source file, such as those associated with classes - note that the decorator itself can’t see the parent class on a function as the class is being declared.
      • expiration_time

        if not None, will override the normal expiration time.

        May be specified as a callable, taking no arguments, that returns a value to be used as the expiration_time. This callable will be called whenever the decorated function itself is called, in caching or retrieving. Thus, this can be used to determine a dynamic expiration time for the cached function result. Example use cases include “cache the result until the end of the day, week or time period” and “cache until a certain date or time passes”.

        Changed in version 0.5.0: expiration_time may be passed as a callable to CacheRegion.cache_on_arguments().

      • should_cache_fn

        passed to CacheRegion.get_or_create().

        New in version 0.4.3.

      • to_str

        callable, will be called on each function argument in order to convert to a string. Defaults to str(). If the function accepts non-ascii unicode arguments on Python 2.x, the unicode() builtin can be substituted, but note this will produce unicode cache keys which may require key mangling before reaching the cache.

        New in version 0.5.0.

      configure(backend, expiration_time=None, arguments=None, _config_argument_dict=None, _config_prefix=None, wrap=None)

      Configure a CacheRegion.

      The CacheRegion itself is returned.

      Parameters:
      • backend – Required. This is the name of the CacheBackend to use, and is resolved by loading the class from the dogpile.cache entrypoint.
      • expiration_time

        Optional. The expiration time passed to the dogpile system. May be passed as an integer number of seconds, or as a datetime.timedelta value.

        The CacheRegion.get_or_create() method as well as the CacheRegion.cache_on_arguments() decorator (though note: not the CacheRegion.get() method) will call upon the value creation function after this time period has passed since the last generation.

      • arguments – Optional. The structure here is passed directly to the constructor of the CacheBackend in use, though is typically a dictionary.
      • wrap

        Optional. A list of ProxyBackend classes and/or instances, each of which will be applied in a chain to ultimately wrap the original backend, so that custom functionality augmentation can be applied.

        New in version 0.5.0.

      configure_from_config(config_dict, prefix)

      Configure from a configuration dictionary and a prefix.

      Example:

      local_region = make_region()
      memcached_region = make_region()
      
      # regions are ready to use for function
      # decorators, but not yet for actual caching
      
      # later, when config is available
      myconfig = {
          "cache.local.backend":"dogpile.cache.dbm",
          "cache.local.arguments.filename":"/path/to/dbmfile.dbm",
          "cache.memcached.backend":"dogpile.cache.pylibmc",
          "cache.memcached.arguments.url":"127.0.0.1, 10.0.0.1",
      }
      local_region.configure_from_config(myconfig, "cache.local.")
      memcached_region.configure_from_config(myconfig,
                                          "cache.memcached.")
      
      delete(key)

      Remove a value from the cache.

      This operation is idempotent (can be called multiple times, or on a non-existent key, safely)

      delete_multi(keys)

      Remove multiple values from the cache.

      This operation is idempotent (can be called multiple times, or on a non-existent key, safely)

      New in version 0.5.0.

      get(key, expiration_time=None, ignore_expiration=False)

      Return a value from the cache, based on the given key.

      If the value is not present, the method returns the token NO_VALUE. NO_VALUE evaluates to False, but is distinct from None, so that a cached value of None can be distinguished from a missing key.

      By default, the configured expiration time of the CacheRegion, or alternatively the expiration time supplied by the expiration_time argument, is tested against the creation time of the retrieved value versus the current time (as reported by time.time()). If stale, the cached value is ignored and the NO_VALUE token is returned. Passing the flag ignore_expiration=True bypasses the expiration time check.

      Changed in version 0.3.0: CacheRegion.get() now checks the value’s creation time against the expiration time, rather than returning the value unconditionally.

      The method also interprets the cached value in terms of the current “invalidation” time as set by the invalidate() method. If a value is present, but its creation time is older than the current invalidation time, the NO_VALUE token is returned. Passing the flag ignore_expiration=True bypasses the invalidation time check.

      New in version 0.3.0: Support for the CacheRegion.invalidate() method.

      Parameters:
      • key – Key to be retrieved. While it’s typical for a key to be a string, it is ultimately passed directly down to the cache backend, before being optionally processed by the key_mangler function, so can be of any type recognized by the backend or by the key_mangler function, if present.
      • expiration_time

        Optional expiration time value which will supersede that configured on the CacheRegion itself.

        New in version 0.3.0.

      • ignore_expiration

        if True, the value is returned from the cache if present, regardless of configured expiration times or whether or not invalidate() was called.

        New in version 0.3.0.

      get_multi(keys, expiration_time=None, ignore_expiration=False)

      Return multiple values from the cache, based on the given keys.

      Returns values as a list matching the keys given.

      E.g.:

      values = region.get_multi(["one", "two", "three"])
      

      To convert values to a dictionary, use zip():

      keys = ["one", "two", "three"]
      values = region.get_multi(keys)
      dictionary = dict(zip(keys, values))
      

      Keys which aren’t present in the cache are returned as the NO_VALUE token. NO_VALUE evaluates to False, but is distinct from None, so that a cached value of None can be distinguished from a missing key.

      By default, the configured expiration time of the CacheRegion, or alternatively the expiration time supplied by the expiration_time argument, is tested against the creation time of the retrieved value versus the current time (as reported by time.time()). If stale, the cached value is ignored and the NO_VALUE token is returned. Passing the flag ignore_expiration=True bypasses the expiration time check.

      New in version 0.5.0.

      get_or_create(key, creator, expiration_time=None, should_cache_fn=None)

      Return a cached value based on the given key.

      If the value does not exist or is considered to be expired based on its creation time, the given creation function may or may not be used to recreate the value and persist the newly generated value in the cache.

      Whether or not the function is used depends on whether the dogpile lock can be acquired. If it can’t, it means a different thread or process is already running a creation function for this key against the cache. When the dogpile lock cannot be acquired, the method will block if no previous value is available, until the lock is released and a new value is available. If a previous value is available, that value is returned immediately without blocking.

      If the invalidate() method has been called, and the retrieved value’s timestamp is older than the invalidation timestamp, the value is unconditionally prevented from being returned. The method will attempt to acquire the dogpile lock to generate a new value, or will wait until the lock is released to return the new value.

      Changed in version 0.3.0: The value is unconditionally regenerated if the creation time is older than the last call to invalidate().
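Note that the creator is called with no arguments, so a value that depends on parameters is typically bound up in a closure; a minimal sketch, with the region.get_or_create() call shown as a comment using assumed names:

```python
def make_creator(x, y):
    # get_or_create calls the creator with no arguments, so bind
    # the parameters the computation needs via a closure
    def create_value():
        return x + y  # stand-in for an expensive computation
    return create_value

creator = make_creator(3, 4)
# value = region.get_or_create("sum:3:4", creator, expiration_time=3600)
```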

      Parameters:
      • key – Key to be retrieved. While it’s typical for a key to be a string, it is ultimately passed directly down to the cache backend, before being optionally processed by the key_mangler function, so can be of any type recognized by the backend or by the key_mangler function, if present.
      • creator – function which creates a new value.
      • expiration_time – optional expiration time which will override the expiration time already configured on this CacheRegion if not None. To set no expiration, use the value -1.
      • should_cache_fn

        optional callable function which will receive the value returned by the “creator”, and will then return True or False, indicating if the value should actually be cached or not. If it returns False, the value is still returned, but isn’t cached. E.g.:

        def dont_cache_none(value):
            return value is not None
        
        value = region.get_or_create("some key",
                            create_value,
                            should_cache_fn=dont_cache_none)
        

        Above, the function returns the value of create_value() if the cache is invalid, however if the return value is None, it won’t be cached.

        New in version 0.4.3.

      See also

      CacheRegion.cache_on_arguments() - applies get_or_create() to any function using a decorator.

      CacheRegion.get_or_create_multi() - multiple key/value version

      get_or_create_multi(keys, creator, expiration_time=None, should_cache_fn=None)

      Return a sequence of cached values based on a sequence of keys.

      The behavior for generation of values based on keys corresponds to that of Region.get_or_create(), with the exception that the creator() function may be asked to generate any subset of the given keys. The list of keys to be generated is passed to creator(), and creator() should return the generated values as a sequence corresponding to the order of the keys.

      The method uses the same approach as Region.get_multi() and Region.set_multi() to get and set values from the backend.

      Parameters:
      • keys – Sequence of keys to be retrieved.
      • creator – function which accepts a sequence of keys and returns a sequence of new values.
      • expiration_time – optional expiration time which will override the expiration time already configured on this CacheRegion if not None. To set no expiration, use the value -1.
      • should_cache_fn – optional callable function which will receive each value returned by the “creator”, and will then return True or False, indicating if the value should actually be cached or not. If it returns False, the value is still returned, but isn’t cached.

      New in version 0.5.0.
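Per the parameter description above, the creator receives only the keys that need (re)generation and must return values in matching order; a sketch of that contract, with the region call shown as a comment:

```python
def load_values(keys):
    # called with whatever subset of the requested keys is missing
    # or expired; the result must line up positionally with `keys`
    return [key.upper() for key in keys]

# values = region.get_or_create_multi(["a", "b", "c"], load_values)
```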

      invalidate(hard=True)

      Invalidate this CacheRegion.

      Invalidation works by setting a current timestamp (using time.time()) representing the “minimum creation time” for a value. Any retrieved value whose creation time is prior to this timestamp is considered to be stale. It does not affect the data in the cache in any way, and is also local to this instance of CacheRegion.

      Once set, the invalidation time is honored by the CacheRegion.get_or_create(), CacheRegion.get_or_create_multi() and CacheRegion.get() methods.

      The method supports both “hard” and “soft” invalidation options. With “hard” invalidation, CacheRegion.get_or_create() will force an immediate regeneration of the value which all getters will wait for. With “soft” invalidation, subsequent getters will return the “old” value until the new one is available.

      Usage of “soft” invalidation requires that the region or the method is given a non-None expiration time.

      New in version 0.3.0.

      Parameters:
      • hard –

      if True, cache values will all require immediate regeneration; dogpile logic won’t be used. If False, the creation time of existing values will be pushed back before the expiration time so that getters return the old value while a new one is generated.

      New in version 0.5.1.
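The timestamp mechanism described above can be modeled in isolation; this is an illustrative sketch of the semantics, not dogpile's implementation:

```python
import time

class InvalidationModel:
    # invalidation records a "minimum creation time"; any value
    # created before that instant is treated as stale on retrieval
    def __init__(self):
        self._invalidated_at = None

    def invalidate(self):
        self._invalidated_at = time.time()

    def is_stale(self, created_at):
        return (self._invalidated_at is not None
                and created_at < self._invalidated_at)
```

Nothing in the backend changes at invalidation time; only subsequent reads compare value creation times against the recorded timestamp.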

      is_configured

      Return True if the backend has been configured via the CacheRegion.configure() method already.

      New in version 0.5.1.

      set(key, value)

      Place a new value in the cache under the given key.

      set_multi(mapping)

      Place new values in the cache under the given keys.

      New in version 0.5.0.

      wrap(proxy)

      Takes a ProxyBackend instance or class and wraps the attached backend.
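The interception pattern can be sketched with stand-in classes; a real proxy would subclass dogpile.cache.proxy.ProxyBackend and be attached via region.wrap() or the wrap argument to configure():

```python
class DictBackend:
    # trivial stand-in backend for this sketch
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def set(self, key, value):
        self._data[key] = value

class LoggingProxy:
    # real code would subclass dogpile.cache.proxy.ProxyBackend;
    # each method records the call, then delegates to the wrapped backend
    def __init__(self, backend, log):
        self.proxied = backend
        self.log = log
    def get(self, key):
        self.log.append(("get", key))
        return self.proxied.get(key)
    def set(self, key, value):
        self.log.append(("set", key))
        self.proxied.set(key, value)

log = []
backend = LoggingProxy(DictBackend(), log)
backend.set("k", "v")
```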

      dogpile.cache.region.make_region(*arg, **kw)

      Instantiate a new CacheRegion.

      Currently, make_region() is a passthrough to CacheRegion. See that class for constructor arguments.

      dogpile.cache.region.value_version = 1

      An integer placed in the CachedValue so that new versions of dogpile.cache can detect cached values from a previous, backwards-incompatible version.

      dogpile.cache.util.function_key_generator(namespace, fn, to_str=<type 'str'>)

      Return a function that generates a string key, based on a given function as well as arguments to the returned function itself.

      This is used by CacheRegion.cache_on_arguments() to generate a cache key from a decorated function.

      It can be replaced using the function_key_generator argument passed to make_region().

      Backend API

      See the section Creating Backends for details on how to register new backends or Changing Backend Behavior for details on how to alter the behavior of existing backends.

      class dogpile.cache.api.CacheBackend(arguments)

      Base class for backend implementations.

      delete(key)

      Delete a value from the cache.

      The key will be whatever was passed to the registry, processed by the “key mangling” function, if any.

      The behavior here should be idempotent, that is, can be called any number of times regardless of whether or not the key exists.

      delete_multi(keys)

      Delete multiple values from the cache.

      The key will be whatever was passed to the registry, processed by the “key mangling” function, if any.

      The behavior here should be idempotent, that is, can be called any number of times regardless of whether or not the key exists.

      New in version 0.5.0.

      get(key)

      Retrieve a value from the cache.

      The returned value should be an instance of CachedValue, or NO_VALUE if not present.

      get_multi(keys)

      Retrieve multiple values from the cache.

      The returned value should be a list, corresponding to the list of keys given.

      New in version 0.5.0.

      get_mutex(key)

      Return an optional mutexing object for the given key.

      This object need only provide an acquire() and release() method.

      May return None, in which case the dogpile lock will use a regular threading.Lock object to mutex concurrent threads for value creation. The default implementation returns None.

      Different backends may want to provide various kinds of “mutex” objects, such as those which link to lock files, distributed mutexes, memcached semaphores, etc. Whatever kind of system is best suited for the scope and behavior of the caching backend.

      A mutex that takes the key into account will allow multiple regenerate operations across keys to proceed simultaneously, while a mutex that does not will serialize regenerate operations to just one at a time across all keys in the region. The latter approach, or a variant that involves a modulus of the given key’s hash value, can be used as a means of throttling the total number of value recreation operations that may proceed at one time.

      key_mangler = None

      Key mangling function.

      May be None, or otherwise declared as an ordinary instance method.

      set(key, value)

      Set a value in the cache.

      The key will be whatever was passed to the registry, processed by the “key mangling” function, if any. The value will always be an instance of CachedValue.

      set_multi(mapping)

      Set multiple values in the cache.

      The key will be whatever was passed to the registry, processed by the “key mangling” function, if any. The value will always be an instance of CachedValue.

      New in version 0.5.0.

      class dogpile.cache.api.CachedValue

      Represent a value stored in the cache.

      CachedValue is a two-tuple of (payload, metadata), where metadata is dogpile.cache’s tracking information (currently the creation time). The metadata and tuple structure are pickleable, if the backend requires serialization.

      metadata

      Named accessor for the dogpile.cache metadata dictionary.

      payload

      Named accessor for the payload.

      dogpile.cache.api.NO_VALUE = <dogpile.cache.api.NoValue object>

      Value returned from get() that describes a key not present.

      class dogpile.cache.api.NoValue

      Describe a missing cache value.

      The NO_VALUE module global should be used.
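      The reason for a dedicated sentinel rather than ``None`` is that ``None`` may itself be a legitimately cached value; “key not present” needs a distinct marker. A small sketch, using a local ``NoValue``/``NO_VALUE`` pair that mimics the API’s for illustration:

```python
# Why a sentinel rather than None: a cached value may legitimately
# be None, so "key not present" needs its own marker. This local
# NoValue/NO_VALUE pair mimics dogpile.cache.api's, for illustration.

class NoValue(object):
    """Describe a missing cache value."""

NO_VALUE = NoValue()

cache = {}
cache["answer"] = None   # None stored deliberately as a real value

def get(key):
    # NO_VALUE distinguishes "absent" from "present but None".
    return cache.get(key, NO_VALUE)
```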

      Backends

      Memory Backend

      Provides a simple dictionary-based backend.

      class dogpile.cache.backends.memory.MemoryBackend(arguments)

      A backend that uses a plain dictionary.

      There is no size management, and values which are placed into the dictionary will remain until explicitly removed. Note that Dogpile’s expiration of items is based on timestamps and does not remove them from the cache.

      E.g.:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.memory'
      )
      

      A Python dictionary of your choosing can be passed in with the cache_dict argument:

      my_dictionary = {}
      region = make_region().configure(
          'dogpile.cache.memory',
          arguments={
              "cache_dict":my_dictionary
          }
      )
      

      Memcached Backends

      Provides backends for talking to memcached.

      class dogpile.cache.backends.memcached.GenericMemcachedBackend(arguments)

      Base class for memcached backends.

      This base class accepts a number of parameters common to all backends.

      Parameters:
      • url – the string URL to connect to. Can be a single string or a list of strings. This is the only argument that’s required.
      • distributed_lock – boolean, when True, will use a memcached-lock as the dogpile lock (see MemcachedLock). Use this when multiple processes will be talking to the same memcached instance. When left at False, dogpile will coordinate on a regular threading mutex.
      • memcached_expire_time

        integer, when present will be passed as the time parameter to pylibmc.Client.set. This is used to set the memcached expiry time for a value.

        Note

        This parameter is different from Dogpile’s own expiration_time, which is the number of seconds after which Dogpile will consider the value to be expired. When Dogpile considers a value to be expired, it continues to use the value until generation of a new value is complete, when using CacheRegion.get_or_create(). Therefore, if you are setting memcached_expire_time, you’ll want to make sure it is greater than expiration_time by at least enough seconds for new values to be generated, else the value won’t be available during a regeneration, forcing all threads to wait for a regeneration each time a value expires.
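      A configuration sketch reflecting the note above: ``memcached_expire_time`` is padded beyond ``expiration_time`` so the stale value is still present in memcached to serve while a single thread regenerates it. The 600-second pad is an illustrative guess, not a recommended constant:

```python
from dogpile.cache import make_region

# expiration_time: when dogpile considers the value stale.
# memcached_expire_time: when memcached actually drops it.
# Pad the latter so the stale value remains fetchable during
# regeneration; 600 seconds is an illustrative headroom value.
region = make_region().configure(
    'dogpile.cache.pylibmc',
    expiration_time=3600,
    arguments={
        'url': ["127.0.0.1"],
        'memcached_expire_time': 3600 + 600
    }
)
```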

      The GenericMemcachedBackend uses a threading.local() object to store individual client objects per thread, as most modern memcached clients do not appear to be inherently threadsafe.

      In particular, threading.local() has the advantage over pylibmc’s built-in thread pool in that it automatically discards objects associated with a particular thread when that thread ends.
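      The per-thread client pattern described here can be sketched in plain Python; ``PerThreadClient`` is a hypothetical helper illustrating the technique, not the backend’s actual code:

```python
import threading

# Sketch of the per-thread client pattern described above: each
# thread lazily creates and caches its own client object, and the
# storage tied to a thread is discarded when that thread ends.

class PerThreadClient(object):
    def __init__(self, client_factory):
        self._factory = client_factory
        self._local = threading.local()

    @property
    def client(self):
        # First access on a given thread creates that thread's client.
        if not hasattr(self._local, "client"):
            self._local.client = self._factory()
        return self._local.client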

      client

      Return the memcached client.

      This uses a threading.local by default as it appears most modern memcached libs aren’t inherently threadsafe.

      set_arguments = {}

      Additional arguments which will be passed to the set() method.

      class dogpile.cache.backends.memcached.MemcachedBackend(arguments)

      A backend using the standard Python-memcached library.

      Example:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.memcached',
          expiration_time = 3600,
          arguments = {
              'url':"127.0.0.1:11211"
          }
      )
      
      class dogpile.cache.backends.memcached.PylibmcBackend(arguments)

      A backend for the pylibmc memcached client.

      A configuration illustrating several of the optional arguments described in the pylibmc documentation:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.pylibmc',
          expiration_time = 3600,
          arguments = {
              'url':["127.0.0.1"],
              'binary':True,
              'behaviors':{"tcp_nodelay": True,"ketama":True}
          }
      )
      

      Arguments accepted here include those of GenericMemcachedBackend, as well as those below.

      Parameters:
      • binary – sets the binary flag understood by pylibmc.Client.
      • behaviors – a dictionary which will be passed to pylibmc.Client as the behaviors parameter.
      • min_compress_len – Integer, will be passed as the min_compress_len parameter to the pylibmc.Client.set method.
      class dogpile.cache.backends.memcached.BMemcachedBackend(arguments)

      A backend for the python-binary-memcached memcached client.

      This is a pure Python memcached client which includes the ability to authenticate with a memcached server using SASL.

      A typical configuration using username/password:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.bmemcached',
          expiration_time = 3600,
          arguments = {
              'url':["127.0.0.1"],
              'username':'scott',
              'password':'tiger'
          }
      )
      

      Arguments which can be passed to the arguments dictionary include:

      Parameters:
      • username – optional username, will be used for SASL authentication.
      • password – optional password, will be used for SASL authentication.
      delete_multi(keys)

      The python-binary-memcached API does not implement delete_multi.
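      When a client lacks a native multi-delete, the usual fallback is to issue one delete per key. A sketch of that fallback; ``FakeClient`` and ``EmulatedMultiDeleteBackend`` are illustrative stand-ins, not the library’s classes:

```python
# When a memcached client has no native multi-delete, delete_multi()
# can be emulated with per-key delete() calls. FakeClient here is an
# in-memory stand-in, not a real memcached client.

class FakeClient(object):
    def __init__(self):
        self.store = {}

    def delete(self, key):
        self.store.pop(key, None)

class EmulatedMultiDeleteBackend(object):
    def __init__(self, client):
        self.client = client

    def delete_multi(self, keys):
        # Fallback: one round trip per key instead of a single
        # multi-delete command.
        for key in keys:
            self.client.delete(key)
```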

      class dogpile.cache.backends.memcached.MemcachedLock(client_fn, key)

      Simple distributed lock using memcached.

      This is an adaptation of the lock featured at http://amix.dk/blog/post/19386
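      Locks of this kind typically rely on memcached’s atomic add(): only one client can add a given key, and that client holds the lock until it deletes the key. A self-contained sketch of the idea, using an in-memory stand-in client (``DictMemcached`` and ``MemcachedLockSketch`` are illustrative names, not the actual implementation):

```python
import time

# Sketch of a distributed lock built on memcached's atomic add():
# add() succeeds only if the key is absent, so whichever client adds
# the lock key first holds the lock; release() deletes the key.
# DictMemcached is an in-memory stand-in for a real client.

class DictMemcached(object):
    def __init__(self):
        self.store = {}

    def add(self, key, value):
        if key in self.store:
            return False
        self.store[key] = value
        return True

    def delete(self, key):
        self.store.pop(key, None)

class MemcachedLockSketch(object):
    def __init__(self, client, key, sleep_time=0.05):
        self.client = client
        self.key = "_lock" + key
        self.sleep_time = sleep_time

    def acquire(self, wait=True):
        while True:
            if self.client.add(self.key, 1):
                return True
            elif not wait:
                return False
            # Poll until the current holder releases the key.
            time.sleep(self.sleep_time)

    def release(self):
        self.client.delete(self.key)
```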

      Redis Backends

      Provides backends for talking to Redis.

      class dogpile.cache.backends.redis.RedisBackend(arguments)

      A Redis backend, using the redis-py backend.

      Example configuration:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.redis',
          arguments = {
              'host': 'localhost',
              'port': 6379,
              'db': 0,
              'redis_expiration_time': 60*60*2,   # 2 hours
              'distributed_lock':True
              }
      )
      

      Arguments accepted in the arguments dictionary:

      Parameters:
      • url

        string. If provided, will override separate host/port/db params. The format is that accepted by StrictRedis.from_url().

        New in version 0.4.1.

      • host – string, default is localhost.
      • password

        string, default is no password.

        New in version 0.4.1.

      • port – integer, default is 6379.
      • db – integer, default is 0.
      • redis_expiration_time – integer, number of seconds after setting a value that Redis should expire it. This should be larger than dogpile’s cache expiration. By default no expiration is set.
      • distributed_lock – boolean, when True, will use a redis-lock as the dogpile lock. Use this when multiple processes will be talking to the same redis instance. When left at False, dogpile will coordinate on a regular threading mutex.
      • lock_timeout

        integer, number of seconds after acquiring a lock that Redis should expire it. This argument is only valid when distributed_lock is True.

        New in version 0.5.0.

      • lock_sleep

        integer, number of seconds to sleep when failed to acquire a lock. This argument is only valid when distributed_lock is True.

        New in version 0.5.0.
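      A configuration sketch combining the url form with the distributed-lock options described above; the URL and timing values are illustrative only:

```python
from dogpile.cache import make_region

# Alternative configuration using the "url" argument (which
# overrides host/port/db) together with the distributed-lock
# options; all values shown are illustrative.
region = make_region().configure(
    'dogpile.cache.redis',
    arguments={
        'url': "redis://localhost:6379/0",
        'redis_expiration_time': 60 * 60 * 2,  # 2 hours
        'distributed_lock': True,
        'lock_timeout': 30,   # Redis expires an abandoned lock
        'lock_sleep': 1       # seconds between acquire retries
    }
)
```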

      File Backends

      Provides backends that deal with local filesystem access.

      class dogpile.cache.backends.file.DBMBackend(arguments)

      A file-backend using a dbm file to store keys.

      Basic usage:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.dbm',
          expiration_time = 3600,
          arguments = {
              "filename":"/path/to/cachefile.dbm"
          }
      )
      

      DBM access is provided using the Python anydbm module, which selects a platform-specific dbm module to use. This may be made more configurable in a future release.

      Note that different dbm modules have different behaviors. Some dbm implementations handle their own locking, while others don’t. The DBMBackend uses a read/write lockfile by default, which is compatible even with those DBM implementations for which this is unnecessary, though the behavior can be disabled.

      The DBM backend by default makes use of two lockfiles. One is in order to protect the DBM file itself from concurrent writes, the other is to coordinate value creation (i.e. the dogpile lock). By default, these lockfiles use the flock() system call for locking; this is only available on Unix platforms.

      Currently, the dogpile lock is against the entire DBM file, not per key. This means there can only be one “creator” job running at a time per dbm file.

      A future improvement might be to have the dogpile lock using a filename that’s based on a modulus of the key. Locking on a filename that uniquely corresponds to the key is problematic, since it’s not generally safe to delete lockfiles as the application runs, implying an unlimited number of key-based files would need to be created and never deleted.

      Parameters to the arguments dictionary are below.

      Parameters:
      • filename – path of the filename in which to create the DBM file. Note that some dbm backends will change this name to have additional suffixes.
      • rw_lockfile – the name of the file to use for read/write locking. If omitted, a default name is used by appending the suffix “.rw.lock” to the DBM filename. If False, then no lock is used.
      • dogpile_lockfile – the name of the file to use for value creation, i.e. the dogpile lock. If omitted, a default name is used by appending the suffix “.dogpile.lock” to the DBM filename. If False, then dogpile.cache uses the default dogpile lock, a plain thread-based mutex.
      class dogpile.cache.backends.file.FileLock(filename)

      Use lockfiles to coordinate read/write access to a file.

      Only works on Unix systems, using fcntl.flock().

      Proxy Backends

      Provides a utility and a decorator class that allow for modifying the behavior of different backends without altering the class itself or having to extend the base backend.

      New in version 0.5.0: Added support for the ProxyBackend class.

      class dogpile.cache.proxy.ProxyBackend(*args, **kwargs)

      A decorator class for altering the functionality of backends.

      Basic usage:

      from dogpile.cache import make_region
      from dogpile.cache.proxy import ProxyBackend
      
      class MyFirstProxy(ProxyBackend):
          def get(self, key):
              # ... custom code goes here ...
              return self.proxied.get(key)
      
          def set(self, key, value):
              # ... custom code goes here ...
              self.proxied.set(key)
      
      class MySecondProxy(ProxyBackend):
          def get(self, key):
              # ... custom code goes here ...
              return self.proxied.get(key)
      
      
      region = make_region().configure(
          'dogpile.cache.dbm',
          expiration_time = 3600,
          arguments = {
              "filename":"/path/to/cachefile.dbm"
          },
          wrap = [ MyFirstProxy, MySecondProxy ]
      )
      

      Classes that extend ProxyBackend can be stacked together. The .proxied property will always point to either the concrete backend instance or the next proxy in the chain to which a method can be delegated.

      New in version 0.5.0.

      wrap(backend)

      Take a backend as an argument and set up the self.proxied property. Return an object that can be used as a backend by a CacheRegion object.

      Plugins

      Mako Integration

      dogpile.cache includes a Mako plugin that replaces Beaker as the cache backend. Set up a Mako template lookup using the “dogpile.cache” cache implementation and a region dictionary:

      from dogpile.cache import make_region
      from mako.lookup import TemplateLookup
      
      my_regions = {
          "local":make_region().configure(
                      "dogpile.cache.dbm",
                      expiration_time=360,
                      arguments={"filename":"file.dbm"}
                  ),
          "memcached":make_region().configure(
                      "dogpile.cache.pylibmc",
                      expiration_time=3600,
                      arguments={"url":["127.0.0.1"]}
                  )
      }
      
      mako_lookup = TemplateLookup(
          directories=["/myapp/templates"],
          cache_impl="dogpile.cache",
          cache_args={
              'regions':my_regions
          }
      )
      

      To use the above configuration in a template, use the cached=True argument on any Mako tag which accepts it, in conjunction with the name of the desired region as the cache_region argument:

      <%def name="mysection()" cached="True" cache_region="memcached">
          some content that's cached
      </%def>
      class dogpile.cache.plugins.mako_cache.MakoPlugin(cache)

      A Mako CacheImpl which talks to dogpile.cache.

      Utilities

      dogpile.cache.util.function_key_generator(namespace, fn, to_str=str)

      Return a function that generates a string key, based on a given function as well as arguments to the returned function itself.

      This is used by CacheRegion.cache_on_arguments() to generate a cache key from a decorated function.

      It can be replaced using the function_key_generator argument passed to make_region().
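      The general shape of such a key generator can be sketched in plain Python. This is an illustrative reimplementation in the spirit of function_key_generator(), not the exact library code; ``simple_key_generator`` is a hypothetical name:

```python
# Illustrative sketch of a function-based key generator: the
# returned callable turns the decorated function's positional
# arguments into a flat string key, prefixed by the function's
# module/name and an optional namespace.

def simple_key_generator(namespace, fn, to_str=str):
    if namespace is None:
        namespace = "%s:%s" % (fn.__module__, fn.__name__)
    else:
        namespace = "%s:%s|%s" % (fn.__module__, fn.__name__, namespace)

    def generate_key(*args, **kw):
        # Keyword arguments have no stable ordering guarantee here,
        # so this sketch rejects them, as a simple safety measure.
        if kw:
            raise ValueError("keyword arguments not supported")
        return namespace + "|" + " ".join(map(to_str, args))
    return generate_key
```

      A region could accept a generator like this via the function_key_generator argument to make_region(), so that cache_on_arguments() builds keys with it.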

      dogpile.cache.util.sha1_mangle_key(key)

      A SHA1 key mangler.

      dogpile.cache.util.length_conditional_mangler(length, mangler)

      A key mangler that mangles the key if its length is past a certain threshold.
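      A typical combined use is working around memcached’s 250-character key limit. The sketch below uses stdlib equivalents of these two utilities; they are illustrative reimplementations, not the library code itself:

```python
import hashlib

# Illustrative stdlib equivalents of the two manglers above: keys
# at or past the threshold (e.g. memcached's 250-character limit)
# are replaced by their SHA1 hex digest; shorter keys pass through.

def sha1_mangle_key(key):
    if isinstance(key, str):
        key = key.encode("utf-8")
    return hashlib.sha1(key).hexdigest()

def length_conditional_mangler(length, mangler):
    def mangle(key):
        if len(key) >= length:
            return mangler(key)
        return key
    return mangle

# Mangle only keys of 250 characters or more.
mangler = length_conditional_mangler(250, sha1_mangle_key)
```

      A region would typically receive such a mangler via the key_mangler argument so that every backend key passes through it.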

      dogpile.cache-0.5.1/docs/build/0000755000076500000240000000000012225644023016775 5ustar classicstaff00000000000000dogpile.cache-0.5.1/docs/build/api.rst0000644000076500000240000000175112225642516020311 0ustar classicstaff00000000000000=== API === Region ====== .. automodule:: dogpile.cache.region :members: .. autofunction:: dogpile.cache.util.function_key_generator Backend API ============= See the section :ref:`creating_backends` for details on how to register new backends or :ref:`changing_backend_behavior` for details on how to alter the behavior of existing backends. .. automodule:: dogpile.cache.api :members: Backends ========== .. automodule:: dogpile.cache.backends.memory :members: .. automodule:: dogpile.cache.backends.memcached :members: .. automodule:: dogpile.cache.backends.redis :members: .. automodule:: dogpile.cache.backends.file :members: .. automodule:: dogpile.cache.proxy :members: Plugins ======== .. automodule:: dogpile.cache.plugins.mako_cache :members: Utilities ========= .. currentmodule:: dogpile.cache.util .. autofunction:: function_key_generator .. autofunction:: sha1_mangle_key .. autofunction:: length_conditional_mangler dogpile.cache-0.5.1/docs/build/builder.py0000644000076500000240000000041212225642516020777 0ustar classicstaff00000000000000 def autodoc_skip_member(app, what, name, obj, skip, options): if what == 'class' and skip and name in ('__init__',) and obj.__doc__: return False else: return skip def setup(app): app.connect('autodoc-skip-member', autodoc_skip_member) dogpile.cache-0.5.1/docs/build/changelog.rst0000644000076500000240000002652112225643265021473 0ustar classicstaff00000000000000============== Changelog ============== .. changelog:: :version: 0.5.1 :released: Thu Oct 10 2013 .. change:: :tags: feature :tickets: 38 The :meth:`.CacheRegion.invalidate` method now supports an option ``hard=True|False``. 
A "hard" invalidation, equivalent to the existing functionality of :meth:`.CacheRegion.invalidate`, means :meth:`.CacheRegion.get_or_create` will not return the "old" value at all, forcing all getters to regenerate or wait for a regeneration. "soft" invalidation means that getters can continue to return the old value until a new one is generated. .. change:: :tags: feature :tickets: 40 New dogpile-specific exception classes have been added, so that issues like "region already configured", "region unconfigured", raise dogpile-specific exceptions. Other exception classes have been made more specific. Also added new accessor :attr:`.CacheRegion.is_configured`. Pullreq courtesy Morgan Fainberg. .. change:: :tags: bug Erroneously missed when the same change was made for ``set()`` in 0.5.0, the Redis backend now uses ``pickle.HIGHEST_PROTOCOL`` for the ``set_multi()`` method as well when producing pickles. Courtesy Łukasz Fidosz. .. change:: :tags: bug, redis, py3k :tickets: 39 Fixed an errant ``u''`` causing incompatibility in Python3.2 in the Redis backend, courtesy Jimmey Mabey. .. change:: :tags: bug The :func:`.util.coerce_string_conf` method now correctly coerces negative integers and those with a leading + sign. This previously prevented configuring a :class:`.CacheRegion` with an ``expiration_time`` of ``'-1'``. Courtesy David Beitey. .. change:: :tags: bug The ``refresh()`` method on :meth:`.CacheRegion.cache_multi_on_arguments` now supports the ``asdict`` flag. .. changelog:: :version: 0.5.0 :released: Fri Jun 21 2013 .. change:: :tags: misc Source repository has been moved to git. .. change:: :tags: bug The Redis backend now uses ``pickle.HIGHEST_PROTOCOL`` when producing pickles. Courtesy Lx Yu. .. change:: :tags: bug :meth:`.CacheRegion.cache_on_arguments` now has a new argument ``to_str``, defaults to ``str()``. Can be replaced with ``unicode()`` or other functions to support caching of functions that accept non-unicode arguments. 
Initial patch courtesy Lx Yu. .. change:: :tags: feature Now using the ``Lock`` included with the Python ``redis`` backend, which adds ``lock_timeout`` and ``lock_sleep`` arguments to the :class:`.RedisBackend`. .. change:: :tags: feature :tickets: 33, 35 Added new methods :meth:`.CacheRegion.get_or_create_multi` and :meth:`.CacheRegion.cache_multi_on_arguments`, which make use of the :meth:`.CacheRegion.get_multi` and similar functions to store and retrieve multiple keys at once while maintaining dogpile semantics for each. .. change:: :tags: feature :tickets: 36 Added a method ``refresh()`` to functions decorated by :meth:`.CacheRegion.cache_on_arguments` and :meth:`.CacheRegion.cache_multi_on_arguments`, to complement ``invalidate()`` and ``set()``. .. change:: :tags: feature :tickets: 13 :meth:`.CacheRegion.configure` accepts an optional ``datetime.timedelta`` object for the ``expiration_time`` argument as well as an integer, courtesy Jack Lutz. .. change:: :tags: feature :tickets: 20 The ``expiration_time`` argument passed to :meth:`.CacheRegion.cache_on_arguments` may be a callable, to return a dynamic timeout value. Courtesy David Beitey. .. change:: :tags: feature :tickets: 26 Added support for simple augmentation of existing backends using the :class:`.ProxyBackend` class. Thanks to Tim Hanus for the great effort with development, testing, and documentation. .. change:: :tags: feature :pullreq: 14 Full support for multivalue get/set/delete added, using :meth:`.CacheRegion.get_multi`, :meth:`.CacheRegion.set_multi`, :meth:`.CacheRegion.delete_multi`, courtesy Marcos Araujo Sobrinho. .. change:: :tags: bug :tickets: 27 Fixed bug where the "name" parameter for :class:`.CacheRegion` was ignored entirely. Courtesy Wichert Akkerman. .. changelog:: :version: 0.4.3 :released: Thu Apr 4 2013 .. 
change:: :tags: bug Added support for the ``cache_timeout`` Mako argument to the Mako plugin, which will pass the value to the ``expiration_time`` argument of :meth:`.CacheRegion.get_or_create`. .. change:: :tags: feature :pullreq: 13 :meth:`.CacheRegion.get_or_create` and :meth:`.CacheRegion.cache_on_arguments` now accept a new argument ``should_cache_fn``, receives the value returned by the "creator" and then returns True or False, where True means "cache plus return", False means "return the value but don't cache it." .. changelog:: :version: 0.4.2 :released: Sat Jan 19 2013 .. change:: :tags: feature :pullreq: 10 An "async creator" function can be specified to :class:`.CacheRegion` which allows the "creation" function to be called asynchronously or be subsituted for another asynchronous creation scheme. Courtesy Ralph Bean. .. changelog:: :version: 0.4.1 :released: Sat Dec 15 2012 .. change:: :tags: feature :pullreq: 9 The function decorated by :meth:`.CacheRegion.cache_on_arguments` now includes a ``set()`` method, in addition to the existing ``invalidate()`` method. Like ``invalidate()``, it accepts a set of function arguments, but additionally accepts as the first positional argument a new value to place in the cache, to take the place of that key. Courtesy Antoine Bertin. .. change:: :tags: bug :tickets: 15 Fixed bug in DBM backend whereby if an error occurred during the "write" operation, the file lock, if enabled, would not be released, thereby deadlocking the app. .. change:: :tags: bug :tickets: 12 The :func:`.util.function_key_generator` used by the function decorator no longer coerces non-unicode arguments into a Python unicode object on Python 2.x; this causes failures on backends such as DBM which on Python 2.x apparently require bytestrings. The key_mangler is still needed if actual unicode arguments are being used by the decorated function, however. .. 
change:: :tags: feature Redis backend now accepts optional "url" argument, will be passed to the new ``StrictRedis.from_url()`` method to determine connection info. Courtesy Jon Rosebaugh. .. change:: :tags: feature Redis backend now accepts optional "password" argument. Courtesy Jon Rosebaugh. .. change:: :tags: feature DBM backend has "fallback" when calling dbm.get() to instead use dictionary access + KeyError, in the case that the "gdbm" backend is used which does not include .get(). Courtesy Jon Rosebaugh. .. changelog:: :version: 0.4.0 :released: Tue Oct 30 2012 .. change:: :tags: bug :tickets: 1 Using dogpile.core 0.4.0 now, fixes a critical bug whereby dogpile pileup could occur on first value get across multiple processes, due to reliance upon a non-shared creation time. This is a dogpile.core issue. .. change:: :tags: bug :tickets: Fixed missing __future__ with_statement directive in region.py. .. changelog:: :version: 0.3.1 :released: Tue Sep 25 2012 .. change:: :tags: bug :tickets: Fixed the mako_cache plugin which was not yet covered, and wasn't implementing the mako plugin API correctly; fixed docs as well. Courtesy Ben Hayden. .. change:: :tags: bug :tickets: Fixed setup so that the tests/* directory isn't yanked into the install. Courtesy Ben Hayden. .. changelog:: :version: 0.3.0 :released: Thu Jun 14 2012 .. change:: :tags: feature :tickets: get() method now checks expiration time by default. Use ignore_expiration=True to bypass this. .. change:: :tags: feature :tickets: 7 Added new invalidate() method. Sets the current timestamp as a minimum value that all retrieved values must be created after. Is honored by the get_or_create() and get() methods. .. change:: :tags: bug :tickets: 8 Fixed bug whereby region.get() didn't work if the value wasn't present. .. changelog:: :version: 0.2.4 :released: .. change:: :tags: :tickets: Fixed py3k issue with config string coerce, courtesy Alexander Fedorov .. 
changelog:: :version: 0.2.3 :released: Wed May 16 2012 .. change:: :tags: :tickets: 3 support "min_compress_len" and "memcached_expire_time" with python-memcached backend. Tests courtesy Justin Azoff .. change:: :tags: :tickets: 4 Add support for coercion of string config values to Python objects - ints, "false", "true", "None". .. change:: :tags: :tickets: 5 Added support to DBM file lock to allow reentrant access per key within a single thread, so that even though the DBM backend locks for the whole file, a creation function that calls upon a different key in the cache can still proceed. .. change:: :tags: :tickets: Fixed DBM glitch where multiple readers could be serialized. .. change:: :tags: :tickets: Adjust bmemcached backend to work with newly-repaired bmemcached calling API (see bmemcached ef206ed4473fec3b639e). .. changelog:: :version: 0.2.2 :released: Thu Apr 19 2012 .. change:: :tags: :tickets: add Redis backend, courtesy Ollie Rutherfurd .. changelog:: :version: 0.2.1 :released: Sun Apr 15 2012 .. change:: :tags: :tickets: move tests into tests/cache namespace .. change:: :tags: :tickets: py3k compatibility is in-place now, no 2to3 needed. .. changelog:: :version: 0.2.0 :released: Sat Apr 14 2012 .. change:: :tags: :tickets: Based on dogpile.core now, to get the package namespace thing worked out. .. changelog:: :version: 0.1.1 :released: Tue Apr 10 2012 .. change:: :tags: :tickets: Fixed the configure_from_config() method of region and backend which wasn't working. Courtesy Christian Klinger. .. changelog:: :version: 0.1.0 :released: Sun Apr 08 2012 .. change:: :tags: :tickets: Initial release. .. change:: :tags: :tickets: Includes a pylibmc backend and a plain dictionary backend. dogpile.cache-0.5.1/docs/build/conf.py0000644000076500000240000001554112225642516020307 0ustar classicstaff00000000000000# -*- coding: utf-8 -*- # # Dogpile.cache documentation build configuration file, created by # sphinx-quickstart on Sat May 1 12:47:55 2010. 
# # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys, os # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. #sys.path.append(os.path.abspath('.')) # If your extensions are in another directory, add it here. If the directory # is relative to the documentation root, use os.path.abspath to make it # absolute, like shown here. sys.path.insert(0, os.path.abspath('../../')) import dogpile.cache # -- General configuration ----------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'changelog'] changelog_sections = ["feature", "bug"] changelog_render_ticket = "https://bitbucket.org/zzzeek/dogpile.cache/issue/%s" changelog_render_pullreq = "https://bitbucket.org/zzzeek/dogpile.cache/pull-request/%s" changelog_render_changeset = "https://bitbucket.org/zzzeek/dogpile.cache/changeset/%s" # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8' # The master toctree document. master_doc = 'index' # General information about the project. project = u'dogpile.cache' copyright = u'2011-2013 Mike Bayer' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. 
version = dogpile.cache.__version__ # The full version, including alpha/beta/rc tags. release = dogpile.cache.__version__ # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of documents that shouldn't be included in the build. #unused_docs = [] # List of directories, relative to source directory, that shouldn't be searched # for source files. exclude_trees = [] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. Major themes that come with # Sphinx are currently 'default' and 'sphinxdoc'. html_theme = 'nature' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. 
Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_use_modindex = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = '' # Output file base name for HTML help builder. htmlhelp_basename = 'dogpile.cachedoc' # -- Options for LaTeX output -------------------------------------------------- # The paper size ('letter' or 'a4'). 
#latex_paper_size = 'letter' # The font size ('10pt', '11pt' or '12pt'). #latex_font_size = '10pt' # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ('index', 'dogpile.cache.tex', u'Dogpile.Cache Documentation', u'Mike Bayer', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # Additional stuff for the LaTeX preamble. #latex_preamble = '' # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_use_modindex = True #{'python': ('http://docs.python.org/3.2', None)} dogpile.cache-0.5.1/docs/build/front.rst0000644000076500000240000000345412225642516020672 0ustar classicstaff00000000000000============ Front Matter ============ Information about the dogpile.cache project. Project Homepage ================ dogpile.cache is hosted on `Bitbucket `_ - the lead project page is at https://bitbucket.org/zzzeek/dogpile.cache. Source code is tracked here using Git. .. versionchanged:: 0.5.0 Moved source repository to git. Releases and project status are available on Pypi at http://pypi.python.org/pypi/dogpile.cache. The most recent published version of this documentation should be at http://dogpilecache.readthedocs.org. Installation ============ Install released versions of dogpile.cache from the Python package index with `pip `_ or a similar tool:: pip install dogpile.cache Installation via source distribution is via the ``setup.py`` script:: python setup.py install Community ========= dogpile.cache is developed by `Mike Bayer `_, and is loosely associated with the `Pylons Project `_. 
As dogpile.cache's usage increases, it is anticipated that the Pylons mailing list and IRC channel will become the primary channels for support. Bugs ==== Bugs and feature enhancements to dogpile.cache should be reported on the `Bitbucket issue tracker `_. If you're not sure that a particular issue is specific to either dogpile.cache or `dogpile.core `_, posting to the dogpile.cache tracker is likely the better place to post first. * `dogpile.cache issue tracker `_ (post here if unsure) * `dogpile.core issue tracker `_ dogpile.cache-0.5.1/docs/build/index.rst0000644000076500000240000000172712225642516020652 0ustar classicstaff00000000000000========================================== Welcome to dogpile.cache's documentation! ========================================== `dogpile.cache `_ provides a simple caching pattern based on the `dogpile.core `_ locking system, including rudimentary backends. It effectively completes the replacement of `Beaker `_ as far as caching (though **not** HTTP sessions) is concerned, providing an open-ended, simple, and higher-performing pattern to configure and use cache backends. New backends are very easy to create and use; users are encouraged to adapt the provided backends for their own needs, as high volume caching requires lots of tweaks and adjustments specific to an application and its environment. .. toctree:: :maxdepth: 2 front usage api changelog Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` dogpile.cache-0.5.1/docs/build/Makefile0000644000076500000240000000635012225642516020446 0ustar classicstaff00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = output # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
.PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest help: @echo "Please use \`make <target>' where <target> is one of" @echo " html to make standalone HTML files" @echo " dist-html same as html, but places files in /doc" @echo " dirhtml to make HTML files named index.html in directories" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dist-html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) .. @echo @echo "Build finished. The HTML pages are in ../." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." 
qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Alembic.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Alembic.qhc" latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \ "run these through (pdf)latex." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." dogpile.cache-0.5.1/docs/build/requirements.txt0000644000076500000240000000003012225642516022257 0ustar classicstaff00000000000000mako dogpile changelog dogpile.cache-0.5.1/docs/build/usage.rst0000644000076500000240000003111012225642516020634 0ustar classicstaff00000000000000============ Usage Guide ============ Overview ======== At the time of this writing, popular key/value servers include `Memcached `_, `Redis `_, and `Riak `_. While these tools all have different usage focuses, they all have in common that the storage model is based on the retrieval of a value based on a key; as such, they are all potentially suitable for caching, particularly Memcached which is first and foremost designed for caching. With a caching system in mind, dogpile.cache provides an interface to a particular Python API targeted at that system. 
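The storage model these servers share can be sketched in a few lines of plain Python; the ``KeyValueStore`` class below is a hypothetical in-memory stand-in for a real Memcached or Redis client, not part of dogpile.cache:

```python
# Minimal sketch of the key/value storage model shared by Memcached,
# Redis and similar servers.  KeyValueStore is a hypothetical stand-in,
# not a dogpile.cache class.
class KeyValueStore(object):
    def __init__(self):
        self._data = {}

    def get(self, key):
        # a miss returns None, as many client libraries do
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

    def delete(self, key):
        self._data.pop(key, None)

store = KeyValueStore()
store.set("user:5", "some value")
assert store.get("user:5") == "some value"
store.delete("user:5")
assert store.get("user:5") is None
```

Everything a cache region needs from such a server reduces to these few operations, which is why very different servers can sit behind the same dogpile.cache interface.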
A dogpile.cache configuration consists of the following components: * A *region*, which is an instance of :class:`.CacheRegion`, and defines the configuration details for a particular cache backend. The :class:`.CacheRegion` can be considered the "front end" used by applications. * A *backend*, which is an instance of :class:`.CacheBackend`, describing how values are stored and retrieved from a backend. This interface specifies only :meth:`~.CacheBackend.get`, :meth:`~.CacheBackend.set` and :meth:`~.CacheBackend.delete`. The actual kind of :class:`.CacheBackend` in use for a particular :class:`.CacheRegion` is determined by the underlying Python API being used to talk to the cache, such as Pylibmc. The :class:`.CacheBackend` is instantiated behind the scenes and not directly accessed by applications under normal circumstances. * Value generation functions. These are user-defined functions that generate new values to be placed in the cache. While dogpile.cache offers the usual "set" approach of placing data into the cache, the usual mode of usage is to only instruct it to "get" a value, passing it a *creation function* which will be used to generate a new value if and only if one is needed. This "get-or-create" pattern is the entire key to the "Dogpile" system, which coordinates a single value creation operation among many concurrent get operations for a particular key, eliminating the issue of an expired value being redundantly re-generated by many workers simultaneously. Rudimentary Usage ================= dogpile.cache includes a Pylibmc backend. A basic configuration looks like:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.pylibmc', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], } ) @region.cache_on_arguments() def load_user_info(user_id): return some_database.lookup_user_by_id(user_id) .. 
sidebar:: pylibmc In this section, we're illustrating Memcached usage using the `pylibmc `_ backend, which is a high-performing Python library for Memcached. It can be compared to the `python-memcached `_ client, which is also an excellent product. Pylibmc is written against Memcached's native API and so is markedly faster, though might be considered to have rougher edges. The API is actually a bit more verbose to allow for correct multithreaded usage. Above, we create a :class:`.CacheRegion` using the :func:`.make_region` function, then apply the backend configuration via the :meth:`.CacheRegion.configure` method, which returns the region. The name of the backend is the only argument required by :meth:`.CacheRegion.configure` itself, in this case ``dogpile.cache.pylibmc``. However, in this specific case, the ``pylibmc`` backend also requires that the URL of the memcached server be passed within the ``arguments`` dictionary. The configuration is separated into two sections. Upon construction via :func:`.make_region`, the :class:`.CacheRegion` object is available, typically at module import time, for usage in decorating functions. Additional configuration details passed to :meth:`.CacheRegion.configure` are typically loaded from a configuration file and therefore not necessarily available until runtime, hence the two-step configuration process. Key arguments passed to :meth:`.CacheRegion.configure` include *expiration_time*, which is the expiration time passed to the Dogpile lock, and *arguments*, which are arguments used directly by the backend - in this case we are using arguments that are passed directly to the pylibmc module. Region Configuration ==================== The :func:`.make_region` function currently calls the :class:`.CacheRegion` constructor directly. ..
autoclass:: dogpile.cache.region.CacheRegion :noindex: Once you have a :class:`.CacheRegion`, the :meth:`.CacheRegion.cache_on_arguments` method can be used to decorate functions, but the cache itself can't be used until :meth:`.CacheRegion.configure` is called. The interface for that method is as follows: .. automethod:: dogpile.cache.region.CacheRegion.configure :noindex: The :class:`.CacheRegion` can also be configured from a dictionary, using the :meth:`.CacheRegion.configure_from_config` method: .. automethod:: dogpile.cache.region.CacheRegion.configure_from_config :noindex: Using a Region ============== The :class:`.CacheRegion` object is our front-end interface to a cache. It includes the following methods: .. automethod:: dogpile.cache.region.CacheRegion.get :noindex: .. automethod:: dogpile.cache.region.CacheRegion.get_or_create :noindex: .. automethod:: dogpile.cache.region.CacheRegion.set :noindex: .. automethod:: dogpile.cache.region.CacheRegion.delete :noindex: .. automethod:: dogpile.cache.region.CacheRegion.cache_on_arguments :noindex: .. _creating_backends: Creating Backends ================= Backends are located using the setuptools entrypoint system. To make life easier for writers of ad-hoc backends, a helper function is included which registers any backend in the same way as if it were part of the existing sys.path. For example, to create a backend called ``DictionaryBackend``, we subclass :class:`.CacheBackend`:: from dogpile.cache.api import CacheBackend, NO_VALUE class DictionaryBackend(CacheBackend): def __init__(self, arguments): self.cache = {} def get(self, key): return self.cache.get(key, NO_VALUE) def set(self, key, value): self.cache[key] = value def delete(self, key): self.cache.pop(key) Then make sure the class is available underneath the entrypoint ``dogpile.cache``. 
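Before wiring up any entrypoint, the ``DictionaryBackend`` above can be exercised directly; here it is restated as a self-contained sketch, with a local ``object()`` standing in for the ``NO_VALUE`` sentinel normally imported from ``dogpile.cache.api``:

```python
# Self-contained restatement of the DictionaryBackend example.
# NO_VALUE here is a local stand-in for the sentinel normally
# imported from dogpile.cache.api.
NO_VALUE = object()

class DictionaryBackend(object):
    def __init__(self, arguments):
        self.cache = {}

    def get(self, key):
        # a cache miss must return the NO_VALUE sentinel, not None
        return self.cache.get(key, NO_VALUE)

    def set(self, key, value):
        self.cache[key] = value

    def delete(self, key):
        self.cache.pop(key)

backend = DictionaryBackend({})
assert backend.get("somekey") is NO_VALUE
backend.set("somekey", "somevalue")
assert backend.get("somekey") == "somevalue"
backend.delete("somekey")
assert backend.get("somekey") is NO_VALUE
```

The ``NO_VALUE`` sentinel distinguishes "key not present" from a legitimately cached ``None`` payload, which is why the backend contract uses it rather than ``None``.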
If we did this in a ``setup.py`` file, it would be in ``setup()`` as:: entry_points=""" [dogpile.cache] dictionary = mypackage.mybackend:DictionaryBackend """ Alternatively, if we want to register the plugin in the same process space without bothering to install anything, we can use ``register_backend``:: from dogpile.cache import register_backend register_backend("dictionary", "mypackage.mybackend", "DictionaryBackend") Our new backend would be usable in a region like this:: from dogpile.cache import make_region region = make_region().configure("dictionary") data = region.set("somekey", "somevalue") The values we receive for the backend here are instances of ``CachedValue``. This is a tuple subclass of length two, of the form:: (payload, metadata) Where "payload" is the thing being cached, and "metadata" is information we store in the cache - a dictionary which currently has just the "creation time" and a "version identifier" as key/values. If the cache backend requires serialization, pickle or similar can be used on the tuple - the "metadata" portion will always be a small and easily serializable Python structure. .. _changing_backend_behavior: Changing Backend Behavior ========================= The :class:`.ProxyBackend` is a decorator class provided to easily augment existing backend behavior without having to extend the original class. Using a decorator class is also advantageous as it allows us to share the altered behavior between different backends. Proxies are added to the :class:`.CacheRegion` object using the :meth:`.CacheRegion.configure` method. Only the overridden methods need to be specified and the real backend can be accessed with the ``self.proxied`` object from inside the :class:`.ProxyBackend`. 
For example, a simple class to log all calls to ``.set()`` would look like this:: from dogpile.cache.proxy import ProxyBackend import logging log = logging.getLogger(__name__) class LoggingProxy(ProxyBackend): def set(self, key, value): log.debug('Setting Cache Key: %s' % key) self.proxied.set(key, value) :class:`.ProxyBackend` can be configured to optionally take arguments (as long as the :meth:`.ProxyBackend.__init__` method is called properly, either directly or via ``super()``). In the example below, the ``RetryDeleteProxy`` class accepts a ``retry_count`` parameter on initialization. In the event of an exception on delete(), it will retry this many times before returning:: from dogpile.cache.proxy import ProxyBackend class RetryDeleteProxy(ProxyBackend): def __init__(self, retry_count=5): super(RetryDeleteProxy, self).__init__() self.retry_count = retry_count def delete(self, key): retries = self.retry_count while retries > 0: retries -= 1 try: self.proxied.delete(key) return except: pass The ``wrap`` parameter of :meth:`.CacheRegion.configure` accepts a list which can contain any combination of instantiated proxy objects as well as uninstantiated proxy classes. Putting the two examples above together would look like this:: from dogpile.cache import make_region retry_proxy = RetryDeleteProxy(5) region = make_region().configure( 'dogpile.cache.pylibmc', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], }, wrap = [ LoggingProxy, retry_proxy ] ) In the above example, the ``LoggingProxy`` object would be instantiated by the :class:`.CacheRegion` and applied to wrap requests on behalf of the ``retry_proxy`` instance; that proxy in turn wraps requests on behalf of the original dogpile.cache.pylibmc backend. .. versionadded:: 0.4.4 Added support for the :class:`.ProxyBackend` class. 
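The wrapping order described above can be sketched without dogpile.cache at all; the classes below are simplified stand-ins (not the real ``ProxyBackend`` machinery) that record the order in which each layer sees a ``set()`` call, illustrating that the first element of the ``wrap`` list becomes the outermost proxy:

```python
# Simplified stand-ins showing proxy chaining order; these are not
# the real dogpile.cache classes.
calls = []

class Backend(object):
    def set(self, key, value):
        calls.append("backend")

class LoggingProxy(object):
    def wrap(self, proxied):
        self.proxied = proxied
        return self

    def set(self, key, value):
        calls.append("logging")
        self.proxied.set(key, value)

class RetryDeleteProxy(object):
    def wrap(self, proxied):
        self.proxied = proxied
        return self

    def set(self, key, value):   # pass-through, like the example above
        calls.append("retry")
        self.proxied.set(key, value)

# mimic wrap=[LoggingProxy, retry_proxy]: apply in reverse so that the
# first list element ends up outermost
wrapped = Backend()
for proxy in reversed([LoggingProxy(), RetryDeleteProxy()]):
    wrapped = proxy.wrap(wrapped)

wrapped.set("somekey", "somevalue")
assert calls == ["logging", "retry", "backend"]
```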
Recipes ======= Invalidating a group of related keys ------------------------------------- This recipe presents a way to track the cache keys related to a particular region, for the purposes of invalidating a series of keys that relate to a particular id. Three cached functions, ``user_fn_one()``, ``user_fn_two()``, ``user_fn_three()`` each perform a different function based on a ``user_id`` integer value. The region applied to cache them uses a custom key generator which tracks each cache key generated, pulling out the integer "id" and replacing with a template. When all three functions have been called, the key generator is now aware of these three keys: ``user_fn_one_%d``, ``user_fn_two_%d``, and ``user_fn_three_%d``. The ``invalidate_user_id()`` function then knows that for a particular ``user_id``, it needs to hit all three of those keys in order to invalidate everything having to do with that id. :: from dogpile.cache import make_region from itertools import count user_keys = set() def my_key_generator(namespace, fn): fname = fn.__name__ def generate_key(*arg): # generate a key template: # "fname_%d_arg1_arg2_arg3..." 
key_template = fname + "_" + \ "%d" + \ "_".join(str(s) for s in arg[1:]) # store key template user_keys.add(key_template) # return cache key user_id = arg[0] return key_template % user_id return generate_key def invalidate_user_id(region, user_id): for key in user_keys: region.delete(key % user_id) region = make_region( function_key_generator=my_key_generator ).configure( "dogpile.cache.memory" ) counter = count() @region.cache_on_arguments() def user_fn_one(user_id): return "user fn one: %d, %d" % (next(counter), user_id) @region.cache_on_arguments() def user_fn_two(user_id): return "user fn two: %d, %d" % (next(counter), user_id) @region.cache_on_arguments() def user_fn_three(user_id): return "user fn three: %d, %d" % (next(counter), user_id) print user_fn_one(5) print user_fn_two(5) print user_fn_three(7) print user_fn_two(7) invalidate_user_id(region, 5) print "invalidated:" print user_fn_one(5) print user_fn_two(5) print user_fn_three(7) print user_fn_two(7) dogpile.cache-0.5.1/docs/changelog.html0000644000076500000240000011474512225643512020531 0ustar classicstaff00000000000000 Changelog — dogpile.cache 0.5.1 documentation

      Changelog

      0.5.1

      Released: Thu Oct 10 2013

      feature

• [feature] The CacheRegion.invalidate() method now supports an option hard=True|False. A “hard” invalidation, equivalent to the existing functionality of CacheRegion.invalidate(), means CacheRegion.get_or_create() will not return the “old” value at all, forcing all getters to regenerate or wait for a regeneration. A “soft” invalidation means that getters can continue to return the old value until a new one is generated.

  #38

• [feature] New dogpile-specific exception classes have been added, so that issues like “region already configured” and “region unconfigured” raise dogpile-specific exceptions. Other exception classes have been made more specific. Also added new accessor CacheRegion.is_configured. Pull request courtesy Morgan Fainberg.

  #40

bug

• [bug] Erroneously missed when the same change was made for set() in 0.5.0, the Redis backend now uses pickle.HIGHEST_PROTOCOL for the set_multi() method as well when producing pickles. Courtesy Łukasz Fidosz.

• [bug] [redis] [py3k] Fixed an errant u'' causing incompatibility in Python 3.2 in the Redis backend, courtesy Jimmey Mabey.

  #39

• [bug] The util.coerce_string_conf() method now correctly coerces negative integers and those with a leading + sign. This previously prevented configuring a CacheRegion with an expiration_time of '-1'. Courtesy David Beitey.

• [bug] The refresh() method on CacheRegion.cache_multi_on_arguments() now supports the asdict flag.

      0.5.0

      Released: Fri Jun 21 2013

feature

• [feature] Now using the Lock included with the Python redis backend, which adds lock_timeout and lock_sleep arguments to the RedisBackend.

      bug

• [bug] The Redis backend now uses pickle.HIGHEST_PROTOCOL when producing pickles. Courtesy Lx Yu.

• [bug] CacheRegion.cache_on_arguments() now has a new argument to_str, which defaults to str(). It can be replaced with unicode() or other functions to support caching of functions that accept non-unicode arguments. Initial patch courtesy Lx Yu.

• [bug] Fixed bug where the “name” parameter for CacheRegion was ignored entirely. Courtesy Wichert Akkerman.

  #27

misc

• [misc] Source repository has been moved to git.

      0.4.3

      Released: Thu Apr 4 2013

      feature

      bug

• [bug] Added support for the cache_timeout Mako argument to the Mako plugin, which will pass the value to the expiration_time argument of CacheRegion.get_or_create().

      0.4.2

      Released: Sat Jan 19 2013

      feature

• [feature] An “async creator” function can be specified to CacheRegion which allows the “creation” function to be called asynchronously or be substituted for another asynchronous creation scheme. Courtesy Ralph Bean.

  pull request 10

      0.4.1

      Released: Sat Dec 15 2012

      feature

• [feature] The function decorated by CacheRegion.cache_on_arguments() now includes a set() method, in addition to the existing invalidate() method. Like invalidate(), it accepts a set of function arguments, but additionally accepts as the first positional argument a new value to place in the cache, to take the place of that key. Courtesy Antoine Bertin.

  pull request 9

• [feature] Redis backend now accepts optional “url” argument, which will be passed to the new StrictRedis.from_url() method to determine connection info. Courtesy Jon Rosebaugh.

• [feature] Redis backend now accepts optional “password” argument. Courtesy Jon Rosebaugh.

• [feature] DBM backend has “fallback” when calling dbm.get() to instead use dictionary access + KeyError, in the case that the “gdbm” backend is used which does not include .get(). Courtesy Jon Rosebaugh.

      bug

• [bug] Fixed bug in DBM backend whereby if an error occurred during the “write” operation, the file lock, if enabled, would not be released, thereby deadlocking the app.

  #15

• [bug] The util.function_key_generator() used by the function decorator no longer coerces non-unicode arguments into a Python unicode object on Python 2.x; this causes failures on backends such as DBM which on Python 2.x apparently require bytestrings. The key_mangler is still needed if actual unicode arguments are being used by the decorated function, however.

        #12

      0.4.0

      Released: Tue Oct 30 2012

      bug

• [bug] Using dogpile.core 0.4.0 now, fixes a critical bug whereby dogpile pileup could occur on first value get across multiple processes, due to reliance upon a non-shared creation time. This is a dogpile.core issue.

  #1

• [bug] Fixed missing __future__ with_statement directive in region.py.

      0.3.1

      Released: Tue Sep 25 2012

      bug

• [bug] Fixed the mako_cache plugin which was not yet covered, and wasn’t implementing the mako plugin API correctly; fixed docs as well. Courtesy Ben Hayden.

• [bug] Fixed setup so that the tests/* directory isn’t yanked into the install. Courtesy Ben Hayden.

      0.3.0

      Released: Thu Jun 14 2012

      feature

• [feature] get() method now checks expiration time by default. Use ignore_expiration=True to bypass this.

• [feature] Added new invalidate() method. Sets the current timestamp as a minimum value that all retrieved values must be created after. Is honored by the get_or_create() and get() methods.

  #7

bug

• [bug] Fixed bug whereby region.get() didn’t work if the value wasn’t present.

        #8

      0.2.4

      no release date
• Fixed py3k issue with config string coerce, courtesy Alexander Fedorov

      0.2.3

      Released: Wed May 16 2012
• Support “min_compress_len” and “memcached_expire_time” with python-memcached backend. Tests courtesy Justin Azoff

  #3

• Add support for coercion of string config values to Python objects - ints, “false”, “true”, “None”.

  #4

• Added support to DBM file lock to allow reentrant access per key within a single thread, so that even though the DBM backend locks for the whole file, a creation function that calls upon a different key in the cache can still proceed.

  #5

• Fixed DBM glitch where multiple readers could be serialized.

• Adjust bmemcached backend to work with newly-repaired bmemcached calling API (see bmemcached ef206ed4473fec3b639e).

      0.2.2

      Released: Thu Apr 19 2012
• Add Redis backend, courtesy Ollie Rutherfurd

      0.2.1

      Released: Sun Apr 15 2012
• Move tests into tests/cache namespace

• py3k compatibility is in-place now, no 2to3 needed.

      0.2.0

      Released: Sat Apr 14 2012
• Based on dogpile.core now, to get the package namespace thing worked out.

      0.1.1

      Released: Tue Apr 10 2012
• Fixed the configure_from_config() method of region and backend which wasn’t working. Courtesy Christian Klinger.

      0.1.0

      Released: Sun Apr 08 2012
• Initial release.

• Includes a pylibmc backend and a plain dictionary backend.

      dogpile.cache-0.5.1/docs/front.html0000644000076500000240000001722112225642570017724 0ustar classicstaff00000000000000 Front Matter — dogpile.cache 0.5.1 documentation

      dogpile.cache-0.5.1/docs/genindex.html0000644000076500000240000003450212225643512020373 0ustar classicstaff00000000000000 Index — dogpile.cache 0.5.1 documentation

      dogpile.cache-0.5.1/docs/index.html0000644000076500000240000002157112225643512017703 0ustar classicstaff00000000000000 Welcome to dogpile.cache’s documentation! — dogpile.cache 0.5.1 documentation

      Welcome to dogpile.cache’s documentation!

      dogpile.cache provides a simple caching pattern based on the dogpile.core locking system, including rudimentary backends. It effectively completes the replacement of Beaker as far as caching (though not HTTP sessions) is concerned, providing an open-ended, simple, and higher-performing pattern to configure and use cache backends. New backends are very easy to create and use; users are encouraged to adapt the provided backends for their own needs, as high volume caching requires lots of tweaks and adjustments specific to an application and its environment.


      Usage Guide

      Overview

      At the time of this writing, popular key/value servers include Memcached, Redis, and Riak. While these tools have different areas of focus, they all share a storage model in which a value is retrieved by key; as such, all of them are potentially suitable for caching, particularly Memcached, which is first and foremost designed for caching.

      With a caching system in mind, dogpile.cache provides an interface to a particular Python API targeted at that system.

      A dogpile.cache configuration consists of the following components:

      • A region, which is an instance of CacheRegion, and defines the configuration details for a particular cache backend. The CacheRegion can be considered the “front end” used by applications.
      • A backend, which is an instance of CacheBackend, describing how values are stored and retrieved from a backend. This interface specifies only get(), set() and delete(). The actual kind of CacheBackend in use for a particular CacheRegion is determined by the underlying Python API being used to talk to the cache, such as Pylibmc. The CacheBackend is instantiated behind the scenes and not directly accessed by applications under normal circumstances.
      • Value generation functions. These are user-defined functions that generate new values to be placed in the cache. While dogpile.cache offers the usual “set” approach of placing data into the cache, the usual mode of usage is to only instruct it to “get” a value, passing it a creation function which will be used to generate a new value if and only if one is needed. This “get-or-create” pattern is the entire key to the “Dogpile” system, which coordinates a single value creation operation among many concurrent get operations for a particular key, eliminating the issue of an expired value being redundantly re-generated by many workers simultaneously.

      Rudimentary Usage

      dogpile.cache includes a Pylibmc backend. A basic configuration looks like:

      from dogpile.cache import make_region
      
      region = make_region().configure(
          'dogpile.cache.pylibmc',
          expiration_time = 3600,
          arguments = {
              'url':["127.0.0.1"],
          }
      )
      
      @region.cache_on_arguments()
      def load_user_info(user_id):
          return some_database.lookup_user_by_id(user_id)
      

      Above, we create a CacheRegion using the make_region() function, then apply the backend configuration via the CacheRegion.configure() method, which returns the region. The name of the backend is the only argument required by CacheRegion.configure() itself, in this case dogpile.cache.pylibmc. However, in this specific case, the pylibmc backend also requires that the URL of the memcached server be passed within the arguments dictionary.

      The configuration is separated into two sections. Upon construction via make_region(), the CacheRegion object is available, typically at module import time, for usage in decorating functions. Additional configuration details passed to CacheRegion.configure() are typically loaded from a configuration file and therefore not necessarily available until runtime, hence the two-step configuration process.

      Key arguments passed to CacheRegion.configure() include expiration_time, which is the expiration time passed to the Dogpile lock, and arguments, which are arguments used directly by the backend - in this case we are using arguments that are passed directly to the pylibmc module.

      Region Configuration

      The make_region() function currently calls the CacheRegion constructor directly.

      class dogpile.cache.region.CacheRegion(name=None, function_key_generator=<function function_key_generator at 0x101989848>, function_multi_key_generator=<function function_multi_key_generator at 0x101989a28>, key_mangler=None, async_creation_runner=None)

      A front end to a particular cache backend.

      Parameters:
      • name – Optional, a string name for the region. This isn’t used internally but can be accessed via the .name parameter, helpful for configuring a region from a config file.
      • function_key_generator

        Optional. A function that will produce a “cache key” given a data creation function and arguments, when using the CacheRegion.cache_on_arguments() method. The structure of this function should be two levels: given the data creation function, return a new function that generates the key based on the given arguments. Such as:

        def my_key_generator(namespace, fn, **kw):
            fname = fn.__name__
            def generate_key(*arg):
              return namespace + "_" + fname + "_" + "_".join(str(s) for s in arg)
            return generate_key
        
        
        region = make_region(
            function_key_generator = my_key_generator
        ).configure(
            "dogpile.cache.dbm",
            expiration_time=300,
            arguments={
                "filename":"file.dbm"
            }
        )
        

        The namespace is that passed to CacheRegion.cache_on_arguments(). It’s not consulted outside this function, so in fact can be of any form. For example, it can be passed as a tuple, used to specify arguments to pluck from **kw:

        def my_key_generator(namespace, fn):
            def generate_key(*arg, **kw):
                return ":".join(
                        [kw[k] for k in namespace] +
                        [str(x) for x in arg]
                    )
            return generate_key
        

        Where the decorator might be used as:

        @my_region.cache_on_arguments(namespace=('x', 'y'))
        def my_function(a, b, **kw):
            return my_data()
        
      • function_multi_key_generator

        Optional. Similar to function_key_generator parameter, but it’s used in CacheRegion.cache_multi_on_arguments(). Generated function should return list of keys. For example:

        def my_multi_key_generator(namespace, fn, **kw):
            namespace = fn.__name__ + (namespace or '')
        
            def generate_keys(*args):
                return [namespace + ':' + str(a) for a in args]
        
            return generate_keys
        
      • key_mangler – Function which will be used on all incoming keys before passing to the backend. Defaults to None, in which case the key mangling function recommended by the cache backend will be used. A typical mangler is the SHA1 mangler found at sha1_mangle_key() which coerces keys into a SHA1 hash, so that the string length is fixed. To disable all key mangling, set to False. Another typical mangler is the built-in Python function str, which can be used to convert non-string or Unicode keys to bytestrings, which is needed when using a backend such as bsddb or dbm under Python 2.x in conjunction with Unicode keys.
      • async_creation_runner

        A callable that, when specified, will be passed to and called by dogpile.lock when there is a stale value present in the cache. It will be passed the mutex and is responsible for releasing that mutex when finished. This can be used to defer the computation of expensive creator functions to later points in the future by way of, for example, a background thread, a long-running queue, or a task manager system like Celery.

        For a specific example using async_creation_runner, new values can be created in a background thread like so:

        import threading
        
        def async_creation_runner(cache, somekey, creator, mutex):
            ''' Used by dogpile.core:Lock when appropriate  '''
            def runner():
                try:
                    value = creator()
                    cache.set(somekey, value)
                finally:
                    mutex.release()
        
            thread = threading.Thread(target=runner)
            thread.start()
        
        
        region = make_region(
            async_creation_runner=async_creation_runner,
        ).configure(
            'dogpile.cache.memcached',
            expiration_time=5,
            arguments={
                'url': '127.0.0.1:11211',
                'distributed_lock': True,
            }
        )
        

        Remember that the first request for a key with no associated value will always block; the async_creation_runner will not be invoked. However, subsequent requests for cached-but-expired values will still return promptly. They will be refreshed by whatever asynchronous means the provided async_creation_runner callable implements.

        By default the async_creation_runner is disabled and is set to None.

        New in version 0.4.2: added the async_creation_runner feature.

      Once you have a CacheRegion, the CacheRegion.cache_on_arguments() method can be used to decorate functions, but the cache itself can’t be used until CacheRegion.configure() is called. The interface for that method is as follows:

      CacheRegion.configure(backend, expiration_time=None, arguments=None, _config_argument_dict=None, _config_prefix=None, wrap=None)

      Configure a CacheRegion.

      The CacheRegion itself is returned.

      Parameters:
      • backend – Required. This is the name of the CacheBackend to use, and is resolved by loading the class from the dogpile.cache entrypoint.
      • expiration_time

        Optional. The expiration time passed to the dogpile system. May be passed as an integer number of seconds, or as a datetime.timedelta value.

        The CacheRegion.get_or_create() method as well as the CacheRegion.cache_on_arguments() decorator (though note: not the CacheRegion.get() method) will call upon the value creation function after this time period has passed since the last generation.

      • arguments – Optional. The structure here is passed directly to the constructor of the CacheBackend in use, though is typically a dictionary.
      • wrap

        Optional. A list of ProxyBackend classes and/or instances, each of which will be applied in a chain to ultimately wrap the original backend, so that custom functionality augmentation can be applied.

        New in version 0.5.0.

      The CacheRegion can also be configured from a dictionary, using the CacheRegion.configure_from_config() method:

      CacheRegion.configure_from_config(config_dict, prefix)

      Configure from a configuration dictionary and a prefix.

      Example:

      local_region = make_region()
      memcached_region = make_region()
      
      # regions are ready to use for function
      # decorators, but not yet for actual caching
      
      # later, when config is available
      myconfig = {
          "cache.local.backend":"dogpile.cache.dbm",
          "cache.local.arguments.filename":"/path/to/dbmfile.dbm",
          "cache.memcached.backend":"dogpile.cache.pylibmc",
          "cache.memcached.arguments.url":"127.0.0.1, 10.0.0.1",
      }
      local_region.configure_from_config(myconfig, "cache.local.")
      memcached_region.configure_from_config(myconfig,
                                          "cache.memcached.")
      

      Using a Region

      The CacheRegion object is our front-end interface to a cache. It includes the following methods:

      CacheRegion.get(key, expiration_time=None, ignore_expiration=False)

      Return a value from the cache, based on the given key.

      If the value is not present, the method returns the token NO_VALUE. NO_VALUE evaluates to False, but is distinct from None, so that a cached value of None can be distinguished from a cache miss.

      By default, the configured expiration time of the CacheRegion, or alternatively the expiration time supplied by the expiration_time argument, is tested against the creation time of the retrieved value versus the current time (as reported by time.time()). If stale, the cached value is ignored and the NO_VALUE token is returned. Passing the flag ignore_expiration=True bypasses the expiration time check.

      Changed in version 0.3.0: CacheRegion.get() now checks the value’s creation time against the expiration time, rather than returning the value unconditionally.

      The method also interprets the cached value in terms of the current “invalidation” time as set by the invalidate() method. If a value is present, but its creation time is older than the current invalidation time, the NO_VALUE token is returned. Passing the flag ignore_expiration=True bypasses the invalidation time check.

      New in version 0.3.0: Support for the CacheRegion.invalidate() method.

      Parameters:
      • key – Key to be retrieved. While it’s typical for a key to be a string, it is ultimately passed directly down to the cache backend, before being optionally processed by the key_mangler function, so can be of any type recognized by the backend or by the key_mangler function, if present.
      • expiration_time

        Optional expiration time value which will supersede that configured on the CacheRegion itself.

        New in version 0.3.0.

      • ignore_expiration

        if True, the value is returned from the cache if present, regardless of configured expiration times or whether or not invalidate() was called.

        New in version 0.3.0.

      CacheRegion.get_or_create(key, creator, expiration_time=None, should_cache_fn=None)

      Return a cached value based on the given key.

      If the value does not exist or is considered to be expired based on its creation time, the given creation function may or may not be used to recreate the value and persist the newly generated value in the cache.

      Whether or not the function is used depends on whether the dogpile lock can be acquired. If it can’t, it means a different thread or process is already running a creation function for this key against the cache. When the dogpile lock cannot be acquired, the method will block if no previous value is available, until the lock is released and a new value is available. If a previous value is available, that value is returned immediately without blocking.

      If the invalidate() method has been called, and the retrieved value’s timestamp is older than the invalidation timestamp, the value is unconditionally prevented from being returned. The method will attempt to acquire the dogpile lock to generate a new value, or will wait until the lock is released to return the new value.

      Changed in version 0.3.0: The value is unconditionally regenerated if the creation time is older than the last call to invalidate().

      Parameters:
      • key – Key to be retrieved. While it’s typical for a key to be a string, it is ultimately passed directly down to the cache backend, before being optionally processed by the key_mangler function, so can be of any type recognized by the backend or by the key_mangler function, if present.
      • creator – function which creates a new value.
      • expiration_time – optional expiration time which will override the expiration time already configured on this CacheRegion if not None. To set no expiration, use the value -1.
      • should_cache_fn

        optional callable function which will receive the value returned by the “creator”, and will then return True or False, indicating if the value should actually be cached or not. If it returns False, the value is still returned, but isn’t cached. E.g.:

        def dont_cache_none(value):
            return value is not None
        
        value = region.get_or_create("some key",
                            create_value,
                            should_cache_fn=dont_cache_none)
        

        Above, the function returns the value of create_value() if the cache is invalid; however, if the return value is None, it won’t be cached.

        New in version 0.4.3.

      See also

      CacheRegion.cache_on_arguments() - applies get_or_create() to any function using a decorator.

      CacheRegion.get_or_create_multi() - multiple key/value version

      CacheRegion.set(key, value)

      Place a new value in the cache under the given key.

      CacheRegion.delete(key)

      Remove a value from the cache.

      This operation is idempotent: it can safely be called multiple times, or against a non-existent key.

      CacheRegion.cache_on_arguments(namespace=None, expiration_time=None, should_cache_fn=None, to_str=<type 'str'>)

      A function decorator that will cache the return value of the function using a key derived from the function itself and its arguments.

      The decorator internally makes use of the CacheRegion.get_or_create() method to access the cache and conditionally call the function. See that method for additional behavioral details.

      E.g.:

      @someregion.cache_on_arguments()
      def generate_something(x, y):
          return somedatabase.query(x, y)
      

      The decorated function can then be called normally, where data will be pulled from the cache region unless a new value is needed:

      result = generate_something(5, 6)
      

      The function is also given an attribute invalidate(), which provides for invalidation of the value. Pass to invalidate() the same arguments you’d pass to the function itself to represent a particular value:

      generate_something.invalidate(5, 6)
      

      Another attribute set() is added to provide extra caching possibilities relative to the function. This is a convenience method for CacheRegion.set() which will store a given value directly without calling the decorated function. The value to be cached is passed as the first argument, and the arguments which would normally be passed to the function should follow:

      generate_something.set(3, 5, 6)
      

      The above example is equivalent to calling generate_something(5, 6), if the function were to produce the value 3 as the value to be cached.

      New in version 0.4.1: Added set() method to decorated function.

      Similar to set() is refresh(). This attribute will invoke the decorated function and populate a new value into the cache with the new value, as well as returning that value:

      newvalue = generate_something.refresh(5, 6)
      

      New in version 0.5.0: Added refresh() method to decorated function.

      The default key generation will use the name of the function, the module name for the function, the arguments passed, as well as an optional “namespace” parameter in order to generate a cache key.

      Given a function one inside the module myapp.tools:

      @region.cache_on_arguments(namespace="foo")
      def one(a, b):
          return a + b
      

      Above, calling one(3, 4) will produce a cache key as follows:

      myapp.tools:one|foo|3 4

      The key generator will ignore an initial argument of self or cls, making the decorator suitable (with caveats) for use with instance or class methods. Given the example:

      class MyClass(object):
          @region.cache_on_arguments(namespace="foo")
          def one(self, a, b):
              return a + b
      

      The cache key above for MyClass().one(3, 4) will again produce the same cache key of myapp.tools:one|foo|3 4 - the name self is skipped.

      The namespace parameter is optional, and is used normally to disambiguate two functions of the same name within the same module, as can occur when decorating instance or class methods as below:

      class MyClass(object):
          @region.cache_on_arguments(namespace='MC')
          def somemethod(self, x, y):
              ""
      
      class MyOtherClass(object):
          @region.cache_on_arguments(namespace='MOC')
          def somemethod(self, x, y):
              ""
      

      Above, the namespace parameter disambiguates between somemethod on MyClass and MyOtherClass. Python class declaration mechanics otherwise prevent the decorator from having awareness of the MyClass and MyOtherClass names, as the function is received by the decorator before it becomes an instance method.

      The function key generation can be entirely replaced on a per-region basis using the function_key_generator argument present on make_region() and CacheRegion. It defaults to function_key_generator().
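      As a rough sketch of what a replacement might look like (the name my_key_generator is hypothetical; this assumes the generator receives the namespace, the decorated function, and the to_str converter, and returns a callable that turns the call arguments into a key, mirroring the default key layout shown above):

      ```python
      def my_key_generator(namespace, fn, to_str=str):
          # Build the fixed "module:function|namespace" prefix once at
          # decoration time, mirroring the default key layout shown above.
          prefix = "%s:%s" % (fn.__module__, fn.__name__)
          if namespace is not None:
              prefix = "%s|%s" % (prefix, namespace)

          def generate_key(*args):
              # Join the stringified positional arguments onto the prefix.
              return "%s|%s" % (prefix, " ".join(to_str(a) for a in args))

          return generate_key
      ```

      A generator like this would then be passed as make_region(function_key_generator=my_key_generator).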

      Parameters:
      • namespace – optional string argument which will be established as part of the cache key. This may be needed to disambiguate functions of the same name within the same source file, such as those associated with classes - note that the decorator itself can’t see the parent class on a function as the class is being declared.
      • expiration_time

        if not None, will override the normal expiration time.

        May be specified as a callable, taking no arguments, that returns a value to be used as the expiration_time. This callable will be called whenever the decorated function itself is called, in caching or retrieving. Thus, this can be used to determine a dynamic expiration time for the cached function result. Example use cases include “cache the result until the end of the day, week or time period” and “cache until a certain date or time passes”.

        Changed in version 0.5.0: expiration_time may be passed as a callable to CacheRegion.cache_on_arguments().

      • should_cache_fn

        passed to CacheRegion.get_or_create().

        New in version 0.4.3.

      • to_str

        callable, will be called on each function argument in order to convert to a string. Defaults to str(). If the function accepts non-ascii unicode arguments on Python 2.x, the unicode() builtin can be substituted, but note this will produce unicode cache keys which may require key mangling before reaching the cache.

        New in version 0.5.0.
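      The callable form of expiration_time described above can be sketched like this — a hypothetical helper implementing the "cache the result until the end of the day" use case:

      ```python
      import datetime

      def seconds_until_midnight():
          # Returns the number of seconds remaining in the current day,
          # suitable as a dynamic expiration_time callable.
          now = datetime.datetime.now()
          tomorrow = datetime.datetime.combine(
              now.date() + datetime.timedelta(days=1),
              datetime.time.min)
          return (tomorrow - now).total_seconds()
      ```

      It would be passed as @region.cache_on_arguments(expiration_time=seconds_until_midnight); the region calls it on each access to compute a fresh expiration value.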

      Creating Backends

      Backends are located using the setuptools entrypoint system. To make life easier for writers of ad-hoc backends, a helper function is included which registers any backend in the same way as if it were part of the existing sys.path.

      For example, to create a backend called DictionaryBackend, we subclass CacheBackend:

      from dogpile.cache.api import CacheBackend, NO_VALUE
      
      class DictionaryBackend(CacheBackend):
          def __init__(self, arguments):
              self.cache = {}
      
          def get(self, key):
              return self.cache.get(key, NO_VALUE)
      
          def set(self, key, value):
              self.cache[key] = value
      
          def delete(self, key):
              self.cache.pop(key, None)  # ignore missing keys; delete is idempotent
      

      Then make sure the class is available underneath the entrypoint dogpile.cache. If we did this in a setup.py file, it would be in setup() as:

      entry_points="""
        [dogpile.cache]
        dictionary = mypackage.mybackend:DictionaryBackend
        """
      

      Alternatively, if we want to register the plugin in the same process space without bothering to install anything, we can use register_backend:

      from dogpile.cache import register_backend
      
      register_backend("dictionary", "mypackage.mybackend", "DictionaryBackend")
      

      Our new backend would be usable in a region like this:

      from dogpile.cache import make_region
      
      region = make_region().configure("dictionary")
      
      region.set("somekey", "somevalue")
      

      The values the backend deals with here are instances of CachedValue. This is a tuple subclass of length two, of the form:

      (payload, metadata)
      

      Where “payload” is the thing being cached, and “metadata” is information we store in the cache - a dictionary which currently has just the “creation time” and a “version identifier” as key/values. If the cache backend requires serialization, pickle or similar can be used on the tuple - the “metadata” portion will always be a small and easily serializable Python structure.
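      For illustration, a backend that needs to serialize could pickle the whole tuple. The metadata contents below are invented placeholders, not dogpile's actual keys:

      ```python
      import pickle

      # Stand-in for a CachedValue two-tuple: (payload, metadata).
      # The metadata keys "created" and "version" are illustrative only.
      value = ("somevalue", {"created": 1381500000.0, "version": 1})

      data = pickle.dumps(value, pickle.HIGHEST_PROTOCOL)  # store in set()
      payload, metadata = pickle.loads(data)               # restore in get()
      ```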

      Changing Backend Behavior

      The ProxyBackend is a decorator class provided to easily augment existing backend behavior without having to extend the original class. Using a decorator class is also advantageous as it allows us to share the altered behavior between different backends.

      Proxies are added to the CacheRegion object using the CacheRegion.configure() method. Only the overridden methods need to be specified and the real backend can be accessed with the self.proxied object from inside the ProxyBackend.

      For example, a simple class to log all calls to .set() would look like this:

      from dogpile.cache.proxy import ProxyBackend
      
      import logging
      log = logging.getLogger(__name__)
      
      class LoggingProxy(ProxyBackend):
          def set(self, key, value):
              log.debug('Setting Cache Key: %s', key)
              self.proxied.set(key, value)
      

      ProxyBackend can be configured to optionally take arguments (as long as the ProxyBackend.__init__() method is called properly, either directly or via super()). In the example below, the RetryDeleteProxy class accepts a retry_count parameter on initialization. In the event of an exception on delete(), it will retry this many times before returning:

      from dogpile.cache.proxy import ProxyBackend
      
      class RetryDeleteProxy(ProxyBackend):
          def __init__(self, retry_count=5):
              super(RetryDeleteProxy, self).__init__()
              self.retry_count = retry_count
      
          def delete(self, key):
              retries = self.retry_count
              while retries > 0:
                  retries -= 1
                  try:
                      self.proxied.delete(key)
                      return
      
                  except Exception:
                      pass
      

      The wrap parameter of the CacheRegion.configure() accepts a list which can contain any combination of instantiated proxy objects as well as uninstantiated proxy classes. Putting the two examples above together would look like this:

      from dogpile.cache import make_region
      
      retry_proxy = RetryDeleteProxy(5)
      
      region = make_region().configure(
          'dogpile.cache.pylibmc',
          expiration_time = 3600,
          arguments = {
              'url':["127.0.0.1"],
          },
          wrap = [ LoggingProxy, retry_proxy ]
      )
      

      In the above example, the LoggingProxy object would be instantiated by the CacheRegion and applied to wrap requests on behalf of the retry_proxy instance; that proxy in turn wraps requests on behalf of the original dogpile.cache.pylibmc backend.

      New in version 0.4.4: Added support for the ProxyBackend class.

      dogpile.cache-0.5.1/dogpile/0000755000076500000240000000000012225644023016371 5ustar classicstaff00000000000000dogpile.cache-0.5.1/dogpile/__init__.py0000644000076500000240000000036412225642516020512 0ustar classicstaff00000000000000# See http://peak.telecommunity.com/DevCenter/setuptools#namespace-packages try: __import__('pkg_resources').declare_namespace(__name__) except ImportError: from pkgutil import extend_path __path__ = extend_path(__path__, __name__) dogpile.cache-0.5.1/dogpile/cache/0000755000076500000240000000000012225644023017434 5ustar classicstaff00000000000000dogpile.cache-0.5.1/dogpile/cache/__init__.py0000644000076500000240000000012612225642516021551 0ustar classicstaff00000000000000__version__ = '0.5.1' from .region import CacheRegion, register_backend, make_region dogpile.cache-0.5.1/dogpile/cache/api.py0000644000076500000240000001270112225642516020565 0ustar classicstaff00000000000000import operator from .compat import py3k class NoValue(object): """Describe a missing cache value. The :attr:`.NO_VALUE` module global should be used. """ @property def payload(self): return self if py3k: def __bool__(self): #pragma NO COVERAGE return False else: def __nonzero__(self): #pragma NO COVERAGE return False NO_VALUE = NoValue() """Value returned from ``get()`` that describes a key not present.""" class CachedValue(tuple): """Represent a value stored in the cache. :class:`.CachedValue` is a two-tuple of ``(payload, metadata)``, where ``metadata`` is dogpile.cache's tracking information ( currently the creation time). The metadata and tuple structure is pickleable, if the backend requires serialization. 
""" payload = property(operator.itemgetter(0)) """Named accessor for the payload.""" metadata = property(operator.itemgetter(1)) """Named accessor for the dogpile.cache metadata dictionary.""" def __new__(cls, payload, metadata): return tuple.__new__(cls, (payload, metadata)) def __reduce__(self): return CachedValue, (self.payload, self.metadata) class CacheBackend(object): """Base class for backend implementations.""" key_mangler = None """Key mangling function. May be None, or otherwise declared as an ordinary instance method. """ def __init__(self, arguments): #pragma NO COVERAGE """Construct a new :class:`.CacheBackend`. Subclasses should override this to handle the given arguments. :param arguments: The ``arguments`` parameter passed to :func:`.make_registry`. """ raise NotImplementedError() @classmethod def from_config_dict(cls, config_dict, prefix): prefix_len = len(prefix) return cls( dict( (key[prefix_len:], config_dict[key]) for key in config_dict if key.startswith(prefix) ) ) def get_mutex(self, key): """Return an optional mutexing object for the given key. This object need only provide an ``acquire()`` and ``release()`` method. May return ``None``, in which case the dogpile lock will use a regular ``threading.Lock`` object to mutex concurrent threads for value creation. The default implementation returns ``None``. Different backends may want to provide various kinds of "mutex" objects, such as those which link to lock files, distributed mutexes, memcached semaphores, etc. Whatever kind of system is best suited for the scope and behavior of the caching backend. A mutex that takes the key into account will allow multiple regenerate operations across keys to proceed simultaneously, while a mutex that does not will serialize regenerate operations to just one at a time across all keys in the region. 
The latter approach, or a variant that involves a modulus of the given key's hash value, can be used as a means of throttling the total number of value recreation operations that may proceed at one time. """ return None def get(self, key): #pragma NO COVERAGE """Retrieve a value from the cache. The returned value should be an instance of :class:`.CachedValue`, or ``NO_VALUE`` if not present. """ raise NotImplementedError() def get_multi(self, keys): #pragma NO COVERAGE """Retrieve multiple values from the cache. The returned value should be a list, corresponding to the list of keys given. .. versionadded:: 0.5.0 """ raise NotImplementedError() def set(self, key, value): #pragma NO COVERAGE """Set a value in the cache. The key will be whatever was passed to the registry, processed by the "key mangling" function, if any. The value will always be an instance of :class:`.CachedValue`. """ raise NotImplementedError() def set_multi(self, mapping): #pragma NO COVERAGE """Set multiple values in the cache. The key will be whatever was passed to the registry, processed by the "key mangling" function, if any. The value will always be an instance of :class:`.CachedValue`. .. versionadded:: 0.5.0 """ raise NotImplementedError() def delete(self, key): #pragma NO COVERAGE """Delete a value from the cache. The key will be whatever was passed to the registry, processed by the "key mangling" function, if any. The behavior here should be idempotent, that is, can be called any number of times regardless of whether or not the key exists. """ raise NotImplementedError() def delete_multi(self, keys): #pragma NO COVERAGE """Delete multiple values from the cache. The key will be whatever was passed to the registry, processed by the "key mangling" function, if any. The behavior here should be idempotent, that is, can be called any number of times regardless of whether or not the key exists. .. 
versionadded:: 0.5.0 """ raise NotImplementedError() dogpile.cache-0.5.1/dogpile/cache/backends/0000755000076500000240000000000012225644023021206 5ustar classicstaff00000000000000dogpile.cache-0.5.1/dogpile/cache/backends/__init__.py0000644000076500000240000000114312225642516023323 0ustar classicstaff00000000000000from dogpile.cache.region import register_backend register_backend("dogpile.cache.dbm", "dogpile.cache.backends.file", "DBMBackend") register_backend("dogpile.cache.pylibmc", "dogpile.cache.backends.memcached", "PylibmcBackend") register_backend("dogpile.cache.bmemcached", "dogpile.cache.backends.memcached", "BMemcachedBackend") register_backend("dogpile.cache.memcached", "dogpile.cache.backends.memcached", "MemcachedBackend") register_backend("dogpile.cache.memory", "dogpile.cache.backends.memory", "MemoryBackend") register_backend("dogpile.cache.redis", "dogpile.cache.backends.redis", "RedisBackend") dogpile.cache-0.5.1/dogpile/cache/backends/file.py0000644000076500000240000002146112225642516022510 0ustar classicstaff00000000000000""" File Backends ------------------ Provides backends that deal with local filesystem access. """ from __future__ import with_statement from dogpile.cache.api import CacheBackend, NO_VALUE from contextlib import contextmanager from dogpile.cache import compat from dogpile.cache import util import os import fcntl __all__ = 'DBMBackend', 'FileLock' class DBMBackend(CacheBackend): """A file-backend using a dbm file to store keys. Basic usage:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.dbm', expiration_time = 3600, arguments = { "filename":"/path/to/cachefile.dbm" } ) DBM access is provided using the Python ``anydbm`` module, which selects a platform-specific dbm module to use. This may be made to be more configurable in a future release. Note that different dbm modules have different behaviors. Some dbm implementations handle their own locking, while others don't. 
The :class:`.DBMBackend` uses a read/write lockfile by default, which is compatible even with those DBM implementations for which this is unnecessary, though the behavior can be disabled. The DBM backend by default makes use of two lockfiles. One is in order to protect the DBM file itself from concurrent writes, the other is to coordinate value creation (i.e. the dogpile lock). By default, these lockfiles use the ``flock()`` system call for locking; this is only available on Unix platforms. Currently, the dogpile lock is against the entire DBM file, not per key. This means there can only be one "creator" job running at a time per dbm file. A future improvement might be to have the dogpile lock using a filename that's based on a modulus of the key. Locking on a filename that uniquely corresponds to the key is problematic, since it's not generally safe to delete lockfiles as the application runs, implying an unlimited number of key-based files would need to be created and never deleted. Parameters to the ``arguments`` dictionary are below. :param filename: path of the filename in which to create the DBM file. Note that some dbm backends will change this name to have additional suffixes. :param rw_lockfile: the name of the file to use for read/write locking. If omitted, a default name is used by appending the suffix ".rw.lock" to the DBM filename. If False, then no lock is used. :param dogpile_lockfile: the name of the file to use for value creation, i.e. the dogpile lock. If omitted, a default name is used by appending the suffix ".dogpile.lock" to the DBM filename. If False, then dogpile.cache uses the default dogpile lock, a plain thread-based mutex. 
""" def __init__(self, arguments): self.filename = os.path.abspath( os.path.normpath(arguments['filename']) ) dir_, filename = os.path.split(self.filename) self._rw_lock = self._init_lock( arguments.get('rw_lockfile'), ".rw.lock", dir_, filename) self._dogpile_lock = self._init_lock( arguments.get('dogpile_lockfile'), ".dogpile.lock", dir_, filename, util.KeyReentrantMutex.factory) # TODO: make this configurable if compat.py3k: import dbm else: import anydbm as dbm self.dbmmodule = dbm self._init_dbm_file() def _init_lock(self, argument, suffix, basedir, basefile, wrapper=None): if argument is None: lock = FileLock(os.path.join(basedir, basefile + suffix)) elif argument is not False: lock = FileLock( os.path.abspath( os.path.normpath(argument) )) else: return None if wrapper: lock = wrapper(lock) return lock def _init_dbm_file(self): exists = os.access(self.filename, os.F_OK) if not exists: for ext in ('db', 'dat', 'pag', 'dir'): if os.access(self.filename + os.extsep + ext, os.F_OK): exists = True break if not exists: fh = self.dbmmodule.open(self.filename, 'c') fh.close() def get_mutex(self, key): # using one dogpile for the whole file. Other ways # to do this might be using a set of files keyed to a # hash/modulus of the key. 
the issue is it's never # really safe to delete a lockfile as this can # break other processes trying to get at the file # at the same time - so handling unlimited keys # can't imply unlimited filenames if self._dogpile_lock: return self._dogpile_lock(key) else: return None @contextmanager def _use_rw_lock(self, write): if self._rw_lock is None: yield elif write: with self._rw_lock.write(): yield else: with self._rw_lock.read(): yield @contextmanager def _dbm_file(self, write): with self._use_rw_lock(write): dbm = self.dbmmodule.open(self.filename, "w" if write else "r") yield dbm dbm.close() def get(self, key): with self._dbm_file(False) as dbm: if hasattr(dbm, 'get'): value = dbm.get(key, NO_VALUE) else: # gdbm objects lack a .get method try: value = dbm[key] except KeyError: value = NO_VALUE if value is not NO_VALUE: value = compat.pickle.loads(value) return value def get_multi(self, keys): return [self.get(key) for key in keys] def set(self, key, value): with self._dbm_file(True) as dbm: dbm[key] = compat.pickle.dumps(value) def set_multi(self, mapping): with self._dbm_file(True) as dbm: for key,value in mapping.items(): dbm[key] = compat.pickle.dumps(value) def delete(self, key): with self._dbm_file(True) as dbm: try: del dbm[key] except KeyError: pass def delete_multi(self, keys): with self._dbm_file(True) as dbm: for key in keys: try: del dbm[key] except KeyError: pass class FileLock(object): """Use lockfiles to coordinate read/write access to a file. Only works on Unix systems, using `fcntl.flock() `_. 
""" def __init__(self, filename): self._filedescriptor = compat.threading.local() self.filename = filename def acquire(self, wait=True): return self.acquire_write_lock(wait) def release(self): self.release_write_lock() @property def is_open(self): return hasattr(self._filedescriptor, 'fileno') @contextmanager def read(self): self.acquire_read_lock(True) try: yield finally: self.release_read_lock() @contextmanager def write(self): self.acquire_write_lock(True) try: yield finally: self.release_write_lock() def acquire_read_lock(self, wait): return self._acquire(wait, os.O_RDONLY, fcntl.LOCK_SH) def acquire_write_lock(self, wait): return self._acquire(wait, os.O_WRONLY, fcntl.LOCK_EX) def release_read_lock(self): self._release() def release_write_lock(self): self._release() def _acquire(self, wait, wrflag, lockflag): wrflag |= os.O_CREAT fileno = os.open(self.filename, wrflag) try: if not wait: lockflag |= fcntl.LOCK_NB fcntl.flock(fileno, lockflag) except IOError: os.close(fileno) if not wait: # this is typically # "[Errno 35] Resource temporarily unavailable", # because of LOCK_NB return False else: raise else: self._filedescriptor.fileno = fileno return True def _release(self): try: fileno = self._filedescriptor.fileno except AttributeError: return else: fcntl.flock(fileno, fcntl.LOCK_UN) os.close(fileno) del self._filedescriptor.fileno dogpile.cache-0.5.1/dogpile/cache/backends/memcached.py0000644000076500000240000002375312225642516023505 0ustar classicstaff00000000000000""" Memcached Backends ------------------ Provides backends for talking to `memcached `_. """ from dogpile.cache.api import CacheBackend, NO_VALUE from dogpile.cache import compat from dogpile.cache import util import random import time __all__ = 'GenericMemcachedBackend', 'MemcachedBackend',\ 'PylibmcBackend', 'BMemcachedBackend', 'MemcachedLock' class MemcachedLock(object): """Simple distributed lock using memcached. 
This is an adaptation of the lock featured at http://amix.dk/blog/post/19386 """ def __init__(self, client_fn, key): self.client_fn = client_fn self.key = "_lock" + key def acquire(self, wait=True): client = self.client_fn() i = 0 while True: if client.add(self.key, 1): return True elif not wait: return False else: sleep_time = (((i+1)*random.random()) + 2**i) / 2.5 time.sleep(sleep_time) if i < 15: i += 1 def release(self): client = self.client_fn() client.delete(self.key) class GenericMemcachedBackend(CacheBackend): """Base class for memcached backends. This base class accepts a number of paramters common to all backends. :param url: the string URL to connect to. Can be a single string or a list of strings. This is the only argument that's required. :param distributed_lock: boolean, when True, will use a memcached-lock as the dogpile lock (see :class:`.MemcachedLock`). Use this when multiple processes will be talking to the same memcached instance. When left at False, dogpile will coordinate on a regular threading mutex. :param memcached_expire_time: integer, when present will be passed as the ``time`` parameter to ``pylibmc.Client.set``. This is used to set the memcached expiry time for a value. .. note:: This parameter is **different** from Dogpile's own ``expiration_time``, which is the number of seconds after which Dogpile will consider the value to be expired. When Dogpile considers a value to be expired, it **continues to use the value** until generation of a new value is complete, when using :meth:`.CacheRegion.get_or_create`. Therefore, if you are setting ``memcached_expire_time``, you'll want to make sure it is greater than ``expiration_time`` by at least enough seconds for new values to be generated, else the value won't be available during a regeneration, forcing all threads to wait for a regeneration each time a value expires. 
The :class:`.GenericMemachedBackend` uses a ``threading.local()`` object to store individual client objects per thread, as most modern memcached clients do not appear to be inherently threadsafe. In particular, ``threading.local()`` has the advantage over pylibmc's built-in thread pool in that it automatically discards objects associated with a particular thread when that thread ends. """ set_arguments = {} """Additional arguments which will be passed to the :meth:`set` method.""" def __init__(self, arguments): self._imports() # using a plain threading.local here. threading.local # automatically deletes the __dict__ when a thread ends, # so the idea is that this is superior to pylibmc's # own ThreadMappedPool which doesn't handle this # automatically. self.url = util.to_list(arguments['url']) self.distributed_lock = arguments.get('distributed_lock', False) self.memcached_expire_time = arguments.get( 'memcached_expire_time', 0) def _imports(self): """client library imports go here.""" raise NotImplementedError() def _create_client(self): """Creation of a Client instance goes here.""" raise NotImplementedError() @util.memoized_property def _clients(self): backend = self class ClientPool(compat.threading.local): def __init__(self): self.memcached = backend._create_client() return ClientPool() @property def client(self): """Return the memcached client. This uses a threading.local by default as it appears most modern memcached libs aren't inherently threadsafe. 
""" return self._clients.memcached def get_mutex(self, key): if self.distributed_lock: return MemcachedLock(lambda: self.client, key) else: return None def get(self, key): value = self.client.get(key) if value is None: return NO_VALUE else: return value def get_multi(self, keys): values = self.client.get_multi(keys) return [ NO_VALUE if key not in values else values[key] for key in keys ] def set(self, key, value): self.client.set(key, value, **self.set_arguments ) def set_multi(self, mapping): self.client.set_multi(mapping, **self.set_arguments ) def delete(self, key): self.client.delete(key) def delete_multi(self, keys): self.client.delete_multi(keys) class MemcacheArgs(object): """Mixin which provides support for the 'time' argument to set(), 'min_compress_len' to other methods. """ def __init__(self, arguments): self.min_compress_len = arguments.get('min_compress_len', 0) self.set_arguments = {} if "memcached_expire_time" in arguments: self.set_arguments["time"] =\ arguments["memcached_expire_time"] if "min_compress_len" in arguments: self.set_arguments["min_compress_len"] =\ arguments["min_compress_len"] super(MemcacheArgs, self).__init__(arguments) class PylibmcBackend(MemcacheArgs, GenericMemcachedBackend): """A backend for the `pylibmc `_ memcached client. A configuration illustrating several of the optional arguments described in the pylibmc documentation:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.pylibmc', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], 'binary':True, 'behaviors':{"tcp_nodelay": True,"ketama":True} } ) Arguments accepted here include those of :class:`.GenericMemcachedBackend`, as well as those below. :param binary: sets the ``binary`` flag understood by ``pylibmc.Client``. :param behaviors: a dictionary which will be passed to ``pylibmc.Client`` as the ``behaviors`` parameter. 
:param min_compress_len: Integer, will be passed as the ``min_compress_len`` parameter to the ``pylibmc.Client.set`` method. """ def __init__(self, arguments): self.binary = arguments.get('binary', False) self.behaviors = arguments.get('behaviors', {}) super(PylibmcBackend, self).__init__(arguments) def _imports(self): global pylibmc import pylibmc def _create_client(self): return pylibmc.Client(self.url, binary=self.binary, behaviors=self.behaviors ) class MemcachedBackend(MemcacheArgs, GenericMemcachedBackend): """A backend using the standard `Python-memcached `_ library. Example:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.memcached', expiration_time = 3600, arguments = { 'url':"127.0.0.1:11211" } ) """ def _imports(self): global memcache import memcache def _create_client(self): return memcache.Client(self.url) class BMemcachedBackend(GenericMemcachedBackend): """A backend for the `python-binary-memcached `_ memcached client. This is a pure Python memcached client which includes the ability to authenticate with a memcached server using SASL. A typical configuration using username/password:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.bmemcached', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], 'username':'scott', 'password':'tiger' } ) Arguments which can be passed to the ``arguments`` dictionary include: :param username: optional username, will be used for SASL authentication. :param password: optional password, will be used for SASL authentication. """ def __init__(self, arguments): self.username = arguments.get('username', None) self.password = arguments.get('password', None) super(BMemcachedBackend, self).__init__(arguments) def _imports(self): global bmemcached import bmemcached class RepairBMemcachedAPI(bmemcached.Client): """Repairs BMemcached's non-standard method signatures, which was fixed in BMemcached ef206ed4473fec3b639e. 
""" def add(self, key, value): try: return super(RepairBMemcachedAPI, self).add(key, value) except ValueError: return False self.Client = RepairBMemcachedAPI def _create_client(self): return self.Client(self.url, username=self.username, password=self.password ) def delete_multi(self, keys): """python-binary-memcached api does not implements delete_multi""" for key in keys: self.delete(key) dogpile.cache-0.5.1/dogpile/cache/backends/memory.py0000644000076500000240000000303612225642516023077 0ustar classicstaff00000000000000""" Memory Backend -------------- Provides a simple dictionary-based backend. """ from dogpile.cache.api import CacheBackend, NO_VALUE class MemoryBackend(CacheBackend): """A backend that uses a plain dictionary. There is no size management, and values which are placed into the dictionary will remain until explicitly removed. Note that Dogpile's expiration of items is based on timestamps and does not remove them from the cache. E.g.:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.memory' ) To use a Python dictionary of your choosing, it can be passed in with the ``cache_dict`` argument:: my_dictionary = {} region = make_region().configure( 'dogpile.cache.memory', arguments={ "cache_dict":my_dictionary } ) """ def __init__(self, arguments): self._cache = arguments.pop("cache_dict", {}) def get(self, key): return self._cache.get(key, NO_VALUE) def get_multi(self, keys): return [ self._cache.get(key, NO_VALUE) for key in keys ] def set(self, key, value): self._cache[key] = value def set_multi(self, mapping): for key,value in mapping.items(): self._cache[key] = value def delete(self, key): self._cache.pop(key, None) def delete_multi(self, keys): for key in keys: self._cache.pop(key, None) dogpile.cache-0.5.1/dogpile/cache/backends/redis.py0000644000076500000240000001105712225642516022677 0ustar classicstaff00000000000000""" Redis Backends ------------------ Provides backends for talking to `Redis `_. 
""" from __future__ import absolute_import from dogpile.cache.api import CacheBackend, NO_VALUE from dogpile.cache.compat import pickle, u redis = None __all__ = 'RedisBackend', class RedisBackend(CacheBackend): """A `Redis `_ backend, using the `redis-py `_ backend. Example configuration:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.redis', arguments = { 'host': 'localhost', 'port': 6379, 'db': 0, 'redis_expiration_time': 60*60*2, # 2 hours 'distributed_lock':True } ) Arguments accepted in the arguments dictionary: :param url: string. If provided, will override separate host/port/db params. The format is that accepted by ``StrictRedis.from_url()``. .. versionadded:: 0.4.1 :param host: string, default is ``localhost``. :param password: string, default is no password. .. versionadded:: 0.4.1 :param port: integer, default is ``6379``. :param db: integer, default is ``0``. :param redis_expiration_time: integer, number of seconds after setting a value that Redis should expire it. This should be larger than dogpile's cache expiration. By default no expiration is set. :param distributed_lock: boolean, when True, will use a redis-lock as the dogpile lock. Use this when multiple processes will be talking to the same redis instance. When left at False, dogpile will coordinate on a regular threading mutex. :param lock_timeout: integer, number of seconds after acquiring a lock that Redis should expire it. This argument is only valid when ``distributed_lock`` is ``True``. .. versionadded:: 0.5.0 :param lock_sleep: integer, number of seconds to sleep when failed to acquire a lock. This argument is only valid when ``distributed_lock`` is ``True``. .. 
versionadded:: 0.5.0 """ def __init__(self, arguments): self._imports() self.url = arguments.pop('url', None) self.host = arguments.pop('host', 'localhost') self.password = arguments.pop('password', None) self.port = arguments.pop('port', 6379) self.db = arguments.pop('db', 0) self.distributed_lock = arguments.get('distributed_lock', False) self.lock_timeout = arguments.get('lock_timeout', None) self.lock_sleep = arguments.get('lock_sleep', 0.1) self.redis_expiration_time = arguments.pop('redis_expiration_time', 0) self.client = self._create_client() def _imports(self): # defer imports until backend is used global redis import redis def _create_client(self): if self.url is not None: return redis.StrictRedis.from_url(url=self.url) else: return redis.StrictRedis(host=self.host, password=self.password, port=self.port, db=self.db) def get_mutex(self, key): if self.distributed_lock: return self.client.lock(u('_lock{}').format(key), self.lock_timeout, self.lock_sleep) else: return None def get(self, key): value = self.client.get(key) if value is None: return NO_VALUE return pickle.loads(value) def get_multi(self, keys): values = self.client.mget(keys) return [pickle.loads(v) if v is not None else NO_VALUE for v in values] def set(self, key, value): if self.redis_expiration_time: self.client.setex(key, self.redis_expiration_time, pickle.dumps(value, pickle.HIGHEST_PROTOCOL)) else: self.client.set(key, pickle.dumps(value, pickle.HIGHEST_PROTOCOL)) def set_multi(self, mapping): mapping = dict( (k, pickle.dumps(v, pickle.HIGHEST_PROTOCOL)) for k, v in mapping.items() ) if not self.redis_expiration_time: self.client.mset(mapping) else: pipe = self.client.pipeline() for key, value in mapping.items(): pipe.setex(key, self.redis_expiration_time, value) pipe.execute() def delete(self, key): self.client.delete(key) def delete_multi(self, keys): self.client.delete(*keys) dogpile.cache-0.5.1/dogpile/cache/compat.py0000644000076500000240000000200312225642516021271 0ustar 
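The ``set()``/``set_multi()`` logic above can be exercised without a server. ``FakeRedis`` and ``FakePipeline`` below are illustrative stand-ins (not part of redis-py or dogpile.cache) for the small slice of the redis-py client that the backend uses — ``mset()``, ``setex()`` and ``pipeline()`` — showing how every value is pickled once with ``pickle.HIGHEST_PROTOCOL`` and, when an expiration is configured, written via SETEX commands batched on a pipeline:

```python
import pickle

class FakePipeline:
    """Collects setex() calls and applies them on execute(), like a redis-py pipeline."""
    def __init__(self, store):
        self.store = store
        self.ops = []
    def setex(self, key, ttl, value):
        self.ops.append((key, ttl, value))
    def execute(self):
        for key, ttl, value in self.ops:
            self.store[key] = value  # a real server would also arm the TTL

class FakeRedis:
    """Illustrative stand-in exposing only the client methods set_multi() needs."""
    def __init__(self):
        self.store = {}
    def mset(self, mapping):
        self.store.update(mapping)
    def pipeline(self):
        return FakePipeline(self.store)
    def mget(self, keys):
        return [self.store.get(k) for k in keys]

def set_multi(client, mapping, redis_expiration_time=0):
    # pickle each value exactly once, using the highest protocol
    mapping = dict(
        (k, pickle.dumps(v, pickle.HIGHEST_PROTOCOL))
        for k, v in mapping.items()
    )
    if not redis_expiration_time:
        client.mset(mapping)
    else:
        # N SETEX commands batched into one round trip
        pipe = client.pipeline()
        for key, value in mapping.items():
            pipe.setex(key, redis_expiration_time, value)
        pipe.execute()

client = FakeRedis()
set_multi(client, {"a": 1, "b": [2, 3]}, redis_expiration_time=60)
print([pickle.loads(v) for v in client.mget(["a", "b"])])  # [1, [2, 3]]
```

Pickling up front keeps the expiring and non-expiring paths identical from the value's point of view; only the write command differs.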
classicstaff00000000000000import sys py2k = sys.version_info < (3, 0) py3k = sys.version_info >= (3, 0) py32 = sys.version_info >= (3, 2) jython = sys.platform.startswith('java') try: import threading except ImportError: import dummy_threading as threading if py3k: # pragma: no cover string_types = str, text_type = str string_type = str if py32: callable = callable else: def callable(fn): return hasattr(fn, '__call__') def u(s): return s def ue(s): return s import configparser import io import _thread as thread else: string_types = basestring, text_type = unicode string_type = str def u(s): return unicode(s, "utf-8") def ue(s): return unicode(s, "unicode_escape") import ConfigParser as configparser import StringIO as io callable = callable import thread if py3k or jython: import pickle else: import cPickle as pickledogpile.cache-0.5.1/dogpile/cache/exception.py0000644000076500000240000000072412225642516022014 0ustar classicstaff00000000000000"""Exception classes for dogpile.cache.""" class DogpileCacheException(Exception): """Base Exception for dogpile.cache exceptions to inherit from.""" class RegionAlreadyConfigured(DogpileCacheException): """CacheRegion instance is already configured.""" class RegionNotConfigured(DogpileCacheException): """CacheRegion instance has not been configured.""" class ValidationError(DogpileCacheException): """Error validating a value or option.""" dogpile.cache-0.5.1/dogpile/cache/plugins/0000755000076500000240000000000012225644023021115 5ustar classicstaff00000000000000dogpile.cache-0.5.1/dogpile/cache/plugins/__init__.py0000644000076500000240000000000012225642516023221 0ustar classicstaff00000000000000dogpile.cache-0.5.1/dogpile/cache/plugins/mako_cache.py0000644000076500000240000000561612225642516023556 0ustar classicstaff00000000000000""" Mako Integration ---------------- dogpile.cache includes a `Mako `_ plugin that replaces `Beaker `_ as the cache backend. 
Set up a Mako template lookup using the "dogpile.cache" cache implementation and a region dictionary:: from dogpile.cache import make_region from mako.lookup import TemplateLookup my_regions = { "local":make_region().configure( "dogpile.cache.dbm", expiration_time=360, arguments={"filename":"file.dbm"} ), "memcached":make_region().configure( "dogpile.cache.pylibmc", expiration_time=3600, arguments={"url":["127.0.0.1"]} ) } mako_lookup = TemplateLookup( directories=["/myapp/templates"], cache_impl="dogpile.cache", cache_args={ 'regions':my_regions } ) To use the above configuration in a template, use the ``cached=True`` argument on any Mako tag which accepts it, in conjunction with the name of the desired region as the ``cache_region`` argument:: <%def name="mysection()" cached="True" cache_region="memcached"> some content that's cached </%def> """ from mako.cache import CacheImpl class MakoPlugin(CacheImpl): """A Mako ``CacheImpl`` which talks to dogpile.cache.""" def __init__(self, cache): super(MakoPlugin, self).__init__(cache) try: self.regions = self.cache.template.cache_args['regions'] except KeyError: raise KeyError( "'regions' argument is required in the cache_args " "dictionary on the Mako Lookup or Template object " "for usage with the dogpile.cache plugin.") def _get_region(self, **kw): try: region = kw['region'] except KeyError: raise KeyError( "'cache_region' argument must be specified with 'cache=True' " "within templates for usage with the dogpile.cache plugin.") try: return self.regions[region] except KeyError: raise KeyError("No such region '%s'" % region) def get_and_replace(self, key, creation_function, **kw): expiration_time = kw.pop("timeout", None) return self._get_region(**kw).get_or_create(key, creation_function, expiration_time=expiration_time) def get_or_create(self, key, creation_function, **kw): return self.get_and_replace(key, creation_function, **kw) def put(self, key, value, **kw): self._get_region(**kw).set(key, value) def get(self, key, **kw): expiration_time =
kw.pop("timeout", None) return self._get_region(**kw).get(key, expiration_time=expiration_time) def invalidate(self, key, **kw): self._get_region(**kw).delete(key) dogpile.cache-0.5.1/dogpile/cache/proxy.py0000644000076500000240000000501612225642516021176 0ustar classicstaff00000000000000""" Proxy Backends ------------------ Provides a utility and a decorator class that allow for modifying the behavior of different backends without altering the class itself or having to extend the base backend. .. versionadded:: 0.5.0 Added support for the :class:`.ProxyBackend` class. """ from .api import CacheBackend class ProxyBackend(CacheBackend): """A decorator class for altering the functionality of backends. Basic usage:: from dogpile.cache import make_region from dogpile.cache.proxy import ProxyBackend class MyFirstProxy(ProxyBackend): def get(self, key): # ... custom code goes here ... return self.proxied.get(key) def set(self, key, value): # ... custom code goes here ... self.proxied.set(key) class MySecondProxy(ProxyBackend): def get(self, key): # ... custom code goes here ... return self.proxied.get(key) region = make_region().configure( 'dogpile.cache.dbm', expiration_time = 3600, arguments = { "filename":"/path/to/cachefile.dbm" }, wrap = [ MyFirstProxy, MySecondProxy ] ) Classes that extend :class:`.ProxyBackend` can be stacked together. The ``.proxied`` property will always point to either the concrete backend instance or the next proxy in the chain that a method can be delegated towards. .. versionadded:: 0.5.0 """ def __init__(self, *args, **kwargs): self.proxied = None def wrap(self, backend): ''' Take a backend as an argument and setup the self.proxied property. Return an object that be used as a backend by a :class:`.CacheRegion` object. 
''' assert(isinstance(backend, CacheBackend) or isinstance(backend, ProxyBackend)) self.proxied = backend return self # # Delegate any functions that are not already overridden to # the proxied backend # def get(self, key): return self.proxied.get(key) def set(self, key, value): self.proxied.set(key, value) def delete(self, key): self.proxied.delete(key) def get_multi(self, keys): return self.proxied.get_multi(keys) def set_multi(self, mapping): self.proxied.set_multi(mapping) def delete_multi(self, keys): self.proxied.delete_multi(keys) def get_mutex(self, key): return self.proxied.get_mutex(key) dogpile.cache-0.5.1/dogpile/cache/region.py0000644000076500000240000013131312225642516021300 0ustar classicstaff00000000000000from __future__ import with_statement from dogpile.core import Lock, NeedRegenerationException from dogpile.core.nameregistry import NameRegistry from . import exception from .util import function_key_generator, PluginLoader, \ memoized_property, coerce_string_conf, function_multi_key_generator from .api import NO_VALUE, CachedValue from .proxy import ProxyBackend from . import compat import time import datetime from numbers import Number from functools import wraps import threading _backend_loader = PluginLoader("dogpile.cache") register_backend = _backend_loader.register from . import backends value_version = 1 """An integer placed in the :class:`.CachedValue` so that new versions of dogpile.cache can detect cached values from a previous, backwards-incompatible version. """ class CacheRegion(object): """A front end to a particular cache backend. :param name: Optional, a string name for the region. This isn't used internally but can be accessed via the ``.name`` attribute, helpful for configuring a region from a config file. :param function_key_generator: Optional. A function that will produce a "cache key" given a data creation function and arguments, when using the :meth:`.CacheRegion.cache_on_arguments` method.
The structure of this function should be two levels: given the data creation function, return a new function that generates the key based on the given arguments. Such as:: def my_key_generator(namespace, fn, **kw): fname = fn.__name__ def generate_key(*arg): return namespace + "_" + fname + "_".join(str(s) for s in arg) return generate_key region = make_region( function_key_generator = my_key_generator ).configure( "dogpile.cache.dbm", expiration_time=300, arguments={ "filename":"file.dbm" } ) The ``namespace`` is that passed to :meth:`.CacheRegion.cache_on_arguments`. It's not consulted outside this function, so in fact can be of any form. For example, it can be passed as a tuple, used to specify arguments to pluck from \**kw:: def my_key_generator(namespace, fn): def generate_key(*arg, **kw): return ":".join( [kw[k] for k in namespace] + [str(x) for x in arg] ) return generate_key Where the decorator might be used as:: @my_region.cache_on_arguments(namespace=('x', 'y')) def my_function(a, b, **kw): return my_data() :param function_multi_key_generator: Optional. Similar to ``function_key_generator`` parameter, but it's used in :meth:`.CacheRegion.cache_multi_on_arguments`. Generated function should return list of keys. For example:: def my_multi_key_generator(namespace, fn, **kw): namespace = fn.__name__ + (namespace or '') def generate_keys(*args): return [namespace + ':' + str(a) for a in args] return generate_keys :param key_mangler: Function which will be used on all incoming keys before passing to the backend. Defaults to ``None``, in which case the key mangling function recommended by the cache backend will be used. A typical mangler is the SHA1 mangler found at :func:`.sha1_mangle_key` which coerces keys into a SHA1 hash, so that the string length is fixed. To disable all key mangling, set to ``False``. 
Another typical mangler is the built-in Python function ``str``, which can be used to convert non-string or Unicode keys to bytestrings, which is needed when using a backend such as bsddb or dbm under Python 2.x in conjunction with Unicode keys. :param async_creation_runner: A callable that, when specified, will be passed to and called by dogpile.lock when there is a stale value present in the cache. It will be passed the mutex and is responsible for releasing that mutex when finished. This can be used to defer the computation of expensive creator functions to later points in the future by way of, for example, a background thread, a long-running queue, or a task manager system like Celery. For a specific example using async_creation_runner, new values can be created in a background thread like so:: import threading def async_creation_runner(cache, somekey, creator, mutex): ''' Used by dogpile.core:Lock when appropriate ''' def runner(): try: value = creator() cache.set(somekey, value) finally: mutex.release() thread = threading.Thread(target=runner) thread.start() region = make_region( async_creation_runner=async_creation_runner, ).configure( 'dogpile.cache.memcached', expiration_time=5, arguments={ 'url': '127.0.0.1:11211', 'distributed_lock': True, } ) Remember that the first request for a key with no associated value will always block; async_creator will not be invoked. However, subsequent requests for cached-but-expired values will still return promptly. They will be refreshed by whatever asynchronous means the provided async_creation_runner callable implements. By default the async_creation_runner is disabled and is set to ``None``. .. versionadded:: 0.4.2 added the async_creation_runner feature.
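The threaded runner pattern shown in the docstring can be exercised end-to-end with stand-in objects. ``DummyCache`` and ``DummyMutex`` below are illustrative only, not dogpile.cache APIs — the contract being demonstrated is simply that the runner regenerates the value in the background and always releases the mutex, even if the creator raises:

```python
import threading

class DummyCache:
    """Stand-in exposing the set() method the runner needs."""
    def __init__(self):
        self.data = {}
    def set(self, key, value):
        self.data[key] = value

class DummyMutex:
    """Stand-in for the dogpile lock handed to the runner."""
    def __init__(self):
        self._lock = threading.Lock()
        self._lock.acquire()  # the runner receives the mutex already held
    def release(self):
        self._lock.release()
    def wait(self):
        self._lock.acquire()  # blocks until the runner releases

def async_creation_runner(cache, somekey, creator, mutex):
    def runner():
        try:
            cache.set(somekey, creator())
        finally:
            mutex.release()  # always release, even if creator() raises
    threading.Thread(target=runner).start()

cache, mutex = DummyCache(), DummyMutex()
async_creation_runner(cache, "k", lambda: "fresh value", mutex)
mutex.wait()  # in real use, callers keep serving the stale value meanwhile
print(cache.data["k"])  # prints "fresh value"
```

The ``wait()`` here exists only to make the demonstration deterministic; in production the whole point is that nobody blocks on the regeneration.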
""" def __init__(self, name=None, function_key_generator=function_key_generator, function_multi_key_generator=function_multi_key_generator, key_mangler=None, async_creation_runner=None, ): """Construct a new :class:`.CacheRegion`.""" self.name = name self.function_key_generator = function_key_generator self.function_multi_key_generator = function_multi_key_generator if key_mangler: self.key_mangler = key_mangler else: self.key_mangler = None self._hard_invalidated = None self._soft_invalidated = None self.async_creation_runner = async_creation_runner def configure(self, backend, expiration_time=None, arguments=None, _config_argument_dict=None, _config_prefix=None, wrap=None ): """Configure a :class:`.CacheRegion`. The :class:`.CacheRegion` itself is returned. :param backend: Required. This is the name of the :class:`.CacheBackend` to use, and is resolved by loading the class from the ``dogpile.cache`` entrypoint. :param expiration_time: Optional. The expiration time passed to the dogpile system. May be passed as an integer number of seconds, or as a ``datetime.timedelta`` value. .. versionadded 0.5.0 ``expiration_time`` may be optionally passed as a ``datetime.timedelta`` value. The :meth:`.CacheRegion.get_or_create` method as well as the :meth:`.CacheRegion.cache_on_arguments` decorator (though note: **not** the :meth:`.CacheRegion.get` method) will call upon the value creation function after this time period has passed since the last generation. :param arguments: Optional. The structure here is passed directly to the constructor of the :class:`.CacheBackend` in use, though is typically a dictionary. :param wrap: Optional. A list of :class:`.ProxyBackend` classes and/or instances, each of which will be applied in a chain to ultimately wrap the original backend, so that custom functionality augmentation can be applied. .. versionadded:: 0.5.0 .. 
seealso:: :ref:`changing_backend_behavior` """ if "backend" in self.__dict__: raise exception.RegionAlreadyConfigured( "This region is already " "configured with backend: %s" % self.backend) backend_cls = _backend_loader.load(backend) if _config_argument_dict: self.backend = backend_cls.from_config_dict( _config_argument_dict, _config_prefix ) else: self.backend = backend_cls(arguments or {}) if not expiration_time or isinstance(expiration_time, Number): self.expiration_time = expiration_time elif isinstance(expiration_time, datetime.timedelta): self.expiration_time = int(expiration_time.total_seconds()) else: raise exception.ValidationError( 'expiration_time is not a number or timedelta.') if self.key_mangler is None: self.key_mangler = self.backend.key_mangler self._lock_registry = NameRegistry(self._create_mutex) if getattr(wrap,'__iter__', False): for wrapper in reversed(wrap): self.wrap(wrapper) return self def wrap(self, proxy): ''' Takes a ProxyBackend instance or class and wraps the attached backend. ''' # if we were passed a type rather than an instance then # initialize it. if type(proxy) == type: proxy = proxy() if not issubclass(type(proxy), ProxyBackend): raise TypeError("Type %s is not a valid ProxyBackend" % type(proxy)) self.backend = proxy.wrap(self.backend) def _mutex(self, key): return self._lock_registry.get(key) class _LockWrapper(object): """weakref-capable wrapper for threading.Lock""" def __init__(self): self.lock = threading.Lock() def acquire(self, wait=True): return self.lock.acquire(wait) def release(self): self.lock.release() def _create_mutex(self, key): mutex = self.backend.get_mutex(key) if mutex is not None: return mutex else: return self._LockWrapper() def invalidate(self, hard=True): """Invalidate this :class:`.CacheRegion`. Invalidation works by setting a current timestamp (using ``time.time()``) representing the "minimum creation time" for a value. 
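A condensed, illustrative model of this timestamp-based invalidation — ``TinyRegion`` is a sketch, not dogpile's implementation (which keeps the creation time in the cached value's metadata), but it shows the essential comparison: nothing is deleted, values created before the cutoff simply read as misses:

```python
import time

class TinyRegion:
    """Illustrative model of invalidate(): values carry a creation time,
    and any value created before the invalidation timestamp reads as a miss."""
    NO_VALUE = object()

    def __init__(self):
        self._data = {}          # key -> (payload, creation_time)
        self._invalidated = None

    def set(self, key, value):
        self._data[key] = (value, time.time())

    def invalidate(self):
        # nothing is removed from the store; only the cutoff moves forward
        self._invalidated = time.time()

    def get(self, key):
        if key not in self._data:
            return self.NO_VALUE
        payload, ct = self._data[key]
        if self._invalidated is not None and ct < self._invalidated:
            return self.NO_VALUE  # created before the cutoff: stale
        return payload

region = TinyRegion()
region.set("k", 42)
time.sleep(0.01)          # ensure distinct timestamps on coarse clocks
region.invalidate()
assert region.get("k") is TinyRegion.NO_VALUE   # stale now
time.sleep(0.01)
region.set("k", 43)       # regenerated after the cutoff
assert region.get("k") == 43
```

Because the cutoff lives on the region instance, other processes sharing the same backend are unaffected — exactly the locality described above.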
Any retrieved value whose creation time is prior to this timestamp is considered to be stale. It does not affect the data in the cache in any way, and is also local to this instance of :class:`.CacheRegion`. Once set, the invalidation time is honored by the :meth:`.CacheRegion.get_or_create`, :meth:`.CacheRegion.get_or_create_multi` and :meth:`.CacheRegion.get` methods. The method supports both "hard" and "soft" invalidation options. With "hard" invalidation, :meth:`.CacheRegion.get_or_create` will force an immediate regeneration of the value which all getters will wait for. With "soft" invalidation, subsequent getters will return the "old" value until the new one is available. Usage of "soft" invalidation requires that the region or the method is given a non-None expiration time. .. versionadded:: 0.3.0 :param hard: if True, cache values will all require immediate regeneration; dogpile logic won't be used. If False, the creation time of existing values will be pushed back before the expiration time so that a return+regen will be invoked. .. versionadded:: 0.5.1 """ if hard: self._hard_invalidated = time.time() self._soft_invalidated = None else: self._hard_invalidated = None self._soft_invalidated = time.time() def configure_from_config(self, config_dict, prefix): """Configure from a configuration dictionary and a prefix. 
Example:: local_region = make_region() memcached_region = make_region() # regions are ready to use for function # decorators, but not yet for actual caching # later, when config is available myconfig = { "cache.local.backend":"dogpile.cache.dbm", "cache.local.arguments.filename":"/path/to/dbmfile.dbm", "cache.memcached.backend":"dogpile.cache.pylibmc", "cache.memcached.arguments.url":"127.0.0.1, 10.0.0.1", } local_region.configure_from_config(myconfig, "cache.local.") memcached_region.configure_from_config(myconfig, "cache.memcached.") """ config_dict = coerce_string_conf(config_dict) return self.configure( config_dict["%sbackend" % prefix], expiration_time = config_dict.get( "%sexpiration_time" % prefix, None), _config_argument_dict=config_dict, _config_prefix="%sarguments." % prefix ) @memoized_property def backend(self): raise exception.RegionNotConfigured( "No backend is configured on this region.") @property def is_configured(self): """Return True if the backend has been configured via the :meth:`.CacheRegion.configure` method already. .. versionadded:: 0.5.1 """ return 'backend' in self.__dict__ def get(self, key, expiration_time=None, ignore_expiration=False): """Return a value from the cache, based on the given key. If the value is not present, the method returns the token ``NO_VALUE``. ``NO_VALUE`` evaluates to False, but is separate from ``None`` to distinguish between a cached value of ``None``. By default, the configured expiration time of the :class:`.CacheRegion`, or alternatively the expiration time supplied by the ``expiration_time`` argument, is tested against the creation time of the retrieved value versus the current time (as reported by ``time.time()``). If stale, the cached value is ignored and the ``NO_VALUE`` token is returned. Passing the flag ``ignore_expiration=True`` bypasses the expiration time check. .. 
versionchanged:: 0.3.0 :meth:`.CacheRegion.get` now checks the value's creation time against the expiration time, rather than returning the value unconditionally. The method also interprets the cached value in terms of the current "invalidation" time as set by the :meth:`.invalidate` method. If a value is present, but its creation time is older than the current invalidation time, the ``NO_VALUE`` token is returned. Passing the flag ``ignore_expiration=True`` bypasses the invalidation time check. .. versionadded:: 0.3.0 Support for the :meth:`.CacheRegion.invalidate` method. :param key: Key to be retrieved. While it's typical for a key to be a string, it is ultimately passed directly down to the cache backend, before being optionally processed by the key_mangler function, so can be of any type recognized by the backend or by the key_mangler function, if present. :param expiration_time: Optional expiration time value which will supersede that configured on the :class:`.CacheRegion` itself. .. versionadded:: 0.3.0 :param ignore_expiration: if ``True``, the value is returned from the cache if present, regardless of configured expiration times or whether or not :meth:`.invalidate` was called. .. 
versionadded:: 0.3.0 """ if self.key_mangler: key = self.key_mangler(key) value = self.backend.get(key) value = self._unexpired_value_fn( expiration_time, ignore_expiration)(value) return value.payload def _unexpired_value_fn(self, expiration_time, ignore_expiration): if ignore_expiration: return lambda value: value else: if expiration_time is None: expiration_time = self.expiration_time current_time = time.time() invalidated = self._hard_invalidated or self._soft_invalidated def value_fn(value): if value is NO_VALUE: return value elif expiration_time is not None and \ current_time - value.metadata["ct"] > expiration_time: return NO_VALUE elif invalidated and \ value.metadata["ct"] < invalidated: return NO_VALUE else: return value return value_fn def get_multi(self, keys, expiration_time=None, ignore_expiration=False): """Return multiple values from the cache, based on the given keys. Returns values as a list matching the keys given. E.g.:: values = region.get_multi(["one", "two", "three"]) To convert values to a dictionary, use ``zip()``:: keys = ["one", "two", "three"] values = region.get_multi(keys) dictionary = dict(zip(keys, values)) Keys which aren't present in the list are returned as the ``NO_VALUE`` token. ``NO_VALUE`` evaluates to False, but is separate from ``None`` to distinguish between a cached value of ``None``. By default, the configured expiration time of the :class:`.CacheRegion`, or alternatively the expiration time supplied by the ``expiration_time`` argument, is tested against the creation time of the retrieved value versus the current time (as reported by ``time.time()``). If stale, the cached value is ignored and the ``NO_VALUE`` token is returned. Passing the flag ``ignore_expiration=True`` bypasses the expiration time check. .. 
versionadded:: 0.5.0 """ if self.key_mangler: keys = map(lambda key: self.key_mangler(key), keys) backend_values = self.backend.get_multi(keys) _unexpired_value_fn = self._unexpired_value_fn( expiration_time, ignore_expiration) return [ value.payload if value is not NO_VALUE else value for value in ( _unexpired_value_fn(value) for value in backend_values ) ] def get_or_create(self, key, creator, expiration_time=None, should_cache_fn=None): """Return a cached value based on the given key. If the value does not exist or is considered to be expired based on its creation time, the given creation function may or may not be used to recreate the value and persist the newly generated value in the cache. Whether or not the function is used depends on if the *dogpile lock* can be acquired or not. If it can't, it means a different thread or process is already running a creation function for this key against the cache. When the dogpile lock cannot be acquired, the method will block if no previous value is available, until the lock is released and a new value available. If a previous value is available, that value is returned immediately without blocking. If the :meth:`.invalidate` method has been called, and the retrieved value's timestamp is older than the invalidation timestamp, the value is unconditionally prevented from being returned. The method will attempt to acquire the dogpile lock to generate a new value, or will wait until the lock is released to return the new value. .. versionchanged:: 0.3.0 The value is unconditionally regenerated if the creation time is older than the last call to :meth:`.invalidate`. :param key: Key to be retrieved. While it's typical for a key to be a string, it is ultimately passed directly down to the cache backend, before being optionally processed by the key_mangler function, so can be of any type recognized by the backend or by the key_mangler function, if present. :param creator: function which creates a new value. 
:param expiration_time: optional expiration time which will override the expiration time already configured on this :class:`.CacheRegion` if not None. To set no expiration, use the value -1. :param should_cache_fn: optional callable function which will receive the value returned by the "creator", and will then return True or False, indicating if the value should actually be cached or not. If it returns False, the value is still returned, but isn't cached. E.g.:: def dont_cache_none(value): return value is not None value = region.get_or_create("some key", create_value, should_cache_fn=dont_cache_none) Above, the function returns the value of create_value() if the cache is invalid, however if the return value is None, it won't be cached. .. versionadded:: 0.4.3 .. seealso:: :meth:`.CacheRegion.cache_on_arguments` - applies :meth:`.get_or_create` to any function using a decorator. :meth:`.CacheRegion.get_or_create_multi` - multiple key/value version """ if self.key_mangler: key = self.key_mangler(key) def get_value(): value = self.backend.get(key) if value is NO_VALUE or \ value.metadata['v'] != value_version or \ (self._hard_invalidated and value.metadata["ct"] < self._hard_invalidated): raise NeedRegenerationException() ct = value.metadata["ct"] if self._soft_invalidated: if ct < self._soft_invalidated: ct = time.time() - expiration_time return value.payload, ct def gen_value(): created_value = creator() value = self._value(created_value) if not should_cache_fn or \ should_cache_fn(created_value): self.backend.set(key, value) return value.payload, value.metadata["ct"] if expiration_time is None: expiration_time = self.expiration_time if expiration_time is None and self._soft_invalidated: raise exception.DogpileCacheException( "Non-None expiration time required " "for soft invalidation") if self.async_creation_runner: def async_creator(mutex): return self.async_creation_runner(self, key, creator, mutex) else: async_creator = None with Lock( self._mutex(key),
gen_value, get_value, expiration_time, async_creator) as value: return value def get_or_create_multi(self, keys, creator, expiration_time=None, should_cache_fn=None): """Return a sequence of cached values based on a sequence of keys. The behavior for generation of values based on keys corresponds to that of :meth:`.Region.get_or_create`, with the exception that the ``creator()`` function may be asked to generate any subset of the given keys. The list of keys to be generated is passed to ``creator()``, and ``creator()`` should return the generated values as a sequence corresponding to the order of the keys. The method uses the same approach as :meth:`.Region.get_multi` and :meth:`.Region.set_multi` to get and set values from the backend. :param keys: Sequence of keys to be retrieved. :param creator: function which accepts a sequence of keys and returns a sequence of new values. :param expiration_time: optional expiration time which will override the expiration time already configured on this :class:`.CacheRegion` if not None. To set no expiration, use the value -1. :param should_cache_fn: optional callable function which will receive each value returned by the "creator", and will then return True or False, indicating if the value should actually be cached or not. If it returns False, the value is still returned, but isn't cached. .. versionadded:: 0.5.0 .. seealso:: :meth:`.CacheRegion.cache_multi_on_arguments` :meth:`.CacheRegion.get_or_create` """ def get_value(key): value = values.get(key, NO_VALUE) if value is NO_VALUE or \ value.metadata['v'] != value_version or \ (self._hard_invalidated and value.metadata["ct"] < self._hard_invalidated): # dogpile.core understands a 0 here as # "the value is not available", e.g. # _has_value() will return False.
return value.payload, 0 else: ct = value.metadata["ct"] if self._soft_invalidated: if ct < self._soft_invalidated: ct = time.time() - expiration_time return value.payload, ct def gen_value(): raise NotImplementedError() def async_creator(key, mutex): mutexes[key] = mutex if expiration_time is None: expiration_time = self.expiration_time if expiration_time is None and self._soft_invalidated: raise exception.DogpileCacheException( "Non-None expiration time required " "for soft invalidation") mutexes = {} sorted_unique_keys = sorted(set(keys)) if self.key_mangler: mangled_keys = [self.key_mangler(k) for k in sorted_unique_keys] else: mangled_keys = sorted_unique_keys orig_to_mangled = dict(zip(sorted_unique_keys, mangled_keys)) values = dict(zip(mangled_keys, self.backend.get_multi(mangled_keys))) for orig_key, mangled_key in orig_to_mangled.items(): with Lock( self._mutex(mangled_key), gen_value, lambda: get_value(mangled_key), expiration_time, async_creator=lambda mutex: async_creator(orig_key, mutex)): pass try: if mutexes: # sort the keys, the idea is to prevent deadlocks. # though haven't been able to simulate one anyway. 
keys_to_get = sorted(mutexes) new_values = creator(*keys_to_get) values_w_created = dict( (orig_to_mangled[k], self._value(v)) for k, v in zip(keys_to_get, new_values) ) if not should_cache_fn: self.backend.set_multi(values_w_created) else: self.backend.set_multi(dict( (k, v) for k, v in values_w_created.items() if should_cache_fn(v[0]) )) values.update(values_w_created) return [values[orig_to_mangled[k]].payload for k in keys] finally: for mutex in mutexes.values(): mutex.release() def _value(self, value): """Return a :class:`.CachedValue` given a value.""" return CachedValue(value, { "ct": time.time(), "v": value_version }) def set(self, key, value): """Place a new value in the cache under the given key.""" if self.key_mangler: key = self.key_mangler(key) self.backend.set(key, self._value(value)) def set_multi(self, mapping): """Place new values in the cache under the given keys. .. versionadded:: 0.5.0 """ if self.key_mangler: mapping = dict((self.key_mangler(k), self._value(v)) for k, v in mapping.items()) else: mapping = dict((k, self._value(v)) for k, v in mapping.items()) self.backend.set_multi(mapping) def delete(self, key): """Remove a value from the cache. This operation is idempotent (can be called multiple times, or on a non-existent key, safely) """ if self.key_mangler: key = self.key_mangler(key) self.backend.delete(key) def delete_multi(self, keys): """Remove multiple values from the cache. This operation is idempotent (can be called multiple times, or on a non-existent key, safely) .. versionadded:: 0.5.0 """ if self.key_mangler: keys = map(lambda key: self.key_mangler(key), keys) self.backend.delete_multi(keys) def cache_on_arguments(self, namespace=None, expiration_time=None, should_cache_fn=None, to_str=compat.string_type): """A function decorator that will cache the return value of the function using a key derived from the function itself and its arguments. 
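Stripped of dogpile's locking and expiration machinery, the decorator's core is "derive a key from the call, then get-or-create". A minimal stdlib-only sketch — this ``cache_on_arguments`` is an illustrative stand-in, not the real API:

```python
import functools

def cache_on_arguments(cache, namespace=None):
    """Minimal illustration of the decorator: derive a key from the
    function and its arguments, then consult the cache before calling."""
    def decorator(fn):
        prefix = "%s:%s|%s" % (fn.__module__, fn.__name__, namespace or "")
        def generate_key(*args):
            return prefix + "|" + " ".join(str(a) for a in args)
        @functools.wraps(fn)
        def decorated(*args):
            key = generate_key(*args)
            if key not in cache:
                cache[key] = fn(*args)   # creator runs only on a miss
            return cache[key]
        # mirror the invalidate() attribute described above
        decorated.invalidate = lambda *args: cache.pop(generate_key(*args), None)
        return decorated
    return decorator

cache = {}
calls = []

@cache_on_arguments(cache, namespace="foo")
def one(a, b):
    calls.append((a, b))
    return a + b

assert one(3, 4) == 7
assert one(3, 4) == 7       # served from cache; creator ran once
assert len(calls) == 1
one.invalidate(3, 4)
assert one(3, 4) == 7 and len(calls) == 2   # regenerated after invalidate()
```

The real decorator differs mainly in that the miss path goes through :meth:`.CacheRegion.get_or_create`, which adds the dogpile lock, expiration checks, and the pluggable key generator.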
The decorator internally makes use of the :meth:`.CacheRegion.get_or_create` method to access the cache and conditionally call the function. See that method for additional behavioral details. E.g.:: @someregion.cache_on_arguments() def generate_something(x, y): return somedatabase.query(x, y) The decorated function can then be called normally, where data will be pulled from the cache region unless a new value is needed:: result = generate_something(5, 6) The function is also given an attribute ``invalidate()``, which provides for invalidation of the value. Pass to ``invalidate()`` the same arguments you'd pass to the function itself to represent a particular value:: generate_something.invalidate(5, 6) Another attribute ``set()`` is added to provide extra caching possibilities relative to the function. This is a convenience method for :meth:`.CacheRegion.set` which will store a given value directly without calling the decorated function. The value to be cached is passed as the first argument, and the arguments which would normally be passed to the function should follow:: generate_something.set(3, 5, 6) The above example is equivalent to calling ``generate_something(5, 6)``, if the function were to produce the value ``3`` as the value to be cached. .. versionadded:: 0.4.1 Added ``set()`` method to decorated function. Similar to ``set()`` is ``refresh()``. This attribute will invoke the decorated function and populate a new value into the cache with the new value, as well as returning that value:: newvalue = generate_something.refresh(5, 6) .. versionadded:: 0.5.0 Added ``refresh()`` method to decorated function. The default key generation will use the name of the function, the module name for the function, the arguments passed, as well as an optional "namespace" parameter in order to generate a cache key. 
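The default key format described above can be illustrated with a small, stdlib-only sketch that mirrors the ``function_key_generator`` shown later in ``util.py`` (simplified here: the ``self``/``cls`` skipping is omitted, and ``one`` is a throwaway example function):

```python
def function_key_generator(namespace, fn, to_str=str):
    # Mirrors dogpile.cache.util.function_key_generator:
    # "module:name", an optional namespace, then the stringified
    # positional arguments joined by spaces.
    if namespace is None:
        namespace = '%s:%s' % (fn.__module__, fn.__name__)
    else:
        namespace = '%s:%s|%s' % (fn.__module__, fn.__name__, namespace)

    def generate_key(*args, **kw):
        if kw:
            raise ValueError(
                "the default key creation function "
                "does not accept keyword arguments.")
        return namespace + "|" + " ".join(map(to_str, args))
    return generate_key

def one(a, b):
    return a + b

keygen = function_key_generator("foo", one)
key = keygen(3, 4)
# e.g. "myapp.tools:one|foo|3 4" when defined in myapp/tools.py
```

Note that keyword arguments are rejected by the default generator; a custom ``function_key_generator`` passed to ``make_region()`` would be needed to support them.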
Given a function ``one`` inside the module ``myapp.tools``::

            @region.cache_on_arguments(namespace="foo")
            def one(a, b):
                return a + b

        Above, calling ``one(3, 4)`` will produce a
        cache key as follows::

            myapp.tools:one|foo|3 4

        The key generator will ignore an initial argument
        of ``self`` or ``cls``, making the decorator suitable
        (with caveats) for use with instance or class methods.
        Given the example::

            class MyClass(object):
                @region.cache_on_arguments(namespace="foo")
                def one(self, a, b):
                    return a + b

        The cache key above for ``MyClass().one(3, 4)`` will
        again produce the same cache key of ``myapp.tools:one|foo|3 4`` -
        the name ``self`` is skipped.

        The ``namespace`` parameter is optional, and is used
        normally to disambiguate two functions of the same
        name within the same module, as can occur when decorating
        instance or class methods as below::

            class MyClass(object):
                @region.cache_on_arguments(namespace='MC')
                def somemethod(self, x, y):
                    ""

            class MyOtherClass(object):
                @region.cache_on_arguments(namespace='MOC')
                def somemethod(self, x, y):
                    ""

        Above, the ``namespace`` parameter disambiguates
        between ``somemethod`` on ``MyClass`` and ``MyOtherClass``.
        Python class declaration mechanics otherwise prevent
        the decorator from having awareness of the ``MyClass``
        and ``MyOtherClass`` names, as the function is received
        by the decorator before it becomes an instance method.

        The function key generation can be entirely replaced
        on a per-region basis using the ``function_key_generator``
        argument present on :func:`.make_region` and
        :class:`.CacheRegion`. It defaults to
        :func:`.function_key_generator`.

        :param namespace: optional string argument which will be
         established as part of the cache key.   This may be needed
         to disambiguate functions of the same name within the same
         source file, such as those
         associated with classes - note that the decorator itself
         can't see the parent class on a function as the class is
         being declared.
:param expiration_time: if not None, will override the normal expiration time. May be specified as a callable, taking no arguments, that returns a value to be used as the ``expiration_time``. This callable will be called whenever the decorated function itself is called, in caching or retrieving. Thus, this can be used to determine a *dynamic* expiration time for the cached function result. Example use cases include "cache the result until the end of the day, week or time period" and "cache until a certain date or time passes". .. versionchanged:: 0.5.0 ``expiration_time`` may be passed as a callable to :meth:`.CacheRegion.cache_on_arguments`. :param should_cache_fn: passed to :meth:`.CacheRegion.get_or_create`. .. versionadded:: 0.4.3 :param to_str: callable, will be called on each function argument in order to convert to a string. Defaults to ``str()``. If the function accepts non-ascii unicode arguments on Python 2.x, the ``unicode()`` builtin can be substituted, but note this will produce unicode cache keys which may require key mangling before reaching the cache. .. versionadded:: 0.5.0 .. 
seealso:: :meth:`.CacheRegion.cache_multi_on_arguments` :meth:`.CacheRegion.get_or_create` """ expiration_time_is_callable = compat.callable(expiration_time) def decorator(fn): if to_str is compat.string_type: # backwards compatible key_generator = self.function_key_generator(namespace, fn) else: key_generator = self.function_key_generator(namespace, fn, to_str=to_str) @wraps(fn) def decorate(*arg, **kw): key = key_generator(*arg, **kw) @wraps(fn) def creator(): return fn(*arg, **kw) timeout = expiration_time() if expiration_time_is_callable \ else expiration_time return self.get_or_create(key, creator, timeout, should_cache_fn) def invalidate(*arg, **kw): key = key_generator(*arg, **kw) self.delete(key) def set_(value, *arg, **kw): key = key_generator(*arg, **kw) self.set(key, value) def refresh(*arg, **kw): key = key_generator(*arg, **kw) value = fn(*arg, **kw) self.set(key, value) return value decorate.set = set_ decorate.invalidate = invalidate decorate.refresh = refresh return decorate return decorator def cache_multi_on_arguments(self, namespace=None, expiration_time=None, should_cache_fn=None, asdict=False, to_str=compat.string_type): """A function decorator that will cache multiple return values from the function using a sequence of keys derived from the function itself and the arguments passed to it. This method is the "multiple key" analogue to the :meth:`.CacheRegion.cache_on_arguments` method. Example:: @someregion.cache_multi_on_arguments() def generate_something(*keys): return [ somedatabase.query(key) for key in keys ] The decorated function can be called normally. The decorator will produce a list of cache keys using a mechanism similar to that of :meth:`.CacheRegion.cache_on_arguments`, combining the name of the function with the optional namespace and with the string form of each key. 
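The per-key form of that combined name can be sketched with a stdlib-only mirror of ``function_multi_key_generator`` from ``util.py`` (simplified: no ``self``/``cls`` handling; ``generate_something`` is a stand-in function):

```python
def function_multi_key_generator(namespace, fn, to_str=str):
    # Mirrors dogpile.cache.util.function_multi_key_generator:
    # one cache key per positional argument, each combining
    # module, function name, namespace and the string form of
    # that single argument.
    if namespace is None:
        namespace = '%s:%s' % (fn.__module__, fn.__name__)
    else:
        namespace = '%s:%s|%s' % (fn.__module__, fn.__name__, namespace)

    def generate_keys(*args, **kw):
        if kw:
            raise ValueError(
                "the default key creation function "
                "does not accept keyword arguments.")
        return [namespace + "|" + key for key in map(to_str, args)]
    return generate_keys

def generate_something(*keys):
    return [k.upper() for k in keys]

keygen = function_multi_key_generator("foo", generate_something)
keys = keygen("key1", "key2")
# one key per argument, e.g. ["...:generate_something|foo|key1", ...]
```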
It will then consult the cache using the same mechanism as that of :meth:`.CacheRegion.get_multi` to retrieve all current values; the originally passed keys corresponding to those values which aren't generated or need regeneration will be assembled into a new argument list, and the decorated function is then called with that subset of arguments. The returned result is a list:: result = generate_something("key1", "key2", "key3") The decorator internally makes use of the :meth:`.CacheRegion.get_or_create_multi` method to access the cache and conditionally call the function. See that method for additional behavioral details. Unlike the :meth:`.CacheRegion.cache_on_arguments` method, :meth:`.CacheRegion.cache_multi_on_arguments` works only with a single function signature, one which takes a simple list of keys as arguments. Like :meth:`.CacheRegion.cache_on_arguments`, the decorated function is also provided with a ``set()`` method, which here accepts a mapping of keys and values to set in the cache:: generate_something.set({"k1": "value1", "k2": "value2", "k3": "value3"}) an ``invalidate()`` method, which has the effect of deleting the given sequence of keys using the same mechanism as that of :meth:`.CacheRegion.delete_multi`:: generate_something.invalidate("k1", "k2", "k3") and finally a ``refresh()`` method, which will call the creation function, cache the new values, and return them:: values = generate_something.refresh("k1", "k2", "k3") Parameters passed to :meth:`.CacheRegion.cache_multi_on_arguments` have the same meaning as those passed to :meth:`.CacheRegion.cache_on_arguments`. :param namespace: optional string argument which will be established as part of each cache key. :param expiration_time: if not None, will override the normal expiration time. May be passed as an integer or a callable. :param should_cache_fn: passed to :meth:`.CacheRegion.get_or_create_multi`. 
This function is given a value as returned by the creator, and only if it returns True will that value be placed in the cache. :param asdict: if ``True``, the decorated function should return its result as a dictionary of keys->values, and the final result of calling the decorated function will also be a dictionary. If left at its default value of ``False``, the decorated function should return its result as a list of values, and the final result of calling the decorated function will also be a list. When ``asdict==True`` if the dictionary returned by the decorated function is missing keys, those keys will not be cached. :param to_str: callable, will be called on each function argument in order to convert to a string. Defaults to ``str()``. If the function accepts non-ascii unicode arguments on Python 2.x, the ``unicode()`` builtin can be substituted, but note this will produce unicode cache keys which may require key mangling before reaching the cache. .. versionadded:: 0.5.0 .. seealso:: :meth:`.CacheRegion.cache_on_arguments` :meth:`.CacheRegion.get_or_create_multi` """ expiration_time_is_callable = compat.callable(expiration_time) def decorator(fn): key_generator = self.function_multi_key_generator(namespace, fn, to_str=to_str) @wraps(fn) def decorate(*arg, **kw): cache_keys = arg keys = key_generator(*arg, **kw) key_lookup = dict(zip(keys, cache_keys)) @wraps(fn) def creator(*keys_to_create): return fn(*[key_lookup[k] for k in keys_to_create]) timeout = expiration_time() if expiration_time_is_callable \ else expiration_time if asdict: def dict_create(*keys): d_values = creator(*keys) return [d_values.get(key_lookup[k], NO_VALUE) for k in keys] def wrap_cache_fn(value): if value is NO_VALUE: return False elif not should_cache_fn: return True else: return should_cache_fn(value) result = self.get_or_create_multi(keys, dict_create, timeout, wrap_cache_fn) result = dict((k, v) for k, v in zip(cache_keys, result) if v is not NO_VALUE) else: result = 
self.get_or_create_multi(keys, creator, timeout, should_cache_fn) return result def invalidate(*arg): keys = key_generator(*arg) self.delete_multi(keys) def set_(mapping): keys = list(mapping) gen_keys = key_generator(*keys) self.set_multi(dict( (gen_key, mapping[key]) for gen_key, key in zip(gen_keys, keys)) ) def refresh(*arg): keys = key_generator(*arg) values = fn(*arg) if asdict: self.set_multi( dict(zip(keys, [values[a] for a in arg])) ) return values else: self.set_multi( dict(zip(keys, values)) ) return values decorate.set = set_ decorate.invalidate = invalidate decorate.refresh = refresh return decorate return decorator def make_region(*arg, **kw): """Instantiate a new :class:`.CacheRegion`. Currently, :func:`.make_region` is a passthrough to :class:`.CacheRegion`. See that class for constructor arguments. """ return CacheRegion(*arg, **kw) dogpile.cache-0.5.1/dogpile/cache/util.py0000644000076500000240000001326512225642516020777 0ustar classicstaff00000000000000from hashlib import sha1 import inspect import sys import re import collections from . import compat def coerce_string_conf(d): result = {} for k, v in d.items(): if not isinstance(v, compat.string_types): result[k] = v continue v = v.strip() if re.match(r'^[-+]?\d+$', v): result[k] = int(v) elif v.lower() in ('false', 'true'): result[k] = v.lower() == 'true' elif v == 'None': result[k] = None else: result[k] = v return result class PluginLoader(object): def __init__(self, group): self.group = group self.impls = {} def load(self, name): if name in self.impls: return self.impls[name]() else: #pragma NO COVERAGE # TODO: if someone has ideas on how to # unit test entrypoint stuff, let me know. 
import pkg_resources
            for impl in pkg_resources.iter_entry_points(
                    self.group, name):
                self.impls[name] = impl.load
                return impl.load()
            else:
                raise Exception(
                    "Can't load plugin %s %s" %
                    (self.group, name))

    def register(self, name, modulepath, objname):
        def load():
            mod = __import__(modulepath)
            for token in modulepath.split(".")[1:]:
                mod = getattr(mod, token)
            return getattr(mod, objname)
        self.impls[name] = load


def function_key_generator(namespace, fn, to_str=compat.string_type):
    """Return a function that generates a string
    key, based on a given function as well as
    arguments to the returned function itself.

    This is used by :meth:`.CacheRegion.cache_on_arguments`
    to generate a cache key from a decorated function.

    It can be replaced using the ``function_key_generator``
    argument passed to :func:`.make_region`.

    """
    if namespace is None:
        namespace = '%s:%s' % (fn.__module__, fn.__name__)
    else:
        namespace = '%s:%s|%s' % (fn.__module__, fn.__name__, namespace)

    args = inspect.getargspec(fn)
    has_self = args[0] and args[0][0] in ('self', 'cls')

    def generate_key(*args, **kw):
        if kw:
            raise ValueError(
                "dogpile.cache's default key creation "
                "function does not accept keyword arguments.")
        if has_self:
            args = args[1:]
        return namespace + "|" + " ".join(map(to_str, args))
    return generate_key


def function_multi_key_generator(namespace, fn, to_str=compat.string_type):
    if namespace is None:
        namespace = '%s:%s' % (fn.__module__, fn.__name__)
    else:
        namespace = '%s:%s|%s' % (fn.__module__, fn.__name__, namespace)

    args = inspect.getargspec(fn)
    has_self = args[0] and args[0][0] in ('self', 'cls')

    def generate_keys(*args, **kw):
        if kw:
            raise ValueError(
                "dogpile.cache's default key creation "
                "function does not accept keyword arguments.")
        if has_self:
            args = args[1:]
        return [namespace + "|" + key for key in map(to_str, args)]
    return generate_keys


def sha1_mangle_key(key):
    """a SHA1 key mangler."""

    return sha1(key).hexdigest()


def length_conditional_mangler(length, mangler):
    """a key mangler that
mangles if the length of the key is
    past a certain threshold.

    """
    def mangle(key):
        if len(key) >= length:
            return mangler(key)
        else:
            return key
    return mangle


class memoized_property(object):
    """A read-only @property that is only evaluated once."""
    def __init__(self, fget, doc=None):
        self.fget = fget
        self.__doc__ = doc or fget.__doc__
        self.__name__ = fget.__name__

    def __get__(self, obj, cls):
        if obj is None:
            return self
        obj.__dict__[self.__name__] = result = self.fget(obj)
        return result


def to_list(x, default=None):
    """Coerce to a list."""
    if x is None:
        return default
    if not isinstance(x, (list, tuple)):
        return [x]
    else:
        return x


class KeyReentrantMutex(object):

    def __init__(self, key, mutex, keys):
        self.key = key
        self.mutex = mutex
        self.keys = keys

    @classmethod
    def factory(cls, mutex):
        # this collection holds zero or one
        # thread idents as the key; a set of
        # keynames held as the value.
        keystore = collections.defaultdict(set)

        def fac(key):
            return KeyReentrantMutex(key, mutex, keystore)
        return fac

    def acquire(self, wait=True):
        current_thread = compat.threading.current_thread().ident
        keys = self.keys.get(current_thread)
        if keys is not None and \
                self.key not in keys:
            # current lockholder, new key. add it in
            keys.add(self.key)
            return True
        elif self.mutex.acquire(wait=wait):
            # after acquire, create new set and add our key
            self.keys[current_thread].add(self.key)
            return True
        else:
            return False

    def release(self):
        current_thread = compat.threading.current_thread().ident
        keys = self.keys.get(current_thread)
        assert keys is not None, "this thread didn't do the acquire"
        assert self.key in keys, "No acquire held for key '%s'" % self.key
        keys.remove(self.key)
        if not keys:
            # when list of keys empty, remove
            # the thread ident and unlock.
del self.keys[current_thread] self.mutex.release() dogpile.cache-0.5.1/dogpile.cache.egg-info/0000755000076500000240000000000012225644023021125 5ustar classicstaff00000000000000dogpile.cache-0.5.1/dogpile.cache.egg-info/dependency_links.txt0000644000076500000240000000000112225644022025172 0ustar classicstaff00000000000000 dogpile.cache-0.5.1/dogpile.cache.egg-info/entry_points.txt0000644000076500000240000000012012225644022024413 0ustar classicstaff00000000000000 [mako.cache] dogpile = dogpile.cache.plugins.mako:MakoPlugin dogpile.cache-0.5.1/dogpile.cache.egg-info/namespace_packages.txt0000644000076500000240000000001012225644022025446 0ustar classicstaff00000000000000dogpile dogpile.cache-0.5.1/dogpile.cache.egg-info/not-zip-safe0000644000076500000240000000000112225642712023356 0ustar classicstaff00000000000000 dogpile.cache-0.5.1/dogpile.cache.egg-info/PKG-INFO0000644000076500000240000001223112225644022022220 0ustar classicstaff00000000000000Metadata-Version: 1.1 Name: dogpile.cache Version: 0.5.1 Summary: A caching front-end based on the Dogpile lock. Home-page: http://bitbucket.org/zzzeek/dogpile.cache Author: Mike Bayer Author-email: mike_mp@zzzcomputing.com License: BSD Description: dogpile.cache ============= A caching API built around the concept of a "dogpile lock", which allows continued access to an expiring data value while a single thread generates a new value. dogpile.cache builds on the `dogpile.core `_ locking system, which implements the idea of "allow one creator to write while others read" in the abstract. Overall, dogpile.cache is intended as a replacement to the `Beaker `_ caching system, the internals of which are written by the same author. All the ideas of Beaker which "work" are re-implemented in dogpile.cache in a more efficient and succinct manner, and all the cruft (Beaker's internals were first written in 2005) relegated to the trash heap. 
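The key-mangler utilities shown earlier in ``util.py`` compose as follows; a stdlib-only sketch (function bodies are verbatim from the source above, with byte-string keys as SHA-1 requires on Python 3):

```python
from hashlib import sha1

def sha1_mangle_key(key):
    # Mirrors dogpile.cache.util.sha1_mangle_key:
    # bytes in, 40-character hex digest out.
    return sha1(key).hexdigest()

def length_conditional_mangler(length, mangler):
    # Mirrors dogpile.cache.util.length_conditional_mangler:
    # keep short keys readable, mangle only those at or
    # past the length threshold.
    def mangle(key):
        if len(key) >= length:
            return mangler(key)
        else:
            return key
    return mangle

mangle = length_conditional_mangler(20, sha1_mangle_key)
short = mangle(b"short_key")   # under the threshold: returned unchanged
long_ = mangle(b"a" * 64)      # at/past the threshold: 40-char hex digest
```

Such a composed mangler is the sort of callable that would be passed as a region's ``key_mangler``, keeping short keys human-readable while hashing long ones down to a fixed size.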
Features -------- * A succinct API which encourages up-front configuration of pre-defined "regions", each one defining a set of caching characteristics including storage backend, configuration options, and default expiration time. * A standard get/set/delete API as well as a function decorator API is provided. * The mechanics of key generation are fully customizable. The function decorator API features a pluggable "key generator" to customize how cache keys are made to correspond to function calls, and an optional "key mangler" feature provides for pluggable mangling of keys (such as encoding, SHA-1 hashing) as desired for each region. * The dogpile lock, first developed as the core engine behind the Beaker caching system, here vastly simplified, improved, and better tested. Some key performance issues that were intrinsic to Beaker's architecture, particularly that values would frequently be "double-fetched" from the cache, have been fixed. * Backends implement their own version of a "distributed" lock, where the "distribution" matches the backend's storage system. For example, the memcached backends allow all clients to coordinate creation of values using memcached itself. The dbm file backend uses a lockfile alongside the dbm file. New backends, such as a Redis-based backend, can provide their own locking mechanism appropriate to the storage engine. * Writing new backends or hacking on the existing backends is intended to be routine - all that's needed are basic get/set/delete methods. A distributed lock tailored towards the backend is an optional addition, else dogpile uses a regular thread mutex. New backends can be registered with dogpile.cache directly or made available via setuptools entry points. * Included backends feature three memcached backends (python-memcached, pylibmc, bmemcached), a Redis backend, a backend based on Python's anydbm, and a plain dictionary backend. 
* Space for third party plugins, including the first which provides the dogpile.cache engine to Mako templates. * Python 3 compatible in place - no 2to3 required. Synopsis -------- dogpile.cache features a single public usage object known as the ``CacheRegion``. This object then refers to a particular ``CacheBackend``. Typical usage generates a region using ``make_region()``, which can then be used at the module level to decorate functions, or used directly in code with a traditional get/set interface. Configuration of the backend is applied to the region using ``configure()`` or ``configure_from_config()``, allowing deferred config-file based configuration to occur after modules have been imported:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.pylibmc', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], 'binary':True, 'behaviors':{"tcp_nodelay": True,"ketama":True} } ) @region.cache_on_arguments() def load_user_info(user_id): return some_database.lookup_user_by_id(user_id) Documentation ------------- See dogpile.cache's full documentation at `dogpile.cache documentation `_. 
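When deferred configuration arrives from a config file as described, the option values are plain strings; ``coerce_string_conf`` in ``util.py`` converts them to native types. A stdlib-only sketch mirroring that function (the sample option names are illustrative only):

```python
import re

def coerce_string_conf(d):
    # Mirrors dogpile.cache.util.coerce_string_conf: integers
    # (including negative and '+'-prefixed), booleans and None
    # are recognized in string form; anything else passes through.
    result = {}
    for k, v in d.items():
        if not isinstance(v, str):
            result[k] = v
            continue
        v = v.strip()
        if re.match(r'^[-+]?\d+$', v):
            result[k] = int(v)
        elif v.lower() in ('false', 'true'):
            result[k] = v.lower() == 'true'
        elif v == 'None':
            result[k] = None
        else:
            result[k] = v
    return result

conf = coerce_string_conf({
    "expiration_time": "-1",   # negative ints coerce (fixed in 0.5.1)
    "binary": "true",
    "url": "127.0.0.1",
})
# conf == {"expiration_time": -1, "binary": True, "url": "127.0.0.1"}
```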
Keywords: caching Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 dogpile.cache-0.5.1/dogpile.cache.egg-info/requires.txt0000644000076500000240000000002312225644022023517 0ustar classicstaff00000000000000dogpile.core>=0.4.1dogpile.cache-0.5.1/dogpile.cache.egg-info/SOURCES.txt0000644000076500000240000000403312225644022023010 0ustar classicstaff00000000000000LICENSE MANIFEST.in README.rst setup.cfg setup.py docs/api.html docs/changelog.html docs/front.html docs/genindex.html docs/index.html docs/py-modindex.html docs/search.html docs/searchindex.js docs/usage.html docs/_sources/api.txt docs/_sources/changelog.txt docs/_sources/front.txt docs/_sources/index.txt docs/_sources/usage.txt docs/_static/basic.css docs/_static/comment-bright.png docs/_static/comment-close.png docs/_static/comment.png docs/_static/doctools.js docs/_static/down-pressed.png docs/_static/down.png docs/_static/file.png docs/_static/jquery.js docs/_static/minus.png docs/_static/nature.css docs/_static/plus.png docs/_static/pygments.css docs/_static/searchtools.js docs/_static/underscore.js docs/_static/up-pressed.png docs/_static/up.png docs/_static/websupport.js docs/build/Makefile docs/build/api.rst docs/build/builder.py docs/build/changelog.rst docs/build/conf.py docs/build/front.rst docs/build/index.rst docs/build/requirements.txt docs/build/usage.rst dogpile/__init__.py dogpile.cache.egg-info/PKG-INFO dogpile.cache.egg-info/SOURCES.txt dogpile.cache.egg-info/dependency_links.txt dogpile.cache.egg-info/entry_points.txt dogpile.cache.egg-info/namespace_packages.txt dogpile.cache.egg-info/not-zip-safe dogpile.cache.egg-info/requires.txt dogpile.cache.egg-info/top_level.txt dogpile/cache/__init__.py dogpile/cache/api.py dogpile/cache/compat.py dogpile/cache/exception.py dogpile/cache/proxy.py 
dogpile/cache/region.py dogpile/cache/util.py dogpile/cache/backends/__init__.py dogpile/cache/backends/file.py dogpile/cache/backends/memcached.py dogpile/cache/backends/memory.py dogpile/cache/backends/redis.py dogpile/cache/plugins/__init__.py dogpile/cache/plugins/mako_cache.py tests/__init__.py tests/cache/__init__.py tests/cache/_fixtures.py tests/cache/test_dbm_backend.py tests/cache/test_decorator.py tests/cache/test_memcached_backend.py tests/cache/test_memory_backend.py tests/cache/test_redis_backend.py tests/cache/test_region.py tests/cache/test_utils.py tests/cache/plugins/__init__.py tests/cache/plugins/test_mako_cache.pydogpile.cache-0.5.1/dogpile.cache.egg-info/top_level.txt0000644000076500000240000000001012225644022023645 0ustar classicstaff00000000000000dogpile dogpile.cache-0.5.1/LICENSE0000644000076500000240000000265012225642516015763 0ustar classicstaff00000000000000Copyright (c) 2011-2013 Mike Bayer All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. The name of the author or contributors may not be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. dogpile.cache-0.5.1/MANIFEST.in0000644000076500000240000000032212225642516016506 0ustar classicstaff00000000000000recursive-include docs *.html *.css *.txt *.js *.jpg *.png *.py Makefile *.rst *.sty recursive-include tests *.py *.dat include README* LICENSE distribute_setup.py CHANGES* test.cfg prune docs/build/output dogpile.cache-0.5.1/PKG-INFO0000644000076500000240000001223112225644023016042 0ustar classicstaff00000000000000Metadata-Version: 1.1 Name: dogpile.cache Version: 0.5.1 Summary: A caching front-end based on the Dogpile lock. Home-page: http://bitbucket.org/zzzeek/dogpile.cache Author: Mike Bayer Author-email: mike_mp@zzzcomputing.com License: BSD Description: dogpile.cache ============= A caching API built around the concept of a "dogpile lock", which allows continued access to an expiring data value while a single thread generates a new value. dogpile.cache builds on the `dogpile.core `_ locking system, which implements the idea of "allow one creator to write while others read" in the abstract. Overall, dogpile.cache is intended as a replacement to the `Beaker `_ caching system, the internals of which are written by the same author. All the ideas of Beaker which "work" are re-implemented in dogpile.cache in a more efficient and succinct manner, and all the cruft (Beaker's internals were first written in 2005) relegated to the trash heap. 
Features -------- * A succinct API which encourages up-front configuration of pre-defined "regions", each one defining a set of caching characteristics including storage backend, configuration options, and default expiration time. * A standard get/set/delete API as well as a function decorator API is provided. * The mechanics of key generation are fully customizable. The function decorator API features a pluggable "key generator" to customize how cache keys are made to correspond to function calls, and an optional "key mangler" feature provides for pluggable mangling of keys (such as encoding, SHA-1 hashing) as desired for each region. * The dogpile lock, first developed as the core engine behind the Beaker caching system, here vastly simplified, improved, and better tested. Some key performance issues that were intrinsic to Beaker's architecture, particularly that values would frequently be "double-fetched" from the cache, have been fixed. * Backends implement their own version of a "distributed" lock, where the "distribution" matches the backend's storage system. For example, the memcached backends allow all clients to coordinate creation of values using memcached itself. The dbm file backend uses a lockfile alongside the dbm file. New backends, such as a Redis-based backend, can provide their own locking mechanism appropriate to the storage engine. * Writing new backends or hacking on the existing backends is intended to be routine - all that's needed are basic get/set/delete methods. A distributed lock tailored towards the backend is an optional addition, else dogpile uses a regular thread mutex. New backends can be registered with dogpile.cache directly or made available via setuptools entry points. * Included backends feature three memcached backends (python-memcached, pylibmc, bmemcached), a Redis backend, a backend based on Python's anydbm, and a plain dictionary backend. 
* Space for third party plugins, including the first which provides the dogpile.cache engine to Mako templates. * Python 3 compatible in place - no 2to3 required. Synopsis -------- dogpile.cache features a single public usage object known as the ``CacheRegion``. This object then refers to a particular ``CacheBackend``. Typical usage generates a region using ``make_region()``, which can then be used at the module level to decorate functions, or used directly in code with a traditional get/set interface. Configuration of the backend is applied to the region using ``configure()`` or ``configure_from_config()``, allowing deferred config-file based configuration to occur after modules have been imported:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.pylibmc', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], 'binary':True, 'behaviors':{"tcp_nodelay": True,"ketama":True} } ) @region.cache_on_arguments() def load_user_info(user_id): return some_database.lookup_user_by_id(user_id) Documentation ------------- See dogpile.cache's full documentation at `dogpile.cache documentation `_. Keywords: caching Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 dogpile.cache-0.5.1/README.rst0000644000076500000240000000767512225642516016461 0ustar classicstaff00000000000000dogpile.cache ============= A caching API built around the concept of a "dogpile lock", which allows continued access to an expiring data value while a single thread generates a new value. dogpile.cache builds on the `dogpile.core `_ locking system, which implements the idea of "allow one creator to write while others read" in the abstract. 
Overall, dogpile.cache is intended as a replacement to the `Beaker `_ caching system, the internals of which are written by the same author. All the ideas of Beaker which "work" are re-implemented in dogpile.cache in a more efficient and succinct manner, and all the cruft (Beaker's internals were first written in 2005) relegated to the trash heap. Features -------- * A succinct API which encourages up-front configuration of pre-defined "regions", each one defining a set of caching characteristics including storage backend, configuration options, and default expiration time. * A standard get/set/delete API as well as a function decorator API is provided. * The mechanics of key generation are fully customizable. The function decorator API features a pluggable "key generator" to customize how cache keys are made to correspond to function calls, and an optional "key mangler" feature provides for pluggable mangling of keys (such as encoding, SHA-1 hashing) as desired for each region. * The dogpile lock, first developed as the core engine behind the Beaker caching system, here vastly simplified, improved, and better tested. Some key performance issues that were intrinsic to Beaker's architecture, particularly that values would frequently be "double-fetched" from the cache, have been fixed. * Backends implement their own version of a "distributed" lock, where the "distribution" matches the backend's storage system. For example, the memcached backends allow all clients to coordinate creation of values using memcached itself. The dbm file backend uses a lockfile alongside the dbm file. New backends, such as a Redis-based backend, can provide their own locking mechanism appropriate to the storage engine. * Writing new backends or hacking on the existing backends is intended to be routine - all that's needed are basic get/set/delete methods. A distributed lock tailored towards the backend is an optional addition, else dogpile uses a regular thread mutex. 
New backends can be registered with dogpile.cache directly or made available via setuptools entry points. * Included backends feature three memcached backends (python-memcached, pylibmc, bmemcached), a Redis backend, a backend based on Python's anydbm, and a plain dictionary backend. * Space for third party plugins, including the first which provides the dogpile.cache engine to Mako templates. * Python 3 compatible in place - no 2to3 required. Synopsis -------- dogpile.cache features a single public usage object known as the ``CacheRegion``. This object then refers to a particular ``CacheBackend``. Typical usage generates a region using ``make_region()``, which can then be used at the module level to decorate functions, or used directly in code with a traditional get/set interface. Configuration of the backend is applied to the region using ``configure()`` or ``configure_from_config()``, allowing deferred config-file based configuration to occur after modules have been imported:: from dogpile.cache import make_region region = make_region().configure( 'dogpile.cache.pylibmc', expiration_time = 3600, arguments = { 'url':["127.0.0.1"], 'binary':True, 'behaviors':{"tcp_nodelay": True,"ketama":True} } ) @region.cache_on_arguments() def load_user_info(user_id): return some_database.lookup_user_by_id(user_id) Documentation ------------- See dogpile.cache's full documentation at `dogpile.cache documentation `_. 
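The README notes that a new backend only needs basic get/set/delete methods. A standalone sketch of such a backend follows; in real use it would subclass `dogpile.cache.api.CacheBackend` and be registered via `register_backend()` (as the package's own test suite does with its mock backend), but the `DictBackend` class here is illustrative and deliberately avoids importing the library:

```python
# Stand-in for dogpile.cache.api.NO_VALUE, the sentinel returned
# for cache misses so that None remains a cacheable value.
NO_VALUE = object()

class DictBackend:
    """A dictionary-backed cache backend sketch.  With dogpile.cache
    installed this would subclass CacheBackend and be registered via
    register_backend("dict", __name__, "DictBackend")."""

    def __init__(self, arguments):
        # backends receive their configuration as a dict of arguments
        self.arguments = arguments
        self._cache = {}

    def get(self, key):
        return self._cache.get(key, NO_VALUE)

    def set(self, key, value):
        self._cache[key] = value

    def delete(self, key):
        # deleting an absent key is a no-op, matching backend semantics
        self._cache.pop(key, None)
```

A distributed lock is optional; a backend like this one would simply fall back to dogpile's regular thread mutex.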
dogpile.cache-0.5.1/setup.cfg0000644000076500000240000000040112225644023016562 0ustar classicstaff00000000000000[egg_info] tag_build = tag_date = 0 tag_svn_revision = 0 [upload_docs] upload-dir = docs/build/output/html [upload] sign = 1 identity = C4DAFEE1 [nosetests] cover-package = dogpile.cache with-coverage = 1 cover-erase = 1 nologcapture = 1 where = tests dogpile.cache-0.5.1/setup.py0000644000076500000240000000235012225642516016465 0ustar classicstaff00000000000000import os import sys import re from setuptools import setup, find_packages v = open(os.path.join(os.path.dirname(__file__), 'dogpile', 'cache', '__init__.py')) VERSION = re.compile(r".*__version__ = '(.*?)'", re.S).match(v.read()).group(1) v.close() readme = os.path.join(os.path.dirname(__file__), 'README.rst') setup(name='dogpile.cache', version=VERSION, description="A caching front-end based on the Dogpile lock.", long_description=open(readme).read(), classifiers=[ 'Development Status :: 4 - Beta', 'Intended Audience :: Developers', 'License :: OSI Approved :: BSD License', 'Programming Language :: Python', 'Programming Language :: Python :: 3', ], keywords='caching', author='Mike Bayer', author_email='mike_mp@zzzcomputing.com', url='http://bitbucket.org/zzzeek/dogpile.cache', license='BSD', packages=find_packages('.', exclude=['ez_setup', 'tests*']), namespace_packages=['dogpile'], entry_points=""" [mako.cache] dogpile = dogpile.cache.plugins.mako:MakoPlugin """, zip_safe=False, install_requires=['dogpile.core>=0.4.1'], test_suite='nose.collector', tests_require=['nose', 'mock'], ) dogpile.cache-0.5.1/tests/0000755000076500000240000000000012225644023016110 5ustar classicstaff00000000000000dogpile.cache-0.5.1/tests/__init__.py0000644000076500000240000000000012225642516020214 0ustar classicstaff00000000000000dogpile.cache-0.5.1/tests/cache/0000755000076500000240000000000012225644023017153 5ustar 
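The setup.py above pulls `__version__` out of the package's `__init__.py` with a regex rather than importing the package (importing could fail before dependencies are installed). The same pattern can be exercised standalone; the file content here is a made-up stand-in:

```python
import re

# Simulated contents of dogpile/cache/__init__.py
init_py = "# package docstring\n__version__ = '0.5.1'\n"

# Same regex as setup.py: re.S lets '.*' span the lines before
# the __version__ assignment; the lazy group captures the version.
VERSION = re.compile(r".*__version__ = '(.*?)'", re.S).match(init_py).group(1)
print(VERSION)  # -> 0.5.1
```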
classicstaff00000000000000dogpile.cache-0.5.1/tests/cache/__init__.py0000644000076500000240000000173412225642516021276 0ustar classicstaff00000000000000import re from nose import SkipTest from functools import wraps from dogpile.cache import compat def eq_(a, b, msg=None): """Assert a == b, with repr messaging on failure.""" assert a == b, msg or "%r != %r" % (a, b) def is_(a, b, msg=None): """Assert a is b, with repr messaging on failure.""" assert a is b, msg or "%r is not %r" % (a, b) def ne_(a, b, msg=None): """Assert a != b, with repr messaging on failure.""" assert a != b, msg or "%r == %r" % (a, b) def assert_raises_message(except_cls, msg, callable_, *args, **kwargs): try: callable_(*args, **kwargs) assert False, "Callable did not raise an exception" except except_cls as e: assert re.search(msg, str(e)), "%r !~ %s" % (msg, e) from dogpile.cache.compat import configparser, io def requires_py3k(fn): @wraps(fn) def wrap(*arg, **kw): if compat.py2k: raise SkipTest("Python 3 required") return fn(*arg, **kw) return wrapdogpile.cache-0.5.1/tests/cache/_fixtures.py0000644000076500000240000002633112225642516021547 0ustar classicstaff00000000000000from dogpile.cache.api import CacheBackend, CachedValue, NO_VALUE from dogpile.cache import register_backend, CacheRegion, util from dogpile.cache.region import _backend_loader from . 
import eq_, assert_raises_message import itertools import time from nose import SkipTest from threading import Thread, Lock from dogpile.cache.compat import thread from unittest import TestCase import random import collections class _GenericBackendFixture(object): @classmethod def setup_class(cls): try: backend_cls = _backend_loader.load(cls.backend) backend = backend_cls(cls.config_args.get('arguments', {})) except ImportError: raise SkipTest("Backend %s not installed" % cls.backend) cls._check_backend_available(backend) def tearDown(self): if self._region_inst: for key in self._keys: self._region_inst.delete(key) self._keys.clear() elif self._backend_inst: self._backend_inst.delete("some_key") @classmethod def _check_backend_available(cls, backend): pass region_args = {} config_args = {} _region_inst = None _backend_inst = None _keys = set() def _region(self, backend=None, region_args={}, config_args={}): _region_args = self.region_args.copy() _region_args.update(**region_args) _config_args = self.config_args.copy() _config_args.update(config_args) def _store_keys(key): if existing_key_mangler: key = existing_key_mangler(key) self._keys.add(key) return key self._region_inst = reg = CacheRegion(**_region_args) existing_key_mangler = self._region_inst.key_mangler self._region_inst.key_mangler = _store_keys reg.configure(backend or self.backend, **_config_args) return reg def _backend(self): backend_cls = _backend_loader.load(self.backend) _config_args = self.config_args.copy() self._backend_inst = backend_cls(_config_args.get('arguments', {})) return self._backend_inst class _GenericBackendTest(_GenericBackendFixture, TestCase): def test_backend_get_nothing(self): backend = self._backend() eq_(backend.get("some_key"), NO_VALUE) def test_backend_delete_nothing(self): backend = self._backend() backend.delete("some_key") def test_backend_set_get_value(self): backend = self._backend() backend.set("some_key", "some value") eq_(backend.get("some_key"), "some value") def 
test_backend_delete(self): backend = self._backend() backend.set("some_key", "some value") backend.delete("some_key") eq_(backend.get("some_key"), NO_VALUE) def test_region_set_get_value(self): reg = self._region() reg.set("some key", "some value") eq_(reg.get("some key"), "some value") def test_region_set_multiple_values(self): reg = self._region() values = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'} reg.set_multi(values) eq_(values['key1'], reg.get('key1')) eq_(values['key2'], reg.get('key2')) eq_(values['key3'], reg.get('key3')) def test_region_get_multiple_values(self): reg = self._region() key1 = 'value1' key2 = 'value2' key3 = 'value3' reg.set('key1', key1) reg.set('key2', key2) reg.set('key3', key3) values = reg.get_multi(['key1', 'key2', 'key3']) eq_( [key1, key2, key3], values ) def test_region_get_nothing_multiple(self): reg = self._region() values = {'key1': 'value1', 'key3': 'value3', 'key5': 'value5'} reg.set_multi(values) reg_values = reg.get_multi(['key1', 'key2', 'key3', 'key4', 'key5', 'key6']) eq_( reg_values, ["value1", NO_VALUE, "value3", NO_VALUE, "value5", NO_VALUE ] ) def test_region_delete_multiple(self): reg = self._region() values = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'} reg.set_multi(values) reg.delete_multi(['key2', 'key10']) eq_(values['key1'], reg.get('key1')) eq_(NO_VALUE, reg.get('key2')) eq_(values['key3'], reg.get('key3')) eq_(NO_VALUE, reg.get('key10')) def test_region_set_get_nothing(self): reg = self._region() eq_(reg.get("some key"), NO_VALUE) def test_region_creator(self): reg = self._region() def creator(): return "some value" eq_(reg.get_or_create("some key", creator), "some value") def test_threaded_dogpile(self): # run a basic dogpile concurrency test. # note the concurrency of dogpile itself # is intensively tested as part of dogpile. 
reg = self._region(config_args={"expiration_time": .25}) lock = Lock() canary = [] def creator(): ack = lock.acquire(False) canary.append(ack) time.sleep(.5) if ack: lock.release() return "some value" def f(): for x in range(5): reg.get_or_create("some key", creator) time.sleep(.5) threads = [Thread(target=f) for i in range(5)] for t in threads: t.start() for t in threads: t.join() assert len(canary) > 3 assert False not in canary def test_threaded_get_multi(self): reg = self._region(config_args={"expiration_time": .25}) locks = dict((str(i), Lock()) for i in range(11)) canary = collections.defaultdict(list) def creator(*keys): assert keys ack = [locks[key].acquire(False) for key in keys] #print( # ("%s " % thread.get_ident()) + \ # ", ".join(sorted("%s=%s" % (key, acq) # for acq, key in zip(ack, keys))) # ) for acq, key in zip(ack, keys): canary[key].append(acq) time.sleep(.5) for acq, key in zip(ack, keys): if acq: locks[key].release() return ["some value %s" % k for k in keys] def f(): for x in range(5): reg.get_or_create_multi( [str(random.randint(1, 10)) for i in range(random.randint(1, 5))], creator) time.sleep(.5) f() return threads = [Thread(target=f) for i in range(5)] for t in threads: t.start() for t in threads: t.join() assert sum([len(v) for v in canary.values()]) > 10 for l in canary.values(): assert False not in l def test_region_delete(self): reg = self._region() reg.set("some key", "some value") reg.delete("some key") reg.delete("some key") eq_(reg.get("some key"), NO_VALUE) def test_region_expire(self): reg = self._region(config_args={"expiration_time": .25}) counter = itertools.count(1) def creator(): return "some value %d" % next(counter) eq_(reg.get_or_create("some key", creator), "some value 1") time.sleep(.4) eq_(reg.get("some key", ignore_expiration=True), "some value 1") eq_(reg.get_or_create("some key", creator), "some value 2") eq_(reg.get("some key"), "some value 2") def test_decorated_fn_functionality(self): # test for any quirks in the 
fn decoration that interact # with the backend. reg = self._region() counter = itertools.count(1) @reg.cache_on_arguments() def my_function(x, y): return next(counter) + x + y eq_(my_function(3, 4), 8) eq_(my_function(5, 6), 13) eq_(my_function(3, 4), 8) eq_(my_function(4, 3), 10) my_function.invalidate(4, 3) eq_(my_function(4, 3), 11) def test_exploding_value_fn(self): reg = self._region() def boom(): raise Exception("boom") assert_raises_message( Exception, "boom", reg.get_or_create, "some_key", boom ) class _GenericMutexTest(_GenericBackendFixture, TestCase): def test_mutex(self): backend = self._backend() mutex = backend.get_mutex("foo") ac = mutex.acquire() assert ac ac2 = mutex.acquire(False) assert not ac2 mutex.release() ac3 = mutex.acquire() assert ac3 mutex.release() def test_mutex_threaded(self): backend = self._backend() mutex = backend.get_mutex("foo") lock = Lock() canary = [] def f(): for x in range(5): mutex = backend.get_mutex("foo") mutex.acquire() for y in range(5): ack = lock.acquire(False) canary.append(ack) time.sleep(.002) if ack: lock.release() mutex.release() time.sleep(.02) threads = [Thread(target=f) for i in range(5)] for t in threads: t.start() for t in threads: t.join() assert False not in canary def test_mutex_reentrant_across_keys(self): backend = self._backend() for x in range(3): m1 = backend.get_mutex("foo") m2 = backend.get_mutex("bar") try: m1.acquire() assert m2.acquire(False) assert not m2.acquire(False) m2.release() assert m2.acquire(False) assert not m2.acquire(False) m2.release() finally: m1.release() def test_reentrant_dogpile(self): reg = self._region() def create_foo(): return "foo" + reg.get_or_create("bar", create_bar) def create_bar(): return "bar" eq_( reg.get_or_create("foo", create_foo), "foobar" ) eq_( reg.get_or_create("foo", create_foo), "foobar" ) class MockMutex(object): def __init__(self, key): self.key = key def acquire(self, blocking=True): return True def release(self): return class 
MockBackend(CacheBackend): def __init__(self, arguments): self.arguments = arguments self._cache = {} def get_mutex(self, key): return MockMutex(key) def get(self, key): try: return self._cache[key] except KeyError: return NO_VALUE def get_multi(self, keys): return [ self.get(key) for key in keys ] def set(self, key, value): self._cache[key] = value def set_multi(self, mapping): for key,value in mapping.items(): self.set(key, value) def delete(self, key): self._cache.pop(key, None) def delete_multi(self, keys): for key in keys: self.delete(key) register_backend("mock", __name__, "MockBackend") dogpile.cache-0.5.1/tests/cache/plugins/0000755000076500000240000000000012225644023020634 5ustar classicstaff00000000000000dogpile.cache-0.5.1/tests/cache/plugins/__init__.py0000644000076500000240000000000012225642516022740 0ustar classicstaff00000000000000dogpile.cache-0.5.1/tests/cache/plugins/test_mako_cache.py0000644000076500000240000000253512225642516024331 0ustar classicstaff00000000000000from .. 
import eq_ from unittest import TestCase from nose import SkipTest try: import mako except ImportError: raise SkipTest("this test suite requires mako templates") from mako.template import Template from mako.cache import register_plugin import mock register_plugin("dogpile.cache", "dogpile.cache.plugins.mako_cache", "MakoPlugin") class TestMakoPlugin(TestCase): def _mock_fixture(self): reg = mock.MagicMock() reg.get_or_create.return_value = "hello world" my_regions = { "myregion": reg } return { 'cache_impl': 'dogpile.cache', 'cache_args': {'regions': my_regions} }, reg def test_basic(self): kw, reg = self._mock_fixture() t = Template( '<%page cached="True" cache_region="myregion"/>hi', **kw ) t.render() eq_(reg.get_or_create.call_count, 1) def test_timeout(self): kw, reg = self._mock_fixture() t = Template(""" <%def name="mydef()" cached="True" cache_region="myregion" cache_timeout="20"> some content ${mydef()} """, **kw) t.render() eq_( reg.get_or_create.call_args[1], {"expiration_time": 20} ) dogpile.cache-0.5.1/tests/cache/test_dbm_backend.py0000644000076500000240000000317112225642516023004 0ustar classicstaff00000000000000from ._fixtures import _GenericBackendTest, _GenericMutexTest from . 
import eq_, assert_raises_message from unittest import TestCase from threading import Thread import time import os from nose import SkipTest try: import fcntl except ImportError: raise SkipTest("fcntl not available") class DBMBackendTest(_GenericBackendTest): backend = "dogpile.cache.dbm" config_args = { "arguments":{ "filename":"test.dbm" } } class DBMBackendNoLockTest(_GenericBackendTest): backend = "dogpile.cache.dbm" config_args = { "arguments":{ "filename":"test.dbm", "rw_lockfile":False, "dogpile_lockfile":False, } } class DBMMutexTest(_GenericMutexTest): backend = "dogpile.cache.dbm" config_args = { "arguments":{ "filename":"test.dbm" } } def test_release_assertion_thread(self): backend = self._backend() m1 = backend.get_mutex("foo") assert_raises_message( AssertionError, "this thread didn't do the acquire", m1.release ) def test_release_assertion_key(self): backend = self._backend() m1 = backend.get_mutex("foo") m2 = backend.get_mutex("bar") m1.acquire() try: assert_raises_message( AssertionError, "No acquire held for key 'bar'", m2.release ) finally: m1.release() def teardown(): for fname in os.listdir(os.curdir): if fname.startswith("test.dbm"): os.unlink(fname) dogpile.cache-0.5.1/tests/cache/test_decorator.py0000644000076500000240000002702012225642516022554 0ustar classicstaff00000000000000#! coding: utf-8 from ._fixtures import _GenericBackendFixture from . 
import eq_, requires_py3k from unittest import TestCase import time from dogpile.cache import util, compat import itertools from dogpile.cache.api import NO_VALUE class DecoratorTest(_GenericBackendFixture, TestCase): backend = "dogpile.cache.memory" def _fixture(self, namespace=None, expiration_time=None): reg = self._region(config_args={"expiration_time":.25}) counter = itertools.count(1) @reg.cache_on_arguments(namespace=namespace, expiration_time=expiration_time) def go(a, b): val = next(counter) return val, a, b return go def _multi_fixture(self, namespace=None, expiration_time=None): reg = self._region(config_args={"expiration_time":.25}) counter = itertools.count(1) @reg.cache_multi_on_arguments(namespace=namespace, expiration_time=expiration_time) def go(*args): val = next(counter) return ["%d %s" % (val, arg) for arg in args] return go def test_decorator(self): go = self._fixture() eq_(go(1, 2), (1, 1, 2)) eq_(go(3, 4), (2, 3, 4)) eq_(go(1, 2), (1, 1, 2)) time.sleep(.3) eq_(go(1, 2), (3, 1, 2)) def test_decorator_namespace(self): # TODO: test the namespace actually # working somehow... 
go = self._fixture(namespace="x") eq_(go(1, 2), (1, 1, 2)) eq_(go(3, 4), (2, 3, 4)) eq_(go(1, 2), (1, 1, 2)) time.sleep(.3) eq_(go(1, 2), (3, 1, 2)) def test_decorator_custom_expire(self): go = self._fixture(expiration_time=.5) eq_(go(1, 2), (1, 1, 2)) eq_(go(3, 4), (2, 3, 4)) eq_(go(1, 2), (1, 1, 2)) time.sleep(.3) eq_(go(1, 2), (1, 1, 2)) time.sleep(.3) eq_(go(1, 2), (3, 1, 2)) def test_decorator_expire_callable(self): go = self._fixture(expiration_time=lambda: .5) eq_(go(1, 2), (1, 1, 2)) eq_(go(3, 4), (2, 3, 4)) eq_(go(1, 2), (1, 1, 2)) time.sleep(.3) eq_(go(1, 2), (1, 1, 2)) time.sleep(.3) eq_(go(1, 2), (3, 1, 2)) def test_decorator_expire_callable_zero(self): go = self._fixture(expiration_time=lambda: 0) eq_(go(1, 2), (1, 1, 2)) eq_(go(1, 2), (2, 1, 2)) eq_(go(1, 2), (3, 1, 2)) def test_explicit_expire(self): go = self._fixture(expiration_time=1) eq_(go(1, 2), (1, 1, 2)) eq_(go(3, 4), (2, 3, 4)) eq_(go(1, 2), (1, 1, 2)) go.invalidate(1, 2) eq_(go(1, 2), (3, 1, 2)) def test_explicit_set(self): go = self._fixture(expiration_time=1) eq_(go(1, 2), (1, 1, 2)) go.set(5, 1, 2) eq_(go(3, 4), (2, 3, 4)) eq_(go(1, 2), 5) go.invalidate(1, 2) eq_(go(1, 2), (3, 1, 2)) go.set(0, 1, 3) eq_(go(1, 3), 0) def test_explicit_set_multi(self): go = self._multi_fixture(expiration_time=1) eq_(go(1, 2), ['1 1', '1 2']) eq_(go(1, 2), ['1 1', '1 2']) go.set({1: '1 5', 2: '1 6'}) eq_(go(1, 2), ['1 5', '1 6']) def test_explicit_refresh(self): go = self._fixture(expiration_time=1) eq_(go(1, 2), (1, 1, 2)) eq_(go.refresh(1, 2), (2, 1, 2)) eq_(go(1, 2), (2, 1, 2)) eq_(go(1, 2), (2, 1, 2)) eq_(go.refresh(1, 2), (3, 1, 2)) eq_(go(1, 2), (3, 1, 2)) def test_explicit_refresh_multi(self): go = self._multi_fixture(expiration_time=1) eq_(go(1, 2), ['1 1', '1 2']) eq_(go(1, 2), ['1 1', '1 2']) eq_(go.refresh(1, 2), ['2 1', '2 2']) eq_(go(1, 2), ['2 1', '2 2']) eq_(go(1, 2), ['2 1', '2 2']) class KeyGenerationTest(TestCase): def _keygen_decorator(self, namespace=None, **kw): canary = [] def 
decorate(fn): canary.append(util.function_key_generator(namespace, fn, **kw)) return fn return decorate, canary def _multi_keygen_decorator(self, namespace=None, **kw): canary = [] def decorate(fn): canary.append(util.function_multi_key_generator(namespace, fn, **kw)) return fn return decorate, canary def test_keygen_fn(self): decorate, canary = self._keygen_decorator() @decorate def one(a, b): pass gen = canary[0] eq_(gen(1, 2), "tests.cache.test_decorator:one|1 2") eq_(gen(None, 5), "tests.cache.test_decorator:one|None 5") def test_multi_keygen_fn(self): decorate, canary = self._multi_keygen_decorator() @decorate def one(a, b): pass gen = canary[0] eq_(gen(1, 2), [ "tests.cache.test_decorator:one|1", "tests.cache.test_decorator:one|2" ]) def test_keygen_fn_namespace(self): decorate, canary = self._keygen_decorator("mynamespace") @decorate def one(a, b): pass gen = canary[0] eq_(gen(1, 2), "tests.cache.test_decorator:one|mynamespace|1 2") eq_(gen(None, 5), "tests.cache.test_decorator:one|mynamespace|None 5") def test_key_isnt_unicode_bydefault(self): decorate, canary = self._keygen_decorator("mynamespace") @decorate def one(a, b): pass gen = canary[0] assert isinstance(gen('foo'), str) def test_unicode_key(self): decorate, canary = self._keygen_decorator("mynamespace", to_str=compat.text_type) @decorate def one(a, b): pass gen = canary[0] eq_(gen(compat.u('méil'), compat.u('drôle')), compat.ue("tests.cache.test_decorator:" "one|mynamespace|m\xe9il dr\xf4le")) def test_unicode_key_multi(self): decorate, canary = self._multi_keygen_decorator("mynamespace", to_str=compat.text_type) @decorate def one(a, b): pass gen = canary[0] eq_(gen(compat.u('méil'), compat.u('drôle')), [ compat.ue('tests.cache.test_decorator:one|mynamespace|m\xe9il'), compat.ue('tests.cache.test_decorator:one|mynamespace|dr\xf4le') ]) @requires_py3k def test_unicode_key_by_default(self): decorate, canary = self._keygen_decorator("mynamespace", to_str=compat.text_type) @decorate def one(a, b): pass 
gen = canary[0] assert isinstance(gen('méil'), str) eq_(gen('méil', 'drôle'), "tests.cache.test_decorator:" "one|mynamespace|m\xe9il dr\xf4le") class CacheDecoratorTest(_GenericBackendFixture, TestCase): backend = "mock" def test_cache_arg(self): reg = self._region() counter = itertools.count(1) @reg.cache_on_arguments() def generate(x, y): return next(counter) + x + y eq_(generate(1, 2), 4) eq_(generate(2, 1), 5) eq_(generate(1, 2), 4) generate.invalidate(1, 2) eq_(generate(1, 2), 6) def test_reentrant_call(self): reg = self._region(backend="dogpile.cache.memory") counter = itertools.count(1) # if these two classes get the same namespace, # you get a reentrant deadlock. class Foo(object): @classmethod @reg.cache_on_arguments(namespace="foo") def generate(cls, x, y): return next(counter) + x + y class Bar(object): @classmethod @reg.cache_on_arguments(namespace="bar") def generate(cls, x, y): return Foo.generate(x, y) eq_(Bar.generate(1, 2), 4) def test_multi(self): reg = self._region() counter = itertools.count(1) @reg.cache_multi_on_arguments() def generate(*args): return ["%d %d" % (arg, next(counter)) for arg in args] eq_(generate(2, 8, 10), ['2 2', '8 3', '10 1']) eq_(generate(2, 9, 10), ['2 2', '9 4', '10 1']) generate.invalidate(2) eq_(generate(2, 7, 10), ['2 5', '7 6', '10 1']) generate.set({7: 18, 10: 15}) eq_(generate(2, 7, 10), ['2 5', 18, 15]) def test_multi_asdict(self): reg = self._region() counter = itertools.count(1) @reg.cache_multi_on_arguments(asdict=True) def generate(*args): return dict( [(arg, "%d %d" % (arg, next(counter))) for arg in args] ) eq_(generate(2, 8, 10), {2: '2 2', 8: '8 3', 10: '10 1'}) eq_(generate(2, 9, 10), {2: '2 2', 9: '9 4', 10: '10 1'}) generate.invalidate(2) eq_(generate(2, 7, 10), {2: '2 5', 7: '7 6', 10: '10 1'}) generate.set({7: 18, 10: 15}) eq_(generate(2, 7, 10), {2: '2 5', 7: 18, 10: 15}) eq_( generate.refresh(2, 7), {2: '2 7', 7: '7 8'} ) eq_(generate(2, 7, 10), {2: '2 7', 10: 15, 7: '7 8'}) def 
test_multi_asdict_keys_missing(self): reg = self._region() counter = itertools.count(1) @reg.cache_multi_on_arguments(asdict=True) def generate(*args): return dict( [(arg, "%d %d" % (arg, next(counter))) for arg in args if arg != 10] ) eq_(generate(2, 8, 10), {2: '2 1', 8: '8 2'}) eq_(generate(2, 9, 10), {2: '2 1', 9: '9 3'}) assert reg.get(10) is NO_VALUE generate.invalidate(2) eq_(generate(2, 7, 10), {2: '2 4', 7: '7 5'}) generate.set({7: 18, 10: 15}) eq_(generate(2, 7, 10), {2: '2 4', 7: 18, 10: 15}) def test_multi_asdict_keys_missing_existing_cache_fn(self): reg = self._region() counter = itertools.count(1) @reg.cache_multi_on_arguments(asdict=True, should_cache_fn=lambda v: not v.startswith('8 ')) def generate(*args): return dict( [(arg, "%d %d" % (arg, next(counter))) for arg in args if arg != 10] ) eq_(generate(2, 8, 10), {2: '2 1', 8: '8 2'}) eq_(generate(2, 8, 10), {2: '2 1', 8: '8 3'}) eq_(generate(2, 8, 10), {2: '2 1', 8: '8 4'}) eq_(generate(2, 9, 10), {2: '2 1', 9: '9 5'}) assert reg.get(10) is NO_VALUE generate.invalidate(2) eq_(generate(2, 7, 10), {2: '2 6', 7: '7 7'}) generate.set({7: 18, 10: 15}) eq_(generate(2, 7, 10), {2: '2 6', 7: 18, 10: 15}) def test_multi_namespace(self): reg = self._region() counter = itertools.count(1) @reg.cache_multi_on_arguments(namespace="foo") def generate(*args): return ["%d %d" % (arg, next(counter)) for arg in args] eq_(generate(2, 8, 10), ['2 2', '8 3', '10 1']) eq_(generate(2, 9, 10), ['2 2', '9 4', '10 1']) eq_( sorted(list(reg.backend._cache)), [ 'tests.cache.test_decorator:generate|foo|10', 'tests.cache.test_decorator:generate|foo|2', 'tests.cache.test_decorator:generate|foo|8', 'tests.cache.test_decorator:generate|foo|9'] ) generate.invalidate(2) eq_(generate(2, 7, 10), ['2 5', '7 6', '10 1']) generate.set({7: 18, 10: 15}) eq_(generate(2, 7, 10), ['2 5', 18, 15]) dogpile.cache-0.5.1/tests/cache/test_memcached_backend.py0000644000076500000240000001433012225642516024147 0ustar classicstaff00000000000000from 
._fixtures import _GenericBackendTest, _GenericMutexTest from . import eq_ from unittest import TestCase from threading import Thread import time from nose import SkipTest class _TestMemcachedConn(object): @classmethod def _check_backend_available(cls, backend): try: client = backend._create_client() client.set("x", "y") assert client.get("x") == "y" except: raise SkipTest( "memcached is not running or " "otherwise not functioning correctly") class _NonDistributedMemcachedTest(_TestMemcachedConn, _GenericBackendTest): region_args = { "key_mangler":lambda x: x.replace(" ", "_") } config_args = { "arguments":{ "url":"127.0.0.1:11211" } } class _DistributedMemcachedTest(_TestMemcachedConn, _GenericBackendTest): region_args = { "key_mangler":lambda x: x.replace(" ", "_") } config_args = { "arguments":{ "url":"127.0.0.1:11211", "distributed_lock":True } } class _DistributedMemcachedMutexTest(_TestMemcachedConn, _GenericMutexTest): config_args = { "arguments":{ "url":"127.0.0.1:11211", "distributed_lock":True } } class PylibmcTest(_NonDistributedMemcachedTest): backend = "dogpile.cache.pylibmc" class PylibmcDistributedTest(_DistributedMemcachedTest): backend = "dogpile.cache.pylibmc" class PylibmcDistributedMutexTest(_DistributedMemcachedMutexTest): backend = "dogpile.cache.pylibmc" class BMemcachedTest(_NonDistributedMemcachedTest): backend = "dogpile.cache.bmemcached" class BMemcachedDistributedTest(_DistributedMemcachedTest): backend = "dogpile.cache.bmemcached" class BMemcachedDistributedMutexTest(_DistributedMemcachedMutexTest): backend = "dogpile.cache.bmemcached" class MemcachedTest(_NonDistributedMemcachedTest): backend = "dogpile.cache.memcached" class MemcachedDistributedTest(_DistributedMemcachedTest): backend = "dogpile.cache.memcached" class MemcachedDistributedMutexTest(_DistributedMemcachedMutexTest): backend = "dogpile.cache.memcached" from dogpile.cache.backends.memcached import GenericMemcachedBackend from dogpile.cache.backends.memcached import 
PylibmcBackend from dogpile.cache.backends.memcached import MemcachedBackend class MockGenericMemcachedBackend(GenericMemcachedBackend): def _imports(self): pass def _create_client(self): return MockClient(self.url) class MockMemcacheBackend(MemcachedBackend): def _imports(self): pass def _create_client(self): return MockClient(self.url) class MockPylibmcBackend(PylibmcBackend): def _imports(self): pass def _create_client(self): return MockClient(self.url, binary=self.binary, behaviors=self.behaviors ) class MockClient(object): number_of_clients = 0 def __init__(self, *arg, **kw): self.arg = arg self.kw = kw self.canary = [] self._cache = {} MockClient.number_of_clients += 1 def get(self, key): return self._cache.get(key) def set(self, key, value, **kw): self.canary.append(kw) self._cache[key] = value def delete(self, key): self._cache.pop(key, None) def __del__(self): MockClient.number_of_clients -= 1 class PylibmcArgsTest(TestCase): def test_binary_flag(self): backend = MockPylibmcBackend(arguments={'url':'foo','binary':True}) eq_(backend._create_client().kw["binary"], True) def test_url_list(self): backend = MockPylibmcBackend(arguments={'url':["a", "b", "c"]}) eq_(backend._create_client().arg[0], ["a", "b", "c"]) def test_url_scalar(self): backend = MockPylibmcBackend(arguments={'url':"foo"}) eq_(backend._create_client().arg[0], ["foo"]) def test_behaviors(self): backend = MockPylibmcBackend(arguments={'url':"foo", "behaviors":{"q":"p"}}) eq_(backend._create_client().kw["behaviors"], {"q": "p"}) def test_set_time(self): backend = MockPylibmcBackend(arguments={'url':"foo", "memcached_expire_time":20}) backend.set("foo", "bar") eq_(backend._clients.memcached.canary, [{"time":20}]) def test_set_min_compress_len(self): backend = MockPylibmcBackend(arguments={'url':"foo", "min_compress_len":20}) backend.set("foo", "bar") eq_(backend._clients.memcached.canary, [{"min_compress_len":20}]) def test_no_set_args(self): backend = MockPylibmcBackend(arguments={'url':"foo"}) 
        backend.set("foo", "bar")
        eq_(backend._clients.memcached.canary, [{}])


class MemcachedArgstest(TestCase):
    def test_set_time(self):
        backend = MockMemcacheBackend(
            arguments={'url': "foo", "memcached_expire_time": 20})
        backend.set("foo", "bar")
        eq_(backend._clients.memcached.canary, [{"time": 20}])

    def test_set_min_compress_len(self):
        backend = MockMemcacheBackend(
            arguments={'url': "foo", "min_compress_len": 20})
        backend.set("foo", "bar")
        eq_(backend._clients.memcached.canary, [{"min_compress_len": 20}])


class LocalThreadTest(TestCase):
    def setUp(self):
        import gc
        gc.collect()
        eq_(MockClient.number_of_clients, 0)

    def test_client_cleanup_1(self):
        self._test_client_cleanup(1)

    def test_client_cleanup_3(self):
        self._test_client_cleanup(3)

    def test_client_cleanup_10(self):
        self._test_client_cleanup(10)

    def _test_client_cleanup(self, count):
        backend = MockGenericMemcachedBackend(arguments={'url': 'foo'})
        canary = []

        def f():
            backend._clients.memcached
            canary.append(MockClient.number_of_clients)
            time.sleep(.05)

        threads = [Thread(target=f) for i in range(count)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        eq_(canary, [i + 1 for i in range(count)])
        eq_(MockClient.number_of_clients, 0)

dogpile.cache-0.5.1/tests/cache/test_memory_backend.py

from ._fixtures import _GenericBackendTest


class MemoryBackendTest(_GenericBackendTest):
    backend = "dogpile.cache.memory"

dogpile.cache-0.5.1/tests/cache/test_redis_backend.py

from dogpile.cache.region import _backend_loader
from ._fixtures import _GenericBackendTest, _GenericMutexTest
from unittest import TestCase
from nose import SkipTest
from mock import patch


class _TestRedisConn(object):
    @classmethod
    def _check_backend_available(cls, backend):
        try:
            client = backend._create_client()
            client.set("x", "y")
            # on py3k it appears to return b"y"
            assert client.get("x").decode("ascii") == "y"
            client.delete("x")
        except:
            raise SkipTest(
                "redis is not running or "
                "otherwise not functioning correctly")


class RedisTest(_TestRedisConn, _GenericBackendTest):
    backend = 'dogpile.cache.redis'
    config_args = {
        "arguments": {
            'host': '127.0.0.1',
            'port': 6379,
            'db': 0,
        }
    }


class RedisDistributedMutexTest(_TestRedisConn, _GenericMutexTest):
    backend = 'dogpile.cache.redis'
    config_args = {
        "arguments": {
            'host': '127.0.0.1',
            'port': 6379,
            'db': 0,
            'distributed_lock': True,
        }
    }


@patch('redis.StrictRedis', autospec=True)
class RedisConnectionTest(TestCase):
    backend = 'dogpile.cache.redis'

    @classmethod
    def setup_class(cls):
        try:
            cls.backend_cls = _backend_loader.load(cls.backend)
            cls.backend_cls({})
        except ImportError:
            raise SkipTest("Backend %s not installed" % cls.backend)

    def _test_helper(self, mock_obj, expected_args, connection_args=None):
        if connection_args is None:
            # The redis backend pops items from the dict, so we copy
            connection_args = expected_args.copy()
        self.backend_cls(connection_args)
        mock_obj.assert_called_once_with(**expected_args)

    def test_connect_with_defaults(self, MockStrictRedis):
        # The defaults, used if keys are missing from the arguments dict.
        arguments = {
            'host': 'localhost',
            'password': None,
            'port': 6379,
            'db': 0,
        }
        self._test_helper(MockStrictRedis, arguments, {})

    def test_connect_with_basics(self, MockStrictRedis):
        arguments = {
            'host': '127.0.0.1',
            'password': None,
            'port': 6379,
            'db': 0,
        }
        self._test_helper(MockStrictRedis, arguments)

    def test_connect_with_password(self, MockStrictRedis):
        arguments = {
            'host': '127.0.0.1',
            'password': 'some password',
            'port': 6379,
            'db': 0,
        }
        self._test_helper(MockStrictRedis, arguments)

    def test_connect_with_url(self, MockStrictRedis):
        arguments = {
            'url': 'redis://redis:password@127.0.0.1:6379/0'
        }
        self._test_helper(MockStrictRedis.from_url, arguments)

dogpile.cache-0.5.1/tests/cache/test_region.py

import pprint
from unittest import TestCase
from dogpile.cache.api import CacheBackend, CachedValue, NO_VALUE
from dogpile.cache import exception
from dogpile.cache import make_region, register_backend, CacheRegion, util
from dogpile.cache.proxy import ProxyBackend
from . import eq_, is_, assert_raises_message, io, configparser
import time, datetime
import itertools
from collections import defaultdict
import operator
from ._fixtures import MockBackend


def key_mangler(key):
    return "HI!" + key


class RegionTest(TestCase):
    def _region(self, init_args={}, config_args={}, backend="mock"):
        reg = CacheRegion(**init_args)
        reg.configure(backend, **config_args)
        return reg

    def test_set_name(self):
        my_region = make_region(name='my-name')
        eq_(my_region.name, 'my-name')

    def test_instance_from_dict(self):
        my_conf = {
            'cache.example.backend': 'mock',
            'cache.example.expiration_time': 600,
            'cache.example.arguments.url': '127.0.0.1'
        }
        my_region = make_region()
        my_region.configure_from_config(my_conf, 'cache.example.')
        eq_(my_region.expiration_time, 600)
        assert isinstance(my_region.backend, MockBackend) is True
        eq_(my_region.backend.arguments, {'url': '127.0.0.1'})

    def test_instance_from_config_string(self):
        my_conf = \
            '[xyz]\n'\
            'cache.example.backend=mock\n'\
            'cache.example.expiration_time=600\n'\
            'cache.example.arguments.url=127.0.0.1\n'\
            'cache.example.arguments.dogpile_lockfile=false\n'\
            'cache.example.arguments.xyz=None\n'

        my_region = make_region()
        config = configparser.ConfigParser()
        config.readfp(io.StringIO(my_conf))
        my_region.configure_from_config(
            dict(config.items('xyz')), 'cache.example.')
        eq_(my_region.expiration_time, 600)
        assert isinstance(my_region.backend, MockBackend) is True
        eq_(my_region.backend.arguments, {
            'url': '127.0.0.1',
            'dogpile_lockfile': False,
            'xyz': None})

    def test_datetime_expiration_time(self):
        my_region = make_region()
        my_region.configure(
            backend='mock',
            expiration_time=datetime.timedelta(days=1, hours=8)
        )
        eq_(my_region.expiration_time, 32 * 60 * 60)

    def test_reject_invalid_expiration_time(self):
        my_region = make_region()
        assert_raises_message(
            exception.ValidationError,
            "expiration_time is not a number or timedelta.",
            my_region.configure, 'mock', 'one hour'
        )

    def test_key_mangler_argument(self):
        reg = self._region(init_args={"key_mangler": key_mangler})
        assert reg.key_mangler is key_mangler

        reg = self._region()
        assert reg.key_mangler is None

        MockBackend.key_mangler = km = lambda self, k: "foo"
        reg = self._region()
        eq_(reg.key_mangler("bar"), "foo")
        MockBackend.key_mangler = None

    def test_key_mangler_impl(self):
        reg = self._region(init_args={"key_mangler": key_mangler})
        reg.set("some key", "some value")
        eq_(list(reg.backend._cache), ["HI!some key"])
        eq_(reg.get("some key"), "some value")
        eq_(reg.get_or_create("some key", lambda: "some new value"),
            "some value")
        reg.delete("some key")
        eq_(reg.get("some key"), NO_VALUE)

    def test_dupe_config(self):
        reg = CacheRegion()
        reg.configure("mock")
        assert_raises_message(
            exception.RegionAlreadyConfigured,
            "This region is already configured",
            reg.configure, "mock"
        )
        eq_(reg.is_configured, True)

    def test_no_config(self):
        reg = CacheRegion()
        assert_raises_message(
            exception.RegionNotConfigured,
            "No backend is configured on this region.",
            getattr, reg, "backend"
        )
        eq_(reg.is_configured, False)

    def test_set_get_value(self):
        reg = self._region()
        reg.set("some key", "some value")
        eq_(reg.get("some key"), "some value")

    def test_set_get_nothing(self):
        reg = self._region()
        eq_(reg.get("some key"), NO_VALUE)
        eq_(reg.get("some key", expiration_time=10), NO_VALUE)
        reg.invalidate()
        eq_(reg.get("some key"), NO_VALUE)

    def test_creator(self):
        reg = self._region()

        def creator():
            return "some value"
        eq_(reg.get_or_create("some key", creator), "some value")

    def test_multi_creator(self):
        reg = self._region()

        def creator(*keys):
            return ["some value %s" % key for key in keys]
        eq_(reg.get_or_create_multi(["k3", "k2", "k5"], creator),
            ['some value k3', 'some value k2', 'some value k5'])

    def test_remove(self):
        reg = self._region()
        reg.set("some key", "some value")
        reg.delete("some key")
        reg.delete("some key")
        eq_(reg.get("some key"), NO_VALUE)

    def test_expire(self):
        reg = self._region(config_args={"expiration_time": 1})
        counter = itertools.count(1)

        def creator():
            return "some value %d" % next(counter)
        eq_(reg.get_or_create("some key", creator), "some value 1")
        time.sleep(2)
        is_(reg.get("some key"), NO_VALUE)
        eq_(reg.get("some key", ignore_expiration=True), "some value 1")
        eq_(reg.get_or_create("some key", creator), "some value 2")
        eq_(reg.get("some key"), "some value 2")

    def test_expire_multi(self):
        reg = self._region(config_args={"expiration_time": 1})
        counter = itertools.count(1)

        def creator(*keys):
            return ["some value %s %d" % (key, next(counter))
                    for key in keys]

        eq_(reg.get_or_create_multi(["k3", "k2", "k5"], creator),
            ['some value k3 2', 'some value k2 1', 'some value k5 3'])
        time.sleep(2)
        is_(reg.get("k2"), NO_VALUE)
        eq_(reg.get("k2", ignore_expiration=True), "some value k2 1")
        eq_(reg.get_or_create_multi(["k3", "k2"], creator),
            ['some value k3 5', 'some value k2 4'])
        eq_(reg.get("k2"), "some value k2 4")

    def test_expire_on_get(self):
        reg = self._region(config_args={"expiration_time": .5})
        reg.set("some key", "some value")
        eq_(reg.get("some key"), "some value")
        time.sleep(1)
        is_(reg.get("some key"), NO_VALUE)

    def test_ignore_expire_on_get(self):
        reg = self._region(config_args={"expiration_time": .5})
        reg.set("some key", "some value")
        eq_(reg.get("some key"), "some value")
        time.sleep(1)
        eq_(reg.get("some key", ignore_expiration=True), "some value")

    def test_override_expire_on_get(self):
        reg = self._region(config_args={"expiration_time": .5})
        reg.set("some key", "some value")
        eq_(reg.get("some key"), "some value")
        time.sleep(1)
        eq_(reg.get("some key", expiration_time=5), "some value")
        is_(reg.get("some key"), NO_VALUE)

    def test_expire_override(self):
        reg = self._region(config_args={"expiration_time": 5})
        counter = itertools.count(1)

        def creator():
            return "some value %d" % next(counter)
        eq_(reg.get_or_create("some key", creator, expiration_time=1),
            "some value 1")
        time.sleep(2)
        eq_(reg.get("some key"), "some value 1")
        eq_(reg.get_or_create("some key", creator, expiration_time=1),
            "some value 2")
        eq_(reg.get("some key"), "some value 2")

    def test_hard_invalidate_get(self):
        reg = self._region()
        reg.set("some key", "some value")
        reg.invalidate()
        is_(reg.get("some key"), NO_VALUE)

    def test_hard_invalidate_get_or_create(self):
        reg = self._region()
        counter = itertools.count(1)

        def creator():
            return "some value %d" % next(counter)
        eq_(reg.get_or_create("some key", creator), "some value 1")
        reg.invalidate()
        eq_(reg.get_or_create("some key", creator), "some value 2")

    def test_soft_invalidate_get(self):
        reg = self._region(config_args={"expiration_time": 1})
        reg.set("some key", "some value")
        reg.invalidate(hard=False)
        is_(reg.get("some key"), NO_VALUE)

    def test_soft_invalidate_get_or_create(self):
        reg = self._region(config_args={"expiration_time": 1})
        counter = itertools.count(1)

        def creator():
            return "some value %d" % next(counter)
        eq_(reg.get_or_create("some key", creator), "some value 1")
        reg.invalidate(hard=False)
        eq_(reg.get_or_create("some key", creator), "some value 2")

    def test_soft_invalidate_get_or_create_multi(self):
        reg = self._region(config_args={"expiration_time": 5})
        values = [1, 2, 3]

        def creator(*keys):
            v = values.pop(0)
            return [v for k in keys]
        ret = reg.get_or_create_multi(
            [1, 2], creator)
        eq_(ret, [1, 1])
        reg.invalidate(hard=False)
        ret = reg.get_or_create_multi(
            [1, 2], creator)
        eq_(ret, [2, 2])

    def test_soft_invalidate_requires_expire_time_get(self):
        reg = self._region()
        reg.invalidate(hard=False)
        assert_raises_message(
            exception.DogpileCacheException,
            "Non-None expiration time required for soft invalidation",
            reg.get_or_create, "some key", lambda: "x"
        )

    def test_soft_invalidate_requires_expire_time_get_multi(self):
        reg = self._region()
        reg.invalidate(hard=False)
        assert_raises_message(
            exception.DogpileCacheException,
            "Non-None expiration time required for soft invalidation",
            reg.get_or_create_multi, ["k1", "k2"], lambda k: "x"
        )

    def test_should_cache_fn(self):
        reg = self._region()
        values = [1, 2, 3]

        def creator():
            return values.pop(0)
        should_cache_fn = lambda val: val in (1, 3)

        ret = reg.get_or_create(
            "some key", creator,
            should_cache_fn=should_cache_fn)
        eq_(ret, 1)
        eq_(reg.backend._cache['some key'][0], 1)

        reg.invalidate()
        ret = reg.get_or_create(
            "some key", creator,
            should_cache_fn=should_cache_fn)
        eq_(ret, 2)
        eq_(reg.backend._cache['some key'][0], 1)

        reg.invalidate()
        ret = reg.get_or_create(
            "some key", creator,
            should_cache_fn=should_cache_fn)
        eq_(ret, 3)
        eq_(reg.backend._cache['some key'][0], 3)

    def test_should_cache_fn_multi(self):
        reg = self._region()
        values = [1, 2, 3]

        def creator(*keys):
            v = values.pop(0)
            return [v for k in keys]
        should_cache_fn = lambda val: val in (1, 3)

        ret = reg.get_or_create_multi(
            [1, 2], creator,
            should_cache_fn=should_cache_fn)
        eq_(ret, [1, 1])
        eq_(reg.backend._cache[1][0], 1)

        reg.invalidate()
        ret = reg.get_or_create_multi(
            [1, 2], creator,
            should_cache_fn=should_cache_fn)
        eq_(ret, [2, 2])
        eq_(reg.backend._cache[1][0], 1)

        reg.invalidate()
        ret = reg.get_or_create_multi(
            [1, 2], creator,
            should_cache_fn=should_cache_fn)
        eq_(ret, [3, 3])
        eq_(reg.backend._cache[1][0], 3)

    def test_should_set_multiple_values(self):
        reg = self._region()
        values = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
        reg.set_multi(values)
        eq_(values['key1'], reg.get('key1'))
        eq_(values['key2'], reg.get('key2'))
        eq_(values['key3'], reg.get('key3'))

    def test_should_get_multiple_values(self):
        reg = self._region()
        values = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
        reg.set_multi(values)
        reg_values = reg.get_multi(['key1', 'key2', 'key3'])
        eq_(reg_values, ["value1", "value2", "value3"])

    def test_should_delete_multiple_values(self):
        reg = self._region()
        values = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
        reg.set_multi(values)
        reg.delete_multi(['key2', 'key1000'])
        eq_(values['key1'], reg.get('key1'))
        eq_(NO_VALUE, reg.get('key2'))
        eq_(values['key3'], reg.get('key3'))


class ProxyRegionTest(RegionTest):
    '''This is exactly the same as the region test above, but it goes
    through a dummy proxy.  The purpose of this is to make sure the tests
    still run successfully even when there is a proxy.
    '''

    class MockProxy(ProxyBackend):
        @property
        def _cache(self):
            return self.proxied._cache

    def _region(self, init_args={}, config_args={}, backend="mock"):
        reg = CacheRegion(**init_args)
        config_args['wrap'] = [ProxyRegionTest.MockProxy]
        reg.configure(backend, **config_args)
        return reg


class ProxyBackendTest(TestCase):

    class GetCounterProxy(ProxyBackend):
        counter = 0

        def get(self, key):
            ProxyBackendTest.GetCounterProxy.counter += 1
            return self.proxied.get(key)

    class SetCounterProxy(ProxyBackend):
        counter = 0

        def set(self, key, value):
            ProxyBackendTest.SetCounterProxy.counter += 1
            return self.proxied.set(key, value)

    class UsedKeysProxy(ProxyBackend):
        '''Keep a counter of how often we set a particular key.'''

        def __init__(self, *args, **kwargs):
            super(ProxyBackendTest.UsedKeysProxy, self).__init__(
                *args, **kwargs)
            self._key_count = defaultdict(lambda: 0)

        def setcount(self, key):
            return self._key_count[key]

        def set(self, key, value):
            self._key_count[key] += 1
            self.proxied.set(key, value)

    class NeverSetProxy(ProxyBackend):
        '''A totally contrived example of a Proxy that we pass arguments
        to.  Never set a key that matches never_set.
        '''

        def __init__(self, never_set, *args, **kwargs):
            super(ProxyBackendTest.NeverSetProxy, self).__init__(
                *args, **kwargs)
            self.never_set = never_set
            self._key_count = defaultdict(lambda: 0)

        def set(self, key, value):
            if key != self.never_set:
                self.proxied.set(key, value)

    def _region(self, init_args={}, config_args={}, backend="mock"):
        reg = CacheRegion(**init_args)
        reg.configure(backend, **config_args)
        return reg

    def test_counter_proxies(self):
        # Count up the gets and sets and make sure they are passed through
        # to the backend properly.  Test that methods not overridden
        # continue to work.
        reg = self._region(config_args={"wrap": [
            ProxyBackendTest.GetCounterProxy,
            ProxyBackendTest.SetCounterProxy]})
        ProxyBackendTest.GetCounterProxy.counter = 0
        ProxyBackendTest.SetCounterProxy.counter = 0

        # set a range of values in the cache
        for i in range(10):
            reg.set(i, i)
        eq_(ProxyBackendTest.GetCounterProxy.counter, 0)
        eq_(ProxyBackendTest.SetCounterProxy.counter, 10)

        # check that the range of values is still there
        for i in range(10):
            v = reg.get(i)
            eq_(v, i)
        eq_(ProxyBackendTest.GetCounterProxy.counter, 10)
        eq_(ProxyBackendTest.SetCounterProxy.counter, 10)

        # make sure the delete function (not overridden) still
        # executes properly
        for i in range(10):
            reg.delete(i)
            v = reg.get(i)
            is_(v, NO_VALUE)

    def test_instance_proxies(self):
        # Test that we can create an instance of a new proxy and
        # pass that to make_region instead of the class.  The two
        # instances should not interfere with each other.
        proxy_num = ProxyBackendTest.UsedKeysProxy(5)
        proxy_abc = ProxyBackendTest.UsedKeysProxy(5)
        reg_num = self._region(config_args={"wrap": [proxy_num]})
        reg_abc = self._region(config_args={"wrap": [proxy_abc]})
        for i in range(10):
            reg_num.set(i, True)
            reg_abc.set(chr(ord('a') + i), True)

        for i in range(5):
            reg_num.set(i, True)
            reg_abc.set(chr(ord('a') + i), True)

        # make sure proxy_num has the right counts per key
        eq_(proxy_num.setcount(1), 2)
        eq_(proxy_num.setcount(9), 1)
        eq_(proxy_num.setcount('a'), 0)

        # make sure proxy_abc has the right counts per key
        eq_(proxy_abc.setcount('a'), 2)
        eq_(proxy_abc.setcount('g'), 1)
        eq_(proxy_abc.setcount('9'), 0)

    def test_argument_proxies(self):
        # Test that we can pass an argument to the proxy on creation
        proxy = ProxyBackendTest.NeverSetProxy(5)
        reg = self._region(config_args={"wrap": [proxy]})
        for i in range(10):
            reg.set(i, True)

        # make sure 1 was set, but 5 was not
        eq_(reg.get(5), NO_VALUE)
        eq_(reg.get(1), True)
dogpile.cache-0.5.1/tests/cache/test_utils.py

from unittest import TestCase
from dogpile.cache import util


class UtilsTest(TestCase):
    """Test the relevant utils functionality."""

    def test_coerce_string_conf(self):
        settings = {'expiration_time': '-1'}
        coerced = util.coerce_string_conf(settings)
        self.assertEqual(coerced['expiration_time'], -1)

        settings = {'expiration_time': '+1'}
        coerced = util.coerce_string_conf(settings)
        self.assertEqual(coerced['expiration_time'], 1)