multipletau-0.1.5/000077500000000000000000000000001263310464700140705ustar00rootroot00000000000000multipletau-0.1.5/.gitignore000066400000000000000000000006141263310464700160610ustar00rootroot00000000000000*.py[cod] # C extensions *.so # Packages *.egg *.egg-info dist build eggs parts bin var sdist develop-eggs .installed.cfg lib lib64 # Installer logs pip-log.txt # Unit test / coverage reports .coverage .tox nosetests.xml # Translations *.mo # Mr Developer .mr.developer.cfg .project .pydevproject .settings* # doc build doc/_build/** # other stuff *.yml~ *.py~ *.md~ *.in~ *.txt~ multipletau-0.1.5/.travis.yml000066400000000000000000000014741263310464700162070ustar00rootroot00000000000000language: python python: - '2.7' - '3.4' - '3.5' env: global: - GH_REF: github.com/FCS-analysis/multipletau.git - secure: IVoAJNYKGjWbHUGPpe8oOTLhltGrhu0F+xCaRVGs1tTut34BixSSeDgranlRiXZ0wlVOzBGrDHLkoLxFSRRy43BN4TSiv05WLBZba7ypGYBbDrLrG5nFPnT6n9d4ZgFHHHwyvI2ymdSs6/EJwZRXmr2Ehm0HzetA27FB1/Q3kc0= notifications: email: false install: - travis_retry pip install coverage - travis_retry pip install coveralls - pip freeze script: - coverage run --source=multipletau tests/runtests.py - coverage report -m after_success: - coveralls --verbose - git config credential.helper "store --file=.git/credentials" - echo "https://${GH_TOKEN}:@github.com" > .git/credentials - if [[ $TRAVIS_PYTHON_VERSION == 3.4 ]]; then pip install numpydoc sphinx; fi - if [[ $TRAVIS_PYTHON_VERSION == 3.4 ]]; then python doc/deploy_ghpages.py; fi multipletau-0.1.5/CHANGELOG.md000066400000000000000000000005331263310464700157020ustar00rootroot000000000000000.1.5 - update documentation - officially support Python 3 0.1.4 - integer and boolean input types are now automatically converted to floats - `multipletau.correlate` now works with complex data types - `multipletau.correlate` now checks whether the two input arrays are the same object - documentation now contains examples 0.1.3 - first non-cython implementation 
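The 0.1.4 entry in the CHANGELOG above notes that `multipletau.correlate` gained support for complex data types. For readers unfamiliar with what that entails: a discrete cross-correlation of complex signals conjugates one argument, so that the zero-lag autocorrelation comes out real and non-negative. A minimal NumPy-only illustration of this convention (the helper name `xcorr_at_lag` is ours, for illustration only — it is not part of the package):

```python
import numpy as np

def xcorr_at_lag(a, v, k):
    """Cross-correlation sum_t conj(a[t]) * v[t + k] at one non-negative lag k."""
    a = np.asarray(a)
    v = np.asarray(v)
    n = min(len(a), len(v)) - k
    # Conjugating the first argument makes the zero-lag autocorrelation
    # equal sum(|a|^2), a real, non-negative number.
    return np.sum(np.conj(a[:n]) * v[k:k + n])

z = np.array([1 + 1j, 2 - 1j, 0.5j])
g0 = xcorr_at_lag(z, z, 0)  # sum of |z|^2 = 2 + 5 + 0.25 = 7.25
```

Without the conjugate, `sum(a[t] * a[t])` at zero lag would in general be complex, which is why complex support required more than simply accepting a complex dtype.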
multipletau-0.1.5/LICENSE000066400000000000000000000027071263310464700151030ustar00rootroot00000000000000Copyright (c) 2014 Paul Müller Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of multipletau nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL INFRAE OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
multipletau-0.1.5/MANIFEST.in000066400000000000000000000002641263310464700156300ustar00rootroot00000000000000include CHANGELOG.md include README.txt include tests/*.py include tests/*.md include examples/*.py include doc/*.py include doc/*.rst include doc/*.md include doc/extensions/*.py multipletau-0.1.5/README.md000066400000000000000000000031031263310464700153440ustar00rootroot00000000000000multipletau =========== [![PyPI](http://img.shields.io/pypi/v/multipletau.svg)](https://pypi.python.org/pypi/multipletau) [![Travis](http://img.shields.io/travis/FCS-analysis/multipletau.svg)](https://travis-ci.org/FCS-analysis/multipletau) [![Coveralls](https://img.shields.io/coveralls/FCS-analysis/multipletau.svg)](https://coveralls.io/r/FCS-analysis/multipletau) This repo contains a multiple-tau algorithm for Python - **multipletau** multiple-tau package, implemented using [numpy](http://www.numpy.org/) - **test** testing the algorithm - **doc** the source of the [documentation](http://FCS-analysis.github.io/multipletau/) Installation ------------ The package can be installed from the Python package index. pip install multipletau Usage ----- >>> import numpy as np >>> import multipletau >>> a = np.linspace(2,5,42) >>> v = np.linspace(1,6,42) >>> multipletau.correlate(a, v, m=2) array([[ 1. , 549.87804878], [ 2. , 530.37477692], [ 4. , 491.85812017], [ 8. , 386.39500297]]) Citing ------ The multipletau package should be cited like this (replace "x.x.x" with the actual version of multipletau that you used and "DD Month YYYY" with a matching date). Paul Müller (2012) _Python multiple-tau algorithm_ (Version x.x.x) [Computer program]. 
Available at https://pypi.python.org/pypi/multipletau/ (Accessed DD Month YYYY) You can find out what version you are using by typing (in a Python console): >>> import multipletau >>> multipletau.__version__ '0.1.5' multipletau-0.1.5/README.txt000066400000000000000000000006671263310464700155730ustar00rootroot00000000000000Multiple-tau correlation is computed on a logarithmic scale (fewer data points are computed) and is thus much faster than conventional correlation on a linear scale such as `numpy.correlate`. Reference ========= The code is fully documented. An online reference is available at http://FCS-analysis.github.io/multipletau/. Installation ============ The package can be installed from the Python package index. pip install multipletau multipletau-0.1.5/doc/000077500000000000000000000000001263310464700146355ustar00rootroot00000000000000multipletau-0.1.5/doc/README.md000066400000000000000000000005111263310464700161110ustar00rootroot00000000000000multipletau documentation ========================= Install [numpydoc](https://pypi.python.org/pypi/numpydoc): pip install numpydoc To compile the documentation, run python setup.py build_sphinx To upload the documentation to gh-pages, run python setup.py commit_doc or python doc/deploy_ghpages.py multipletau-0.1.5/doc/conf.py000066400000000000000000000231241263310464700161360ustar00rootroot00000000000000# -*- coding: utf-8 -*- # # project documentation build configuration file, created by # sphinx-quickstart on Sat Feb 22 09:35:49 2014. # # This file is execfile()d with the current directory set to its # containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys import os # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here.
If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. #sys.path.insert(0, os.path.abspath('.')) sys.path.insert(0, os.path.abspath(os.path.join(os.path.abspath( os.path.dirname(__file__)), '../'))) sys.path.append(os.path.abspath('extensions')) # include examples sys.path.append(os.path.abspath(os.path.dirname(__file__)+"/../examples")) # There should be a file "setup.py" that has the property "version" from setup import author, authors, description, name, version, year projectname = name projectdescription = description # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. #extensions = [ # 'sphinx.ext.autodoc', # 'sphinx.ext.doctest', # 'sphinx.ext.coverage', # 'sphinx.ext.pngmath', # 'sphinx.ext.viewcode', #] extensions = [ # 'matplotlib.sphinxext.mathmpl', # 'matplotlib.sphinxext.only_directives', # 'matplotlib.sphinxext.plot_directive', # 'sphinx.ext.viewcode', # 'ipython_directive', 'sphinx.ext.intersphinx', 'sphinx.ext.autosummary', 'sphinx.ext.autodoc', # 'sphinx.ext.doctest', # 'ipython_console_highlighting', # 'sphinx.ext.pngmath', 'sphinx.ext.mathjax', # 'sphinx.ext.viewcode', # 'sphinx.ext.todo', # 'inheritance_diagram', 'numpydoc', 'myviewcode', # 'hidden_code_block', ] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. 
project = projectname copyright = year+", "+author # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. # The full version, including alpha/beta/rc tags. release = version # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. # The reST default role (used for this markup: `text`) to use for all # documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. #pygments_style = 'default' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # If true, keep warnings as "system message" paragraphs in the built documents. #keep_warnings = False # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'classic' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. 
html_theme_options = {"stickysidebar": True} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". #html_static_path = ['_static'] # Add any extra paths that contain custom files (such as robots.txt or # .htaccess) here, relative to this directory. These files are copied # directly to the root of the documentation. #html_extra_path = [] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. 
Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = projectname+'doc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ ('index', projectname+'.tex', projectname+' Documentation', author, 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', projectname, projectname+' Documentation', authors, 1) ] # If true, show URL addresses after external links. 
#man_show_urls = False # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', projectname, projectname+u' Documentation', author, projectname, projectdescription, 'Numeric'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. #texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' # If true, do not generate a @detailmenu in the "Top" node's menu. #texinfo_no_detailmenu = False # ----------------------------------------------------------------------------- # intersphinx # ----------------------------------------------------------------------------- _python_doc_base = 'http://docs.python.org/2.7' intersphinx_mapping = { _python_doc_base: None, 'http://docs.scipy.org/doc/numpy': None, 'http://docs.scipy.org/doc/scipy/reference': None, } multipletau-0.1.5/doc/deploy_ghpages.py000066400000000000000000000063071263310464700202070ustar00rootroot00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """ Publish the documentation on GitHub Pages. Prerequisites ------------- 1. Create empty gh-pages branch: git branch gh-pages git checkout gh-pages git symbolic-ref HEAD refs/heads/gh-pages rm .git/index git clean -fdx 2. Setup sphinx. python setup.py build_sphinx should create a build/sphinx/html folder in the repository root. 3. Create GitHub repo token and encrypt it gem install travis travis encrypt GH_TOKEN="" --add 4. 
Add the encrypted token to .travis.yml env: global: - GH_REF: github.com//.git - secure: "jdcn3kM/dI0zvVTn0UKgal8Br+745Qc1plaKXHcoKhwcwN+0Q1y5H1BnaF0KV2dWWeExVXMpqQMLOCylUSUmd30+hFqUgd3gFQ+oh9pF/+N72uzjnxHAyVjai5Lh7QnjN0SLCd2/xLYwaUIHjWbWsr5t2vK9UuyphZ6/F+7OHf+u8BErviE9HUunD7u4Q2XRaUF0oHuF8stoWbJgnQZtUZFr+qS1Gc3vF6/KBkMqjnq/DgBV61cWsnVUS1HVak/sGClPRXZMSGyz8d63zDxfA5NDO6AbPVgK02k+QV8KQCyIX7of8rBvBmWkBYGw5RnaeETLIAf6JrCKMiQzlJQZiMyLUvd/WflSIBKJyr5YmUKCjFkwvbKKvCU3WBUxFT2p7trKZip5JWg37OMvOAO8eiatf2FC1klNly1KHADU88QqNoi/0y2R/a+1Csrl8Gr/lXZkW4mMkI2due9epLwccDJtMF8Xc39EqRR46xA7Lx9vy7szYW5lLux3zwx1tH40wV6/dX4ZVFoWp/zfJw7TKdOHuOwjZuOuKp/shfJs94G9YCu7bBtvrGv9qCH2KiSgm1NJviwcsZWsVHaq1nP0LliDE7EM3Q0mnkYzlvfOOhA2G5Ka3rHl1RFj7+WYzO5GaAFWU7piP/kdBwc0Mu+hb6PMoy0oeLt39BDr29bNKMs=" 5. Add the deploy command to .travis.yml after_success: - git config credential.helper "store --file=.git/credentials" - echo "https://${GH_TOKEN}:@github.com" > .git/credentials - if [[ $TRAVIS_PYTHON_VERSION == 3.4 ]]; then pip install numpydoc sphinx; fi - if [[ $TRAVIS_PYTHON_VERSION == 3.4 ]]; then python doc/deploy_ghpages.py; fi """ from __future__ import print_function import os from os.path import dirname, abspath import subprocess as sp # go to root of repository os.chdir(dirname(dirname(abspath(__file__)))) # build sphinx sp.check_output("python setup.py build_sphinx", shell=True) # clone into new folder the gh-pages branch sp.check_output("git config --global user.email 'travis@example.com'", shell=True) sp.check_output("git config --global user.name 'Travis CI'", shell=True) sp.check_output("git config --global credential.helper 'store --file=.git/credentials'", shell=True) sp.check_output("echo 'https://${GH_TOKEN}:@github.com' > .git/credentials", shell=True) sp.check_output("git clone --depth 1 -b gh-pages https://${GH_TOKEN}@${GH_REF} gh_pages", shell=True) # copy everything from ./build/sphinx/html to ./gh_pages #sp.check_output("cp -r ./build/sphinx/html/* ./gh_pages/", shell=True) 
sp.check_output("rsync -rt --del --exclude='.git' --exclude='.nojekyll' ./build/sphinx/html/* ./gh_pages/", shell=True) # commit changes os.chdir("gh_pages") sp.check_output("echo 'https://${GH_TOKEN}:@github.com' > .git/credentials", shell=True) sp.check_output("git add --all ./*", shell=True) try: # If there is nothing to commit, then 'git commit' returns non-zero exit status errorcode = sp.check_output("git commit -a -m 'travis bot build {} [ci skip]'".format(os.getenv("TRAVIS_COMMIT")), shell=True) print("git commit returned:", errorcode) except: pass else: sp.check_output("git push --force --quiet origin gh-pages", shell=True) multipletau-0.1.5/doc/extensions/000077500000000000000000000000001263310464700170345ustar00rootroot00000000000000multipletau-0.1.5/doc/extensions/myviewcode.py000066400000000000000000000213211263310464700215600ustar00rootroot00000000000000""" sphinx.ext.viewcode ~~~~~~~~~~~~~~~~~~~ Add links to module code in Python object descriptions. :copyright: Copyright 2007-2015 by the Sphinx team, see AUTHORS. :license: BSD, see LICENSE for details. Edited by Paul Mueller to support imports from submodules. Uses the importlib library. Changes marked with "## EDIT". 2015-02-22 """ ## EDIT import importlib ## import traceback from six import iteritems, text_type from docutils import nodes import sphinx from sphinx import addnodes from sphinx.locale import _ from sphinx.pycode import ModuleAnalyzer from sphinx.util import get_full_modname from sphinx.util.nodes import make_refnode from sphinx.util.console import blue def _get_full_modname(app, modname, attribute): try: return get_full_modname(modname, attribute) except AttributeError: # sphinx.ext.viewcode can't follow class instance attribute # then AttributeError logging output only verbose mode. app.verbose('Didn\'t find %s in %s' % (attribute, modname)) return None except Exception as e: # sphinx.ext.viewcode follow python domain directives. 
        # because of that, if there are no real modules specified by
        # py:function or other directives, viewcode emits a lot of warnings.
        # They should be displayed only in verbose mode.
        app.verbose(traceback.format_exc().rstrip())
        app.verbose('viewcode can\'t import %s, failed with error "%s"' %
                    (modname, e))
        return None


def doctree_read(app, doctree):
    env = app.builder.env
    if not hasattr(env, '_viewcode_modules'):
        env._viewcode_modules = {}

    def has_tag(modname, fullname, docname, refname):
        entry = env._viewcode_modules.get(modname, None)
        try:
            analyzer = ModuleAnalyzer.for_module(modname)
        except Exception:
            env._viewcode_modules[modname] = False
            return
        if not isinstance(analyzer.code, text_type):
            code = analyzer.code.decode(analyzer.encoding)
        else:
            code = analyzer.code
        if entry is None or entry[0] != code:
            analyzer.find_tags()
            entry = code, analyzer.tags, {}, refname
            env._viewcode_modules[modname] = entry
        elif entry is False:
            return
        _, tags, used, _ = entry
        if fullname in tags:
            used[fullname] = docname
            return True

    for objnode in doctree.traverse(addnodes.desc):
        if objnode.get('domain') != 'py':
            continue
        names = set()
        for signode in objnode:
            if not isinstance(signode, addnodes.desc_signature):
                continue
            modname = signode.get('module')
            fullname = signode.get('fullname')
            refname = modname
            if env.config.viewcode_import:
                modname = _get_full_modname(app, modname, fullname)
            if not modname:
                continue
            fullname = signode.get('fullname')
            ## EDIT
            fullname, modname = find_modname(fullname, modname)
            ##
            if not has_tag(modname, fullname, env.docname, refname):
                continue
            if fullname in names:
                # only one link per name, please
                continue
            names.add(fullname)
            pagename = '_modules/' + modname.replace('.', '/')
            onlynode = addnodes.only(expr='html')
            onlynode += addnodes.pending_xref(
                '', reftype='viewcode', refdomain='std', refexplicit=False,
                reftarget=pagename, refid=fullname, refdoc=env.docname)
            onlynode[0] += nodes.inline('', _('[source]'),
                                        classes=['viewcode-link'])
            signode += onlynode


def env_merge_info(app, env, docnames, other):
    if not hasattr(other, '_viewcode_modules'):
        return
    # create a _viewcode_modules dict on the main environment
    if not hasattr(env, '_viewcode_modules'):
        env._viewcode_modules = {}
    # now merge in the information from the subprocess
    env._viewcode_modules.update(other._viewcode_modules)


## EDIT
def find_modname(fullname, modname):
    mod = importlib.import_module(modname)
    if hasattr(mod, fullname):
        func = getattr(mod, fullname)
        modname = func.__module__
        fullname = func.__name__
    return fullname, modname
##


def missing_reference(app, env, node, contnode):
    # resolve our "viewcode" reference nodes -- they need special treatment
    if node['reftype'] == 'viewcode':
        return make_refnode(app.builder, node['refdoc'], node['reftarget'],
                            node['refid'], contnode)


def collect_pages(app):
    env = app.builder.env
    if not hasattr(env, '_viewcode_modules'):
        return
    highlighter = app.builder.highlighter
    urito = app.builder.get_relative_uri

    modnames = set(env._viewcode_modules)

    # app.builder.info(' (%d module code pages)' %
    #                  len(env._viewcode_modules), nonl=1)
    for modname, entry in app.status_iterator(
            iteritems(env._viewcode_modules), 'highlighting module code... ',
            blue, len(env._viewcode_modules), lambda x: x[0]):
        if not entry:
            continue
        code, tags, used, refname = entry
        # construct a page name for the highlighted source
        pagename = '_modules/' + modname.replace('.', '/')
        # highlight the source using the builder's highlighter
        highlighted = highlighter.highlight_block(code, 'python',
                                                  linenos=False)
        # split the code into lines
        lines = highlighted.splitlines()
        # split off wrap markup from the first line of the actual code
        before, after = lines[0].split('<pre>')
        lines[0:1] = [before + '<pre>', after]
        # nothing to do for the last line; it always starts with </pre> anyway
        # now that we have code lines (starting at index 1), insert anchors for
        # the collected tags (HACK: this only works if the tag boundaries are
        # properly nested!)
        maxindex = len(lines) - 1
        for name, docname in iteritems(used):
            type, start, end = tags[name]
            backlink = urito(pagename, docname) + '#' + refname + '.' + name
            lines[start] = (
                '<div class="viewcode-block" id="%s"><a class="viewcode-back" '
                'href="%s">%s</a>' % (name, backlink, _('[docs]'))
                + lines[start])
            lines[min(end - 1, maxindex)] += '</div>'
        # try to find parents (for submodules)
        parents = []
        parent = modname
        while '.' in parent:
            parent = parent.rsplit('.', 1)[0]
            if parent in modnames:
                parents.append({
                    'link': urito(pagename,
                                  '_modules/' + parent.replace('.', '/')),
                    'title': parent})
        parents.append({'link': urito(pagename, '_modules/index'),
                        'title': _('Module code')})
        parents.reverse()
        # putting it all together
        context = {
            'parents': parents,
            'title': modname,
            'body': (_('<h1>Source code for %s</h1>') % modname +
                     '\n'.join(lines)),
        }
        yield (pagename, context, 'page.html')

    if not modnames:
        return

    html = ['\n']
    # the stack logic is needed for using nested lists for submodules
    stack = ['']
    for modname in sorted(modnames):
        if modname.startswith(stack[-1]):
            stack.append(modname + '.')
            html.append('<ul>')
        else:
            stack.pop()
            while not modname.startswith(stack[-1]):
                stack.pop()
                html.append('</ul>')
            stack.append(modname + '.')
        html.append('<li><a href="%s">%s</a></li>\n' % (
            urito('_modules/index',
                  '_modules/' + modname.replace('.', '/')),
            modname))
    html.append('</ul>' * (len(stack) - 1))
    context = {
        'title': _('Overview: module code'),
        'body': (_('<h1>All modules for which code is available</h1>') +
                 ''.join(html)),
    }
    yield ('_modules/index', context, 'page.html')


def setup(app):
    app.add_config_value('viewcode_import', True, False)
    app.connect('doctree-read', doctree_read)
    app.connect('env-merge-info', env_merge_info)
    app.connect('html-collect-pages', collect_pages)
    app.connect('missing-reference', missing_reference)
    # app.add_config_value('viewcode_include_modules', [], 'env')
    # app.add_config_value('viewcode_exclude_modules', [], 'env')
    return {'version': sphinx.__display_version__, 'parallel_read_safe': True}
multipletau-0.1.5/doc/index.rst000066400000000000000000000010331263310464700164730ustar00rootroot00000000000000multipletau reference
=====================

General
:::::::

.. automodule:: multipletau
   :members:

Methods
:::::::

Summary:

.. autosummary::
    autocorrelate
    correlate
    correlate_numpy

For a quick overview, see :ref:`genindex`.

Autocorrelation
---------------
.. autofunction:: autocorrelate

Cross-correlation
-----------------
.. autofunction:: correlate

Cross-correlation (NumPy)
-------------------------
.. autofunction:: correlate_numpy

Examples
========
.. automodule:: compare_correlation_methods
   :members:
multipletau-0.1.5/examples/000077500000000000000000000000001263310464700157065ustar00rootroot00000000000000multipletau-0.1.5/examples/compare_correlation_methods.png000066400000000000000000002352101263310464700241710ustar00rootroot00000000000000[binary PNG image data omitted: plot comparing the correlation methods]
,JE&Mӧeʔa8::}vze˖x{{waggǂ kXYYadd3Θ?xyyѭ[7`Ĉxzz2rHLMM`ŊtRS3gdٲeܺu iӦ1j(ZhgNt&N… Wt҅;2gΜ}[E׹(Y$bmmkp ,,Q!z(ĸ_o @ff&111|G899q}ҰD=deeXdQx{{cgg+{q {A033ŋԪU)^oKKKpvvĉ̝;kkk&MBa,M,"/(ނϚ5 ///|||:u*t5j`iiɨQ8p 7K,ÃsaJ(-ysǎ;v,]t!""7o2p@:w#A4i^^^x{{3e[lllOGPfMΎΝ;?W"у[lIbb"SNiӆB!<|/&MbҤIT\={oQhQ-X"W&""B֭[ӰaCԩ3Vz7nΝ;TVmR^=fΜ3d>tBpp0vvvlRq1zh&N5?L2O}oRd"BBow (S 9i筡otB!"I"Bێ>уЅB!Tm*!Fz@B!F!B!DD!Bo$B!BI@B!F!B!DixȎnn6-{wށҥEعm|~t ƼG5"vyhRS{lYcga-6f6?M>.2ÅBaPj׮M*U:uCO H9~۴yvҹׯYd߾g "׍6Ž$IX07N$OmT)5$sHPڌ [+lmȲA+ƪؙ=1yo$3B!-^p޽P^Jzk(BCCIJJ5F&$X˗?y;x덟ij6͹zx!) gIJ$鎆w$CR Ii%(Y>J`|>&R1|<ns,y ةV%ai]LM06F#1w8N9ܵ$so&֚,[4E0ٙajdoB! (h4t32205}}񊢼.I@ɫafje//<{qqZ Aٿ*JvOݻoc%{}5I)M+'`|Uv5=si@f`JVq2ɘ$adrcT-6yrp)3..XX麟3.V.XZ&B 6_rq b)S&׿υ 8wݻw@"1c{. '=="##)[v|߿333WΪU(R :իWs=VԩSZ*駟r16mرcPFFF,_+uV;СCٽ{7VVVԯ_Sl2ON||7$ T`a`5ֆk( ݻKbyLar^&0Ir7{iLLdTM3MR`TnܶFm.b{{]2eNbbfrB'|BŊIIIaѴlْgn[fMMƘ1cȾ ?{,֭ƆÇӸqcN81ԭ[={2c LLL"++ aÆ?tRiРgΜ^9o2ePH,YB~سgIII2}t>|8ڵc֭[ff&Ǐۛ7n0x`v?[&>>[[[,,,8q"+V`Μ9xyyc:uꄓK~J$RUR{PRRt1rVXr;gUr n=Iq{6$q].Y!N}.yh@S{U(mWWW*`N!Djժ ɓ' hsvv֖>}uֱgX|9J_u|T^3gj 55ٳgd4hyؼy3 ,O>n_Pn]ʕ+ǤIK/u͍3g<֭[7L>ի6 rvv3qDnJ5ڵ9sH$Z ًΪ!u*r%'SZRWPeJ/g'1jrDƲm$)vq&n oLL)e[ "q]E)EBaP ..Kgӧ3f #11?w $$$`nnRmg/|O.]:,YΝ;L4IH۷/q:燙/^VZ/x:I@xA&&J^9s=ʩI_nE C\W.y ?OLiWt9x_~!9==IJ/E̋; K˗xqpp`Μ9#뽼(U?SN1yd6IIIa۶mTX+++h޼9zbΜ9X[[3bJ,IٳΌ9 *пz)۷o]v888зo_JѢE)U_5>GO=&EQrߟyưað̙3^ RtsssԔHzͱc7nNKFRn:5j%666|'hY&DGGcggGg=xM<$ B5 T2"'w$.&pw'pף R v5-8}4qqeձU\Lm%M !D!VYj}*TLJӧSvm {ʕ+۷/+Vz?^gFpp0}}ܾ}[; E4h~zm6mbԨQԨQ y4i>SZ56nܨre2 é_>.]Fi{;'''/^̨Q$ ɓ'k'%J3buF.]Xp!ƍɉ'r9)B@@FʃOͦR+ZǏgє/_GZ#DPu N^IvZSOXJex\m#GߎIL8u3ۘPX%*Tr쥼sy̍_n@Bv]/ vNK?._̄ g=Rs-~s>ص9=lscwj,IN@}*W@eh A# !!988BXʬ(TF:RXmRRɥ?I!"HO>`233ILLw8B`Q:P___s&f&^ e?z ?|:*uy̝3:I3{^٢e ,H`@jA%J2B!^' s`Ν;ݻwsQT9{^YYY 8^zQ-A5sc/S3x,pV":I6V ﷼iV{ { I{yore/d'%,%mK>!0!B] >ɓ'SJ:uꄽ}:c1{lضm۫)13#/|OQ[y[S-ŋ`W*(r B3r!7{/?2)Kٖ{mBJ$ !"W,Y֭[?y۷of̘13|+ ?GoG88⟂m3oݛsdsqr\2:\JT*H޵{y/v]FP¦!!{mzIB"B$ <^z> GGGZ[oe/={f%%e'#;vvҘb.yogRR+-}[ҷ%ٝ Q츸VdQ̺!CgJ)ǃB!>|Rn]χ~>}y1m4._-!\x[[\oy@XXaaayHh6䛉0},:u)Ds>d0U}ӯW&ԭ ؙѤ\k{D'D.Dh O`mjB! 
6S`DxU4h@ӧO*EhhS|L2E"D~> #w~鐕e@P>mm&66Ƴ| oMH*$OnB|׋ =x0'* EQPTܻwڽ}6wֹ/]Q>3RRR>}:9f2Pw`Fش Μɾ+(sړX}025LQN9ͦxv#o'Q*֮4-הMQ  %… )SX*V"""|2ڰaxSNu}AJ@ ֭[? nuppy9͚5{6x#Bf sىƍ,!2S-G{\ JE9rs(ǀ :!ߘ{p.GrMy{.cBtڕd֮]mݾ}iӦq<0 2/// Bɒ{{O@/^ST2[L!"a;l> 9vr,mR|xʜ*9S{Ba E믿lٲSti&L]QBCCё޽{]ߵkWZl)^8\xZ͚5k ‚+V0||}}חi4zA2eLJH.]ʯZFVsN.]Dv-ZpN IDATŧVq7Kڵ4hÆ WWW>s .VutbBVi&T%#11___ر#<0`Eɉ1chTP!qT\cj_7k֌+W> HJJRƎTZUqrrRjժ)JrrrJLLLWaC[EqsSPkkEiJQ-R7r.%=EJ;(V"Pޚ=B!( waÔE*K.UΝ;_… EQUiӦrqe۶mJ2e]jҥbcctE9qr JR<<5s+W(JR|}}-Z(-ZP|}}J)SFzjchF(G((*TVʄە?+,M2Ҕ_Ntb;V!M7z8!DaPݻ+ ,uܹsE*iiiڲ+FFF͛7EN@\]]Gi<(iSYjNٸq`}1_Wڴi}ݥKE:u-[蔥+ʦMrmȑ#JR.^S;:eիWWFĘl۶M[gҤIJI4lPg˗#???ƍ+Ӿ8p۷JDGG8 |ùq;7YҦMҥK╩TPB2jܸst2j5ƗhwVկFFXXܧ9}0!Og|SnJmڼ_}Z"g!˺v-{yss{z'N?}ꚽ'ONݺurXXof`4 N *:jժSSS9wݻwؒ=6H"Ooe…\tA*Uz\̙3蔧sܹ\y<33T*Uۜ\]]yScͿgrvvwww};00P@&OѵW^t];!ʕ+6m6LqO@6lAr$5bРA̛7O !^֍AݠwCf`TbFxq\ܤo[R*`d9|[ʷIX{r-+鿾?-|ZЭr7ꕩ1!3̙B?6ڶNBƎ[Ix򂥥eVVVSRR1 5jЩgdVbСL2 llluw"N1}=~PtJJ EYgbbcߗǏyѣ\vT*Uj4gfffϘ#ڴiSq߇a4RؓtqqCB>EiYU~325[|g'ClYER- j-FF1/B*V_c,]D(nS;ӵrW}xBw@sg= / lB=rcɒ%ihj5/ ŋٳ=6ӧOmٙ3gtSSS233u `͚5899yHHH@W^RJ>O(Jdk޽+WNӥK-Z)aaalx𽧧gĥ/? /+V ###Ǻ VZ߳7XM`Ϸ@m.d`Юй3'ԭ %KBx8ƕ!C8({w f[8v1i|dB<剟Jbnn6l˖-ٳݻ бcGҥ Ǐg 8Ν;k/_ĉ1c=zE=!yʕlڴxF́tz<<<8r$&&Iǎqtty޽ŠArJJ*:@Av^YXXȤIcǎ|g/"E6O>͖s[w ]~-_?}A߇$B(l۶ ^B 6oy``5 [w. 
T-wT*)Դ+ jzcW [3[G5>bqbo07HBJYBP'OwBUӿ'?Iѓ~quNiOSNABI<(zd/^ d⒣ݻw9r7fݺu  &&|٧°\wsp^9<6m㏡F mEQXQcK]|Q K1r!HKK#..NaR|||)3WIMM֭[q Jʊ}2f̘Q+^$T7/H N@ 6mP̻{_~alXj.Iò T/Q]߇!xKKKH2Aرc;v 777f̘}x9z({e̙8;;~?N۶m Y|yP>S<<Zo+96tkteS.d ȿ]p͛HIIk׮DFF |Ǐ-BBFFд)lGpNk.kJ2 տ.ViWGeyߎ'`na?qY}Bo r ȓܿ\U'h4p.^cC'0LW!_bedW|ZdIDV-TwFdȅB6vj= f͢lٲQtiqwwC3jJ,}Bծ`3y /ײ-]׬C߃OE/Xrx 1ԌT}/B6̞=QpF *Ub49{,SNeƍ 6,@&sd/Ov`^}Lv ,-}t=rחQ9搩wB!D3dƌԯ_?S@&M0~xNʗ/ÇhԨsR\t<=HmB+WrJd=E;wѢE̞=;wbffFtt4uކejjʏ?{wg>}:RbEu6pyaр"q7˖a6b\[GRƷZjY!3WO@rsY֭[ 4x[n;G $!!ׯcdFCSJC&a򎪸onF-TFj鿾?Gn/&ԝCBa @& ye˖ܿwyŋsu/_N||<-s91PQuO.˕Բe\S*v#SwcTw={V>!Ю_o:VׯӦML2lڴ)C!g⬽\v5F@^߾95M5?$hA~vB!\V122BV?ۤWٴi׮]###D֭G%!8{ "#q1+m `a4u!f3f oR^C$BaX͍a} sBd$orA&;`5j2^B} fφ?>uukL'Ofd+ߎ~dEezڸ;j!0[B<'ص /b6O;w6tV+,?Z<BR /һwoʕ+=;w 11r!=G(zRC\+K V9oUd4nFߺS{^ !x|r Tš5k 99Lٽ{73gsBg66²gؚ\z%X'y뻷wY,}G, e Ȱa(RNb97i҄]v!2!0,*t(Ke:T=ˀ tЇO6}B Bo >R G233YjB jFH˓ׇ^L`jé躃743F;d!' FbÆ Ӈcǎpu6oLz8q#FsBa<<`3V__XZғ6|Dr}G+3QF,^իW @NhРbٲe9J!(T< 5kBи13 6v[y뻷XytBQHܓsвeK6oӧh4-[ `cc`)Q~֭gBrQ?omo)jQT !(D :IMMTR9CҲeK}$GӦP97kkC sYJkF]7_L=zV!D!aз`YYYallYBkkom̊#'**ø-voѾG)Tg=JwB! N@ڴiÏ?(E! }qf1;CtKki #h>s9xCBQ|ҡCn޼Iڵ9!Δ ;& X?^>|Vއ4$p~ S" !xi=vwڕkJEVvwYb1;\!A' NJByaN[5 ֮snuɊ& 1IN\ !( z m۶寿w(B22>'zuhv=p #}`65ĭT!D` Zˋ0`˗ښҥKӾ}{N>Z'Kr}?%GY:* .Y BVY#Ba`TIݜjߟc  &&bBY|~@=> 9xo}ÀPTS!Xvj`i4Z(ܸqGG[W)]4iqnG>՚V}pWa !0˗sUڷoP@q̚@~uƫtj⮷QTS;D!ǀiԨc׮]+VۄcggSFXXK!7l?fqz꟮Ĭn?QD5}G(ʕ+YrNYrrɝςヷ7k[hA\\K?'Çԯ_CejԨ6B0kk?ǏRse ;YFi" IDAT!^#C~5A}tܛD+MسVR[*E-UvUQZ-UEmbH);Wɽzusqyw<7oޤUVoѢ7nxcbb޽;Geƍ&B!2F rb~¢܍qms !&f?ݝӧOٳP#Gdk֬Yd޽_]!>Б&`T/lΦ =!<]B!^ft֍9sɰað $$dBm9sNپ}{}:N!"ɬ|]bS,َc~tZ:uHS+ˆ~Hhh(۷g޽XZZၦiCƍٶm[|bm B ¢xo~_ז!X{m!f͇5fW v¤̾4k,M6BZ{(n]-Ϋ;S<`!]?,ʜ}Uo?;GCŊP0arS<͟H!ɓ'899ޞGecDB!S"u9IjzԤ|3bbc~Z5k8س<;wo\d^&ůBC! 
HѢE9tP:DY#^!^j>o9Oހhӧ3H]gU] gH]˺ B17`ݺu̝;̙3o0B!A3{<r(N{NbH|S B_;Ɔ;wl2jժ~~~Ԯ]˗{lK>BN^]#鲱 /`bخZ b]5k‰ Z!z@̍eBmo }΍M]7QP7z  QjI_[̷+&`nfG4.\ooƅ LB3б\GN}gkg-Z`d3TBr9"ٲe JRJmۖmRR%J*֭[MB+$~U/ogKB"C2ר OCD{7alܹl"2dǎtNԩSټy37ofԩtܙ~Q !05K{;־7SsqM=8˕jA V|:AAoZ!r#R^=9tIҰaClll8|pcncB$w6uk5fngŌq0y !r3s5_gϞo߾ɒPUz-Μ9cȄBrdުL{SN1JO=6!x#h#- 2Q9fX[[ѣT?yDB- _[Ⱥ~y;5әoX\Ϻ)ߗ;B6nwwԑ9f4mڔyɫ9ryѬY3D&"'Q'ʎK̯0d~ȼ<8}3ߦiX a&|9c 66{">6m6664lؐzѷo_Kݺu_>666L6a !0cJsdWϐ_Eay 9ڏ?^ǦQ]U7D,4ŋ25)uB۶a Hɒ%9s |?f0|pΜ9C%LB3gci ˏv7-?w|U` Ht%Phx$1Oɓo80'  `\tp5kTۡL8VZ^gժUF\!V>E>|4Xހ}E1+ Be@nh`H];^^{M%I__eolBd' 7?Lu޽{iҤ ɓtUV0RB!W)RwQF1q4_Lpt ԣS# _mKԏQΟW]e yAIϵkptt|ɓ'~dBl!g_>׷g莡Ex;С0b?ϸNDb"Св4C&D:3UVr?O2K&;ɓ'={6mdctB!^6V[]#ﳟuQ)kp nǩHVc܀Ev ,XZjFA`eEǎzD0Gf~~~$_~~~X[[3d-[f∅Bt:j4k2nmM-߰V?K0sUE+W`飆ΜUd$¬eȻ˻ 'sΥC&J!DnP)%>'~¨]vi:K5T}oƆn2톴}͙<"2cXƌQk6nf̀c)Xԑ *I͛|!V66l9=oF *WWfxk8M~v?ʢEC*U+ad'6eB,{@uA+V,#IjĈ8;;'ֳgOz왭q!:<q|ڏE7U&&lMߏ) /MgtioQíi<<Gcغz6U„֭[Ǻul 4DfxzzOӡi: ={6իWs !~6ά츒e;0T2^[Dr kƆ?FYL^R0K/GjUspǏƌ77РJrA}; o5<~Ux Rz ~IjԨa3dɶêUȟ??C5AdB!rN;Qh}mD;ѣR浚{-Agbkhg "/ZfA6P<ʩ ӧ×_>{or??5Y% 7oBPP|ԯV/\PUq}ҷoT3:us̟?|}Um۶6lN_.(P=ù0*,7{thiɧ_; /,톺u/h' ™30sfN`$QD|tM`Ȑ:ŕ˪$ngM,9iooO~3gNڙ9s&&L`…t:6ō 8qb!/NGWzq 4lLϟzǎ>PMBeT&M(ʁ- K&"j\uBBLwvд$% aM* @ll,T7n 66Xbbb>' ! 8`C g9Jo+Wݻa VX>>PPvȑ\Oofg@|dWCJ@FfMUM@ؾ};_5ժU3u8B!rN;q t,ב&\;ǫǷBG^B WRdϫΜQuI EU5c ^^Ņ:`ggǂ LB\֕W.ݡ*sO&=P/?L0;ǎ)E1ߣQ|*YzqA>WIHH‚YZ0=}5ԙ$ &$ۦț7/^^^hKKB!r楚sn9Ɨdݿz>m˴M8U+W>7O6F./aDjI Egΰ NvQܽpza=z-\|l4WH( 2u4 =Rw4Mfg:!`66Ll47*{;uѩ\'fgB/HӦAY`gݻCts;¸`W m8 *\%`:U!U$ti5q JIh>}:ԬihX[Cd0of?K!ȉJfg>#wPn~9>)!!Ф 'kf쀇eP < ]%ж-jۣ5UQBMuJ>\]ɓ #˄ѣɣѣ6O?q)M4(zJ BSttԝe2Tf=W0T{Q^s:TaЭs;n܀m5Wtjpaz4JʀZR6==ygboo(˓""` [x46fШQ#|||pqq! 
7778 L_t+&έ[Md#^}`,Y~ZXxCY.zذA9pv"ET0 ;k1k:XZJGz>y)%J NG```zq(WBabz^bf<ňGp?D 4{qxz?fZƸ'YSp_$oݿƎUõ<fa5PyBpf'3hѢygeeElU!&0a e[7 /;-ŋaT-bl: ۶p uSᙳpɓ\Vψ2jڴ)[ls~={6̟?vڙ0B!"s>5z7O28?=%d莡24e*bѿJݻa8}z#W/56hJ#5h>S Ʀi*511j|1c駟 ;v,}eӦMl۶^z1k,G)B8++C_TY ڴ1nii|N\m>)^8;wfekkҥK ߟ+Wl(ḆȯU/Y'NdĸW3܇9p)/OuDH-<\~ԫ׮a<%J9xj|>#6BCaBU*%#>JnLdJ" 5k(8r<gkքcT H޼bF 5`hߞ ynpUP] +RLI|`:7:) 'ls:5(ztE̓YY!2 ?>JF9K/W' ?#_}ӦMcٳŋG:V#L@(/9""c H20q"/\:u2eoZʐ>}Z!&7~'9T_z3 q*p yNbT|c ߜ%r IDATT{ѕos ĢW[W+OiҔp)Bp*L] 0pwWǧo'F%Z55#q+Wj.{J4)Q"ss~Q:9{ְ' kqC>3 $w|ODF{@*TPԒ%;on{QPdfwHB!D2G 6{T[yXtAjby;ر|44z8s3$ޞb*/rB믗iB%;+ Qر0v,L~ٹf*OW5+,!Cͷi66ٴEv@*T(Wsmh4Khv \])XPȍyjΝQ%|aO@-Nz[I=dzg!eoʕĨ..ŭnݴϭruu 3l B!Lae><U4 ֯G>7ϧq. ^He1DY|wVCwYYXQ޽.+eP=<T_JS/tۈOԟGaxQkHd !42!㷅FNxt8D$\g/d*_k_:TDCPOeuIvz̵jEh58o;pP)cIlPd^dw)>s䱝ږ&/$_;P!tɰPUpo )٢89W*̋:)TPì="{Gsm={gϞ R!xN#՝Oׯ2R|!zͥy*+KŘj767GظbB)RĨM i1DDþDD[Ob&*Fjh1$^C#VEkiI^o-ILhjK* lY=L-Q^d5I_pm{~L]+w^?)^6 [\ ,>3w ' E*41$P}lQM1nݺu[4 @ҢbG,--$ۧLt:Ν;s N8]a ![Dt:M?SkhM״ٳ5-68x8VgDE"#h:6l4''M7/[N5SiȞ=$Mqaqm_ ~7hA\Ik:=i45ǎQRݻ+,/0Ф3\}u5&g]fMصK59 Txjx%fE'cxbJ P_ݾn[Rz= _Q,6iruRvmvرc3f /I&ܺuӧ: *-_PQ#5c>5'ɠXv ʔA jLM60Sl(8b$(s7ؽ訬YUp| =Y|A >5Uȑm۶A~81uɜ@!BN=6ߵ 6n[͉hU`8{VMjNi|o֋FO {Itx}fBްg̙Cهt/O=%/7\6"eу&"cB?]_ǵPiL|GQjNXeTGEN~4B!Ș^SCWC IGz_dUth|?~<8wtiHwFY n2*n^vT0'j$ p aWD 5* .V;BO3P&= gϪ߇jTnF)I"B2h!CRs?s&LWSh}muW [H:! 
@uC/ KfxtafnT^TÊ{ GMSSMKo YRQp08|M?IQ\(7 MS۷=w^^j54a :} Nx]ABfUYvIWWxذ&U?Nɓ%IXB)ᆱ&ܮ_ƾwjҠAj1]ӱSѰn̙%7z밴40EĒt^1dɄoP%$ /$!"<<{w5l .djr2[65^9f&w#~U4z@Uʀjɓj\UFUKOWW5¤TZmժ% 'ɏE!KӦNJ74[v}ߞ=}` w* wk`ɆUOW* Hdz͓KT=8w睊'W&}j2.# B!r++[O&):yL={`JE67=-.FFBp0jMncojQ*ðZêkG{C̭XU⢆.X ֬XC51oV8m=z5NYudN )%JRO-,zMCݝ6TnC8z@=x $nNs!u?OOWlZ7a55>tfHGgڵa!B!k֬W^CgΝxzzbVB!f ,,7oҲeKgp$B!Bd.B!6dӧSX1^:!!!I5j[[[qttm۶It>|^ϔ)SLsS"A}]xWYdYD$Dz\Ǐ2jdwȕ+Waƍ7C:4!R4}tn޼ɭ[hѢ[9Kl}'1ggg%ނ  eÆ ߟ۷o:$!F޽7nFm[#Yvmݸ&Lښʕ+ӣGvaHH~ԪU ;;;lll5jNNNY2Lzz=uǏg:(CHkNGv_>[n5QB$Hzk׮)S6 eĉj WWWz=VJ؈ƌvvvԭ[ݻw'9W^w,X@ҥS!OċUc\t… SP!6li0`@v|<1ר(Y/;X|M,,,Rm;Ӵ\ƍN<==ƍk:N[jUC>#mɒ%Z5+++СCiW^oEDDh.\ ({HIV_ڮ]p-""B5kVP!-(((>xeSڃkݻwƎfG/^kj!!!ZTTaE}vV}$ˎׁjZxxxV|MTu=x@4M;~xGt:6smᚗV~4uYsttJ(-YĸBY}i5k5WWWiӦکSAD_۷6eʔ.r^ƍk...Z޼y j4BV_7ot:fgg988e;fE_[bEVV_ٳ\)ŋkkԩS)S&zݵjӦK*r^EN"׫Iz$ݣPBɶmH\"'U$rܯWI@uqM²;$!R%׫Iz9\"'1Utl{xxx~!̅\"'U$rܯWI@QPݻGv$Dz9\"'U= IDAT$~Jjժqel?z(UV5EXBHW*r^ENb׫$ ҥ 111,^8~[DD+Vnݺ-)DvU$rDWk̟?.m۶q- kצk׮;RT)VZŭ[dyz9\"'U$/jԲ94NzM\xx6zhPBVNm׮]&\FrDW*rziZeB!ˆdB!"H"B!6!B!$ B!Bl# B!"H"B!6!B!$ B!Bl# B!"H"B!6!B!$ BZr%z[n:yzzҮ];Sa4a!D Blth^#YI!N4S D$B!BdI@"غu+m۶p_|Alllc[J,u4jԈƍt5kPvmquuۛ?#qvRT)V^d'O5j+Wgggڴiٳgo>z=7ndʔ))R[[[5kƵkגۨQ#*W̅ hܸ1)R:Y|L8///lll(Vcƌ!22ҠB$B\bժU8991rH͛G50aq;}+_ͫJN{A>s郵5'OfҤI-Z{r ]ve˖̚5yҷo_.\ܵkغu+۷gٌ=sͽ{b֭|G;#GЫW$t:>>xyy%oaagvz *{lbŒmsqqɓ'49s`n޼ILLL>wwtیK PHϝxnɕ+WR>>l۶aÆM(I/ۮRظq#3g$(( 6`eeEoOB!@6B!^#%%'''ZhA``B!zt) ÇԨQ8A !`H B={ej {{{N<ɒ%K(Vz !oKB@˗ٳg:t`ɒ|!xo"B!42D!Bi$B!BdI@B!F!B!DD!Bi$B!BdI@B!F!B!DD!Bi$B!BdI@B!F!B!DD!Bi$B!BdI@B!F!B!DV ȉ'puuښE2abcc]p 3gNh߾=OŋShQ,--bܹVB!(EQě8wҳgO9|0˖-iӦlܸpJ.DEE1m4DZc033H^hѢgLrHƏ@DD...iwqq!22D̈GGGrnݺngΝxxxZeB!Ȫbcc ~i}!$ wܡaÆO888e&NHܹӧn6,ss4[XX_n[/277O33;wm۶xKB!Bd+VЦMc} &pM.]+͚5#%%aÆѺuk]D|||cI,--IHHH\qqq/mԃᳳ0s⼆]x޴월{]c~3\C!׫a5sj: Ϗ }szڶm5lȷ~9~~~L>]7^zaffټy3c֬Yg"|l]D&׫NzJ*EQciB!n݂ E' B!R>y8qdu!Ba8qdu!Ba'N;A!B! .XN'N!B!DixKIxHH>R=O' 1ǦLBly\ )*t^ZygU$$IHVdJBdRRTQ૴ǃC'p<8s%bo;njBELڍ`nnOU! 
L|J0tW<*W~C`ڴWױa瑗pcsST'&7 )33S4$`FbF3&V*m^Zr$2VxXlF{=9ј$1I4+x4)1SPjFeX>}on\%2Fõ67Kݬp( Zi^ !aiiI@ $9^u__Q^uu8 &FL,IĨxh::L$LPL01K"_%֘Yh[2':9sZ6lrbkkKժro/JK῏QUPAd$h*  l,lrzm ˝;\|㷎s7.k6=;͏Up{tG;$p;qvZR[I2ځ @ `fB!2;)% 'Odر:t8<==޽;=… 0Chhܸ13f1M}/fڴi?}̷$k5O<<_[6>)^׊r;6_ڣk=½ l;1wt*5nܴI6).S(c0ݵ.`essb# {{gS2ŧ[[]U!"V Ȯ]hҤ >>>=9r?pM]pU&M"**iӦq9;Yoի-Z`߿>}-!>z۸n2\|Ga\{|Ga= +wQ(hPµ+5>FYay>sq \Y* EQPT\z} ^^^TRuֽ\޽ ŋg֭K`` ݺu 66www*U͛uǷk׎7r lmjӘ->)Onh[NKN%*_ʑ"E(0^)^ĕvV$Y`1<|ʃH_QTWEmg}MJh!B˗+U0/dWuL:aÆ'Oʕ+G%ҔyOYjweĉ`ii֯_.]6^^^YFI޽ӧ+Wd֭i0sSs }t%qe.޿}<ȩ[kEc"NE(REh@IQ78})/ZLf*8%J,YȑB!^_F8U۷odw&W\ܸqMrei׮3gܜ7or=ʖ-x___o߮{})4e˔)ZӒ0Dȭ̧()|ryb_rnEҕsF x+oƛP<&'i8q/DmH:\ǯjc.ߺB}= f(& >e˖O>._LRR͚5k׮|0g=zĪU%...DFFhppp֭[i"PԺ'u $ [{3AL:8 k3kJ.MJxݢ"%zwRp ]KorI4QkWYHQ!Hυ E4)W!ԉӧՋYfЬY3 d`nnx @;̌X4M277%Dv<y}뫷^=9ۧ9}4'04%JM"ڄ$7 Ʊv@yűAjBᥱpiw)Bd=aaƎ {0h?ҰaC||| }֭[moݺ5=z"E8z,--IxIG8]9!>NNC:myO]Rri6]ܤ[cQ*t@p+@!bVarz¯UQoN,BӧAd/$99vѻwo01y2YΞ=uryrR3 늕ZDDa...$''s}nX DFFz`cui$!2K34%)J \?8~7t&d%Jem,I{᫱`eMBUiB!D &88XoǏM 888HB> Vٲeٽ{7|'j899ꊓǏOscֽ.]4Ǐak)8q9sfLC{c:xw+ćFjR/$^?U_';/ڢ m2RƛOׯC<ڄF!o-/ğMÛU4ٻw!ӪU+&OŋYnE033F4oޜ˗˗4hZjaoocmmMƍ3XkWO9~#G_w"&%42ۅW1H.2Q}7;i/os*wdS4&R!06/mVs,Y$U޽{YnÇ'O< >kRfMOTTSNdɒtIW&LO>jՊzqV\I@@@ !T*n[ٷ]?ް|cr೭ z_, Q KM׮cpfi IDATO-MCѺnʪB!I@ތWBOJJbŊlݺׯ?~7nL۶mƄK,][nA>}+wy177qL>''4u.Zӧ/_>훦ԲJBdU qK!\}II`r**i[ۿػQYw;p)obR!Xwko><<-(B!lMo gW& ?^z?~\rQ@_} sN cǎ4jԈ]vѷo_CR2I&,ltZ_:ygMFu*N3~(-[vӵ "i^ ;ɤw(CQQƎ"3: {ݻL4;wh43޽{Bڄ*P%_fԟXw~ίcɑ/%*:!ŝ-[N@u|i-oW+vy ,nA4u!v RެGܻwݻws5@HݺuqVgjxڕُ:N`~nGߦ}!k.[`WR9H luʿu!wN_NV͛_3$e_B_rJ2{J_Xϓ'ѻ#_{˗/|ZD?͡9:R~.!5ߍ3Y`=[<_|z_Yy!04 =kS۳6c륭|g]hZ);Q`W.zhfesEyFm9~|f;Bݼ0HđսWJ"66FkQT$'٬5B>,L-h^9͋5n]V]Ki1yr}tF!B#OuO){yL~B!8cG=Wdm%z?VO>ŋC4)0*gkgT7S,='2jMw4+ F<%=$5ZTѿPV\!i8đսWұcW\xc!DYgVR(R2.eRw ίcɅ|Kݑ>=(h|"*n/qBB Z|BAܹ3=z|?v ,[JVXAѢE\pm;,̒vѮT;;O?"FOѷ\_Z&e78QX:ݫ]7D!oŠ Ȳe˨SKWlٲ>)Z|,D6S̩3db/96+RȾ}|ѻ#ģV-m{7xy;t!Y@߾Ǝ"{0JefR!ފKwd|_W_o(3xm#+۶;d!Y$o[@6mĦMx… ٽ{wr>d)"éT**W{%כw|sͥi e*G6 Yy]!ǡuk6v{' k֬AzeT*T^3f)"Sta||W;V]̣3~,skw3p>y.kkc+իgx5|p`ѢEDEE=iBd04Y~=~~~ܽ{k+,,,(Y$G6)h+WӧOӻwozť2dC 
TX&MFꫯ4h~)o۴jwn߾́1cFQF8;;s1zI=yS{Cqppĉ]ݻwӷo_|r.]dÆ zE@@+V 003`ڶml `aFC.9# [GNg7">q!{X .\T]V?~իBӲ#> :dҤI+W#,X@ΝUaaaTPOrE[ıVfVKRѸqcz aØ9s&!!!zO7`{رŋ3dȐמkkkLLLpvv~myfϞ '|”)SHHHoc:tVZj* ҒbŊ1w\4i”)Sprrb֬Y >f͚`vܩ;g||<&MbϞ=5=<<8pTvU zAn?߃=QxRT66\ٚ,>Rh*h4DF!Dr#> ?I&ajjjhozO yl,1xC)bK,:O<ܻwOo[ŊuMLL([,/>(^ܹsSD kZw… x{{-|YR%RRRh4ܾ}[X~?<}:u;!!AVoNJiU6n1$~9H;|XqݙS EBc/MmcǦ > XYYhܜ~ggg^jSf;E=ߡ^CzqV,JYRPiSUSR M6$%%L| ylʠ-tT@RRDTTO>J5 5 [bX|:ZpѣGӼg Ŋܜk׮Qj S||mܙ7aU1n8zTHBl\+ǐҬ7 ӧk!Ȓd~gA_|6m">>#Gw^prr>"P嵭7o7nŋӇǏӹsgʗ/Çʕ+Z˗_@_Μ9IHH0Xmڴ‚:_B~h߾=NNNߟɓ'i&.^H޽y9s2x` @PPW\ɓ̙33v |͛7coo=+V`۶m,YիW3vXz7oN Y&/obdΝDFFK˖-[.sՕ4hڵCTT 4O0QF1i$+FÆ پ};lxS%r`[(B5i35 GBL )U!D:6Yu}:`u|,BQ֞_=ù*]tyhX uכjr!$$+igY ]!JE8<3dZVJ8~%:}CB!{ B/P*fRTLXv$Dјh_?mKe1|pm=X{5r78:`&cFB!{% իWc5Ȱ٫9X90\z]}?>}9]I^0y2tQB!D^ Ȳe ۛ8q"Fx;wNo߅ 0`Bиqcf̘cz/^̴i YoC!^sqvk *Ota"M+D޹3̟g# !Dotf^I,<<ӴSZ5^ʤIxm4޹0zhFKF`nnf^X4R177ו(Bqpp@VsYcj57o6vBUY'O=h?{ŧ$ȴBQ% gC~`x& Ee߾}/ݿi&J.Nu9GGG2ħ9..N% /Bt>;v`l۶۷oK,,YR7i} {F8QCBҳd68v̸dktЁRJѢE @;nˌ7Ç~O̚5p8vruz+0s ܿ_VBBڼql:̕+Wpqq|t(`bbb%$$JJ%cC'něFmoјȿ#!0g!-X׮4 .s &88XoǏ37Q EV+*JT (8;;+ZJSRN[**Jٶm^C)*JYbE*V:} P~)ΊRJc>o߮)SFh4ʦMZ8qBQEINNV *wwwC*^^^驌5JILL3fObbb(\tIZbaa+VLٵkRM6eǕdkRHt(ʿ\ɑ#Eeb:Ttާ;:!hAQAQJ4vdZY~뀌1m믿reRRR(T_|Tg%ذaޔ0rH(X ͛7g儇ݳg/_fРAkժ=ϧaÆښƍSٳ)T .ĉj ¯JPPcʔ)ԯ_;;;ݱ}ӦM[[[J.޽{ܹsĞ>}Fsquuٳt֍9s2d] 6qFLLLHII/Ņcǎ#ك5]tb-nes*. _sOeXa4)܄N:QyIeU00vB;gvjР;wGBBB^X1vFutڕ]2W+RD,dDHIgykkkժU#**'Or&OL|]!^_ѢEqoֵ=zߨSS10~(T vaηUiᦴО+2hWC MX zyB=tŋsyVZEjpwwQdS W^ 2;wryuF\\]ty5j`ժUT^&Eaպm^^^\~իWsfϞƍ_[ݺuC={0bĈwB|$z}  *3s*Ƒ.GQu\zpݻ0t(>mH͍AcЯF29FDL<ڵkGTTܹSo|~իW~kY&Ν֤I @߾}ϏQF1nܸlۆ ҥ ʕ@z3 !^ ??mKȺuPf&f9F4݆vx/fZi*+Kݾw/^ !Da IDATJQ 3aXbb"ϟwwwCT%n͏d{휽sK5Ǝ?^I*6ز^9q,%Mِ:v.]΄nO>ě1XbjjJ7TB!2)9̦tYzlmJF&̜@ ? 6\s.P/}I_8qymˌ~RYuSC&P+sox\vʕƠ ҥځCf~BT-u*U27 … y!BATjy26[ӷ/[f+oxzsƑ U;tL/97|}n݂ӧ ƚL}m[(P>[б#L?'NdNB#SbAɘSP!7oix_7ḄV_f͝˧?g?c[Ӧ\*뿦Ҫ>Uc61|@0p ̛c1YBf\\Ņ| 6 T.ip萶ɣJص ,] kfe1Blő42d%K^ZN!Ȣ\]! 
#b|n0?gA*x NocWNJ/V&ί(У,['^V/֮iox01AugAŊP>TmTiS8u*K/$fR' ={j{.Zdx & W^5duB!a]ap'76 5tjƵG: &?ѢX>o2%-_ܹt98;k;4:u۰Sa !hh 2n<مACV'LL~kC|TP3=}KwZmI]`֚w_`_8y5um?v[[عZ@`B|9|ƍ`eex B&::[2|ϟ϶mۈɈS ! */JY [ ;ִXâ&X*|pw|v/K(N4@񌕕ϡeKm.!0ɓ??zTkj`<={64i҄>}ЧOpuueΜ9>Bbn6pļ:ZhT*]tdXZP~QyX[qj)վ=0 NF;=o׮Щ̚e!Dݯ"#3? HPP| %J 88SNq))Q'j)Bqpp@VsYcNj57o~e;wn]IQe{۷X8:RqQOw9v,ѮGR պV<{hWL_b"^ͩv3y311 `00FN;}B3??)oΠc@f̘AժUٳg*)U͛7N:̘1=ر˗~ (C0SLaÆϟء[f„ G|$TE 3p:~_<T*,L-*P.P:4[_˱až7PqC eo\yjrS ]f]*vzJ e k?O;s2/Ġ- 7ZK>155E\xѐp\\\(_<ΘKHHxc/^LN oߞ g%2UQEaϫʹ3dҼXsN8ߌ/wCPy/_774&1>TșkH.Bwΐ>~lr}k׮+r:v숿?ׯ_GV @||<ΝKKKVʉTݻZ͎;‚[bbbBhh()))SbEq+V _k 6… cmmM=z4IIIcǎt,Z `/_ZjXZ?{gw!!B n(^(R(P$X"@[H[H?ݵhI EB%|?d12s̝9wuΜ3X}׿0<==c܇*UƆ5jpUרe˖} 4:uϽ0` "GɓEH"޽;;wL2X[[SZ5.\۷ocƍcoٲ[[[QbEt?S( io2aåR# -ı߰9~~uCy֬ATa&:9rkrMZ{v*PTt4mڔ3gzjr!k֬a̙4k̘C*RȌ3?~<Ç3-C%yfl™3gjj +++N89s>|xz!*Vkȑ#6mŇnݺ4 XvF/[t)rɓ|Ջ֭[SvmiР:u"88aÆ1m4N<3͚5#<<ڷoŋ /^6m`cJ*&(nảcG7_>LJ`͖R f;۠GPdBBB[L0"=%JF OOO)\\\F%K?6i_~~q?.\HX "?xܻL6M,XPaaa!V^/ yB[h4m6 "6m*ڵʕwBQpa`8e_EJcƌӧ={sss/۽{h4b֭qݪU+ѵkW2}v)4 B-ZhapSԮ][!lmmÇBшǏn:}ϟlٲN8!#ann.:d πDzbsb? ErM]„ZUB!f3! Q_TfĔSVM_; ͉nP(/##Fʕ ???MFRx!>tl!3%sBŊq?ڴI6m"ϝk\_NXX5jЗQJ.]dжRJkhZˋ_n`Yv-5jŅٳ3j(޽kЧK.?~ɣ/Zjklmmc+SǏ'اFcЇ NNN.]Z_+WXHb{*WK. [UAٳg'000*tr`^t _??Шp. B˵-y"R\}PQ(I[̋Q`X[[3p@h?z>*>֯'qJ!D n@ ׯ_si|}}4iyaҤI-[WWW8v;vd4l{{{V^͔)S#XujLhXbFCO|&=z0|p/^&@ԯBa Fw$;/nPw0\'ssSs~m+jś *:*筜r\cK}AtCP|4mdLjyY9$Ν;sP!Gɒ Qdd{c+ XXXpa}YXX'Od988PLfΜEVZcѣGqss￧B s֭+Qw5}O"! 
.̝;wl\r`Pv̙X$cQҦxWRD }Y}63fҥKx{{-BPP:D׬X3%&5`IK\h$4(Qʔ + E"(T(%ȼUoٳg{o5搊TƆ>}0tPŋꫯ{ ^ŪUѦrAYvAEr֮]י1c[lIShQ9w1"jժӧlur)/_εk3f .\00 !=K.8;;ӢE }#Zbذa4lWW}:u@AH:ƍ+)nnv_~ =nLEs1U(HLւDɒU7Ufؿ1Tآ}ZER+Çgܾ};EIAAA\~I&ѫW'OfFN*ơp~;}[wnʻwi'@<Nܚ&T({p7lH92#FU@Zl… ٺuk-[hѢGޢqF3?6IPdVͭY4_RuAU>6I[} UX7V($VQI?3ŋӲeK*T@Νܹ3˗UV+VqsH"EDDvvvq;v,۷lٲt E]Λ=Byp1mtpdkH#ZVi2B2$f٩Si/Gf ǎcԨQ~z6l@xx8G8::sHBP|]_QFY߆o&aR;W  ED[TXRw)HD7NY: B",Xz-5Od!um֑._xhK} $"+BȲDjm>rdFjNpp0˖-ѣG9BP(>P4 >.xs˳A 97BFK]i66d6RUy%]t… 9BP(>p密/*TъF=0mh(c-P(>Z3g`4%ӑ BP("gr'c2xl̓O>N܇Щd(Bb/ϝ˭ ) 'Oxxx`kkm۶ڵk1^tF={vܹ3Oc .DX[[ShQf͚ڷP(dbjbh[wu¥=6NӍڷBȘ,[V%횳gc/?~oOL2F>>ڵׯ_'jժ( dɒ\|Y_qF6mW>>-W4o?N߾} ׯo߾?LH]L"*BAcnjo ~cs⼊sXJM}n! Pd4.^(ys֨zrdf1:uυ4hЀz駟RT)_n}:SNe?s9l­[ҥ'7@)й`РA&NСC|8p땿wpA}ٱtR.]ә?>ӦM3͛7eΜ9VUVXYYq ̙ST(GqJ?{/ L„pq;ѡDDkʓ'+@psKo)Fr})?ߴi%Jo߾̞=(c]|~Qzu2F{+++}sssoKKK}_B@|ą,/Bp|"ɤ@L:"EpyMF=ڵ]>}:UTݻwd˖?GGG֬Y) !ؼy3,\6m뼼8pC ԯ_˗/KÆ 9p}# 2dk׮eСP-[e{rG'Ҹqdf "cbef즳,I=4ڬ\.- kJ+Ȱa8SUҥS&B|-lݚRkBjlYJaR0̌@߿Fs}'9<|O?GGG6lؠ_au]kk8ۥ sBŊq?LMs&[FCժU ʪVʵkBGfpssNosgΜVZz#6?_|+ ڵksaZڡSJ իYzA٫7X{ hF5kZV__BѢEd$jժ%lmm?k\r/"FyѢEEzh4bΝ9"4XbE @%.)<<< ʾ;!F-_\h4qY!ƍ} oooѲeKcQ!ft IDATdIѲem˗//:w,\]]B<{LXZZ::oݻ 1cDr ٳG}ݻF[nM5dϤBwh'ܧSO%ꚝ;!6HhK`kxE!7>ELx$ag.@W_S9J!\I*EMV!mIkq<6n҂60 V9}4SNe۷O jժŀwDDm۶_?8v9;v޽{s5u#G{RfϞCΝ; 2+Wzjf͚)P̘17nm6~GkO`` ڵϏk׮|r^o#._L퉈{yyj*<==ȑ#ŋgڵ2Er֮]י1c[lIׯOѢEܹs%Q(>m>"NqW9JJ8zTwSOzk@eBш>L,_}:v2HVXD-Y$wNT=zУG$ˣP(KZ;RgiFhј-GBD\B0ư@ډxqyfҥI+"8;ˣV+sof ?Oo Fra+V #G ʔ)?l! BHهc;%zy+Y}Z5V#[h才no/N[ + kr1,qD {']5lٳC.W.y|:ˈ$5ǎQO/_>mۨQժU#{tԉ3M BP3&šxrsʱߵIg^9yHPݺ):_tiC0*rб|{@/4ޥlI gJ!3FU@x! 
sHҠA}>OBP(2=WL34*܈vmk7ބ&dT8fLy$rN8w- (ųg+[XȣV m8h}enmB͚ l* VU^:شi$((͛]F޼y9BP(2OH3Νm[\&tr`Y|1.{,QZZF2y_oS(AY7CQ@n!#"4feIrDeˊU4inݚ 7P}Jp֭[gAP(ӧeBpvzK,4 ]u?9⚌?ЈЄ/6&JQ=m5{7mWd-F ^FzNW6ckl>p?jq0j"Epe.^= 1k,2;}\t)EP(YTB0c9"Ǎ֭kSSprJo)SD"veɌ=8]byxJ$xCi7#<\Ft;aaj.UgPAnJؓeFƍklFg)UUȞ=;-Z0p; Bzh40{6æMЬ%D1#f&f=Eis'*Ϋufph@;͚䖝܏^rR@))9dT@#BBbn.'/_Qq7ȓGZ|K\?]qeo1Ί+?snnn|t^ŋ/P(bP\`@0 ٲ%GiPJ*T#`辡l%-PС`lj w(o16(*ѷe}qXCfMxTǹs#8vNI382eCߗ/Ox,@ٳ'…ЧOZJ: 0v,,_qY@c/$,iQ5Gӧ5kO<?~?1T( 1x֕> g82JVBРA^s;og?,õgRo9CXp ERrعs7(<7~üQ]}6k 7dMm~-2̰ݞCuަѣ]{C._V ͛ӧ}<*haaA>}ӧ6m2 BH!!ұ{p(_A22Mjao/~}d",t>}x2s߈ЦRNmjGQ "y5kǬ\ՍFBV15>χ5 5 HBk"Do u+Ee9<{,^_bŊ3cP(P,] KKǏEϞ2Erj+'gކ!y2 u \sԇaQ}Qu.<`jք|ݲwwX*S\t&GA;VwsWj`F~56,XپÌ9r(y"2s.n7;BhǎQwwwngqww7 BHٳ w?hjQ,Z${Fd"l,lp*Guk-(Qoh֭:eYp,Oڻ~ 0l_-yjySM+ ժ%1z{ytW/Ĩ H~طo7fϞ=ܺu[n{n4i޽{߿1T( Er(QBnOp Bn2H&^֯rJ.Ůk7@ ] =u:. r;;gB&$3v,!3a56ynqaǎ]oe{+Txl0ҷo_F7棏>⣏>I&0f~H BrfΔ[J!f3}SȡMV56˗bpڹsLmFA8V}}L2=띜%rġCWc<&ֲ0zcr=V\Ʉ 0a+W޽{3) a8z4HٲɜE>VZm_QiǦ oɭ% x*ϞwלuoŮxZn|?,sa3 ʊr~tFcGUǤI#LlM]xptt򋱺̐ @( B!j(_^^=!OoyBЌՈ{ۓή\ĶŹsƓ1*ӧ qpHb!BCŐ!B|$@Uߜrj,D@}侼Y Md wbn-`g.3>[^N*B|U+xYGCrmܹ-aKv9Ȇo(*_3s4V/d#͔udȲ > kdeṸ v/cn@t+gd>pU@J2ˏt!x s';,,51u.ujd+W}~~ɓcoS6ZLuW$ Z@rFڡ)B>ða)+OD -̝ '&&R)\XZp S&8DDC_^:{?TCb,1ӧ0}\;dteBh {P̬t)ׅ3~wysΘל7\] tI,lvvP|i<$ervN?2 sG>ϙS* J r'oa_9ؘ1Pӻn!U05b(^Pk{R ?,3iլ)jcڵеke0h"cv*8:BN)c`2)ܸ!M';+VH+MB ۷jXhҢvBEDS>#䤽c+ׯrA|}i-'U@2;NlO5+sj0Qt֝ߎOuEhpRh+Qg 0P$L|udF[/]Jq3rBl\鼰~g.߿_z~K@:cqA*,SʵDQ$(%8CZ}UɓK! 
ʹ 'ßbKK9exDXY~ &LY^&* :tݻwDZ|+1 1qv\2:!rRrpCgA)\XdgU7o~tװWW8sFNtMO~4h W.\FzNg.of$WWZi60Aca $Y*|4>77sCrƐQC"FDFQmaՇ|]QGѪDCnTR@T$*/7(;ϋό20E˖afrN\26xp,_,'bI}d;}ί _ƍ w{V皳gі,0|xu/R%֣ɣY,c}I* ʱĘ!#<<fϖj}Gзo}hK[ 7,k/G˖ݬD7l.RMϺR\^ڧ%e㥯ԩ2bҵ aj*}S@_|($&&|̷n-bdhyj>=ʏ~v>@r]AקoB; ߣK+[պ =.=ۂs4>,Y3d%lmeV~֭vm`I޼qt(#͜)}ʔ!n#=t\iMgJ1tƍܺED^ 2N [Böyļ~Gfѝ&cj3b޼yqϝ;WXe򥍌A*2 1h\B !5F !ʔP!!jJ%򟳳}$DܑK)B̝QyTb؄n"7nR0.Nɕ޽[t_q~f7+Xr{],DBJ+bҺ~`'BWMVL]";lY!,"o?u?k*mQ\x&&B̟wjUym֬M`>B#̙1-oG o'J1;pMB1nGK+mjT H9|r/_.B⃠^=H cʌH_ˇ\v*]ZG% ''%-1t.we|͊{Փ~R\.-H_|SxT(x{S"Q%oֶ^WwǴ)gwU^+߻w}q(0)O7_@%4\ ~*N±5,iH\_GHl4+VFnJ~6l@H?uE7`*uuV7bEܴzu 2F˛ɾb69"Dl7('gς9=a&~$>b\xz֎ŋKvm˧1nݺ [[XSN m!ӜA*X +w 1{thwwKR&&BZ $gϖ+o 1eiv(hr9ʕ$yV99G|R!Z /'/AaAujN!"/qwB̚%?.Ľ{g?ش! !_ <iUiPI [kVb4iMFBTGe2D7{7NYSss}AR.| HȊaoV-^O4h`Xޣ˖%|1ܶ-Yvn}|IhW* LlR5J5JhB qcd7PP$77/qO#w$V,D5AP3>{B1'm_CmDK_!<=w/ΦO SĜΉ#ѹQ5m˖ q#GJELÅXZ !D~-guU7nJH}HWE`k-8'Bd1BѨQd]Rғ|ĹbE'r,w/>]Kۗ&.2ը Bܿ_tY F#4 IDAT~4HF{ E*+ 5oqI!֯ӧdȫW'w;q1rH6M0Qbhرc\Т'nk-[YPZ-Y"}&D4:TZBB]`D) :.]Ywo҂)SWfb| rO7C#|4nsr"Jvsrڿyi2!rYvL JFt۵ KܘYT LLLȝ;7sVʇBP(>,䄼paEO>vLn92MDGGehLVYr X`gP@luNǫ%["~ gw!9ÿg#c>ahmؐ(qj{]ԬV91 cƍ-ӥؠ8meF%GMMafmlNgtΜ!'){wGŹeSAvػ5G6;cĚQz Hf!$$ÇJl٨Z*WzP(p켶waF xd K'6`b|yԬ [s #.X }'_>i W4q#UTeؾaN whty)ƾ }0cԫF)R=[cf9s>~ycN L0hԹnN9xfMHGt$Ei%l`D o 4ٮD $ V E~53 WO/ WT$<6=lxae5+qe.US6FV"K+ 6ľ={,--޽;ǎ(BP(Mre hDNʓrqcq,l׫'<.޽@t)vA;l:]ax4t.I*Yf=}žx& #ul<\&?whٌ{!5Zv0f5`t_P-+Q~A.= }ƣ$OxJNK4˚hЧJ2yCS<;e?ԨiazzF泍HX:{㬍|^:޸!`k9kVi(]MYAG!T?e-.p_y oTEysG%16D!diߟEbeķHP(elyi IoxDt%jܸ{+VڱQY{߇{8 5}v~k4J8wެ|5wѷh5{ƏHh)S:?G8؇E8p_&B4\l(,{˽;׹V8o޽[-jԐ LժX[}У̛'KѐJ`_2wWr K09t!eјsDߒ\ Z |{c F7o&Ig̀yW|y^Yƛeܦ Xr7_Hب9c<,7Iákek"Kr]ك؜O Bq\YP=~,}/&$ڶ-~*fN铿#=Ifۛ7vTǏJ>+&-'PPQ7L,g٤iM3l'MAJoNu(8 V?[>ÝK}R{=P8qC sFQyFZ9 &N֤O5Jӷ[7!" 
-*\ʔgoxTΖ.Ȼv%~Yu'''Se.c#- nnrOrboT|^'߹`͢{fu"7|hr}]FDH]v5k&5ϓt'Y,X[[@ousBP(2ff [??=OnB ̞-5#"giiQ6 BGh+$|9'7/vvi8ټ5r CVfVYb9 3WL3ME #"'g g|n }ÛGx}2[y EyT$331?L5b1Dcb\#M#Wjtѵӟ15[Ըa_0w"A#u*'ę<@*[\x{w#6JsAP|;_qLѩ Q,@ޡ* zΝ?{9 CŽk嬃Q^5`Mzƻyyy s7C>ܰWٸ m}J ˍ'/JfU@@ NoߞWHBPwJ ȡCrHjS0doժ%UBdI9JffЫL(8T(ҬYdB#GKNbqwq;aX {OSmβ"=7a/ϱw ^z3E9=o! #\N6\WZ"DBB@B08ߗ눚1|ɟ0wdcB* ֊EP|OxUɫX.Xkm7~={).ȇnִDr3HDzg/,CnifeJ%?O.> ni.{2 Ȝ= 3°^Ό'9\:p7Tv%PpsϹwGK: C2gA2*߯Q'Xdm[];y׹o՝QGn]6oތ͹2epuuϟ/E׮]:&::Zj&l^}!ڷE !cV!O[B\7oʄ-h??0ksGw%C$ĪUΝB̛gM{z7n]Z3{ mM ւsUBH fuSm[O$DkĈ 9V^q"?}JBhaOnc?g a0+W/5{Z?^Jk^>x``г^ǚ5kбcG]gtqdo{U<XbShIX&R4S]8ǟ&L㋥Ū9-/,UW)yy,c?r| ;46,յILxP0)G!{x.%"cbڶv,٤(77潟=?lݚu* bȐ!>}:M+WGLJQ^}:Ur4 7UO=%رrqG1k^f:eKYvm8d%o{: ,sX9l̘"NWUn#RzJEӧryʰaN`_?ࣦ0ѯ= _: RA{a5F9  VD 8e>z3߅ rhL/ܟ 0 ?:ua„ X~=z-Ch9ww5" ۣ(rC а<z3ٌۖdK.ϢqQ4# z=p(Ygwxh^ɓVdMGUdݦfa %5~uo.,*U*5[\"GrhAdeqhcr3];vw E̓ {ŔAf>d 9..s] ?č7pٳEDDYJ"Ң/3$9'we9J,^ s4h9c[쉣#q׫J:]g g{c?DW_`Ƽ7ߘ^=j2@ q =sB-X`#l[p7^~~v65k Zadq͹serooӨyԟ粠9[j@nQWWh>!"""Gu:wQBX3t3[cү0yW7`:_bYD(;T&4H?bq1&*R0{|g=WjYq㊦GE޿%'o V^^VlϕADnmW/]|S+R-O=e::دZ5YΑ/ዕHyԨ!Ǐ7ҷn&VbizEe&MG#zcDVǍk:T;v?Pܸ[7._._AFڼG\~M~Wy/ò52bKzu8W7& ^yصm)]ʛ3|`(`fPGFHr15ض-s' Sƍed+9QZ܀d䢅2GW".\ g؟U`(==e C18,j7LK*+Һ9u|\1k:|9B 4˝bZ""bJ9ee*(lPד (:Dpd,NŠ+11 v0M*ÇLYcQeo*,o<&s$ʁµ\ _n90`r? ׮-ص<^x?}5C7 ;B˖yj(y0_^ߪc\^bBDDTz  ,Y"Z^@iBV3_\5h ; |too - ƸwFXHP!Ӱڵ5 j@Y0tIA7-77 o |t x|_ &Ln/p :Lf7~<ջvḟW 2 }Vkc=uL@0  @No>W@MG@t|Y̔C.GJ 2.DhF$FD_T׉W5%eEѮ}qf?/)yh?!ƽp0ͥZQv_?ʄCʹc..&|SRw/(d㜶9XLNI'>1˪A<:28iṱ[={`G;8BXιK=e0xUS֮Et9jvPY.\gX7%5m* Sk6|}t>~woӢ Ui?۷?`<"""*49ܓO{ʩJCK]}i_ijUJ5 Vr33ka N;d'j"\S0ziZCO{fMVMϸpVVG<{pu!LQ> @> DDDT̝+oTw:ܼ |>3gch`l ٳثWMKq]n8qW Z\fL} ӐtmU{0=&L "#9s)fzE624ևWT/͖…yTi42Pٲb3!!S2UHywn"o/,6v uwtmv9Z/Ow*V׋$\k.aҽ993Vɡ^SFY6V_WERc…E8+W#"PFL2krnW8a0W#5bqŰakbs5်ԩMMFE{͹U"?|+m Q.^.>\޾X GۍCd7`sxNS=;B&}dяKO>1γSmo. 
from __future__ import print_function

import numpy as np
import os
from os.path import abspath, dirname, join
import sys
import time

sys.path.insert(0, dirname(dirname(abspath(__file__))))

from noise_generator import noise_exponential, noise_cross_exponential
from multipletau import autocorrelate, correlate, correlate_numpy


def compare_corr():
    ## Starting parameters
    N = np.int(np.pi*1e3)
    countrate = 250. * 1e-3  # in Hz
    taudiff = 55.  # in us
    deltat = 2e-6  # time discretization [s]
    normalize = True

    # time factor
    taudiff *= deltat

    ##
    ## Autocorrelation
    ##
    print("Creating noise for autocorrelation")
    data = noise_exponential(N, taudiff, deltat=deltat)
    data -= np.average(data)
    if normalize:
        data += countrate
    # multipletau
    print("Performing autocorrelation (multipletau).")
    G = autocorrelate(data, deltat=deltat, normalize=normalize)
    # numpy.correlate for comparison
    if len(data) < 1e5:
        print("Performing autocorrelation (numpy).")
        Gd = correlate_numpy(data, data, deltat=deltat,
                             normalize=normalize)
    # Calculate the expected curve
    x = G[:, 0]
    amp = np.correlate(data - np.average(data), data - np.average(data),
                       mode="valid")
    if normalize:
        amp /= len(data) * countrate**2
    y = amp*np.exp(-x/taudiff)

    ##
    ## Cross-correlation
    ##
    print("Creating noise for cross-correlation")
    a, v = noise_cross_exponential(N, taudiff, deltat=deltat)
    a -= np.average(a)
    v -= np.average(v)
    if normalize:
        a += countrate
        v += countrate
    # multipletau
    Gccforw = correlate(a, v, deltat=deltat, normalize=normalize)
    Gccback = correlate(v, a, deltat=deltat, normalize=normalize)
    if len(a) < 1e5:
        print("Performing cross-correlation (numpy).")
        Gdccforw = correlate_numpy(a, v, deltat=deltat,
                                   normalize=normalize)
    # Calculate the expected curve
    xcc = Gccforw[:, 0]
    ampcc = np.correlate(a - np.average(a), v - np.average(v),
                         mode="valid")
    if normalize:
        ampcc /= len(a) * countrate**2
    ycc = ampcc*np.exp(-xcc/taudiff)

    ##
    ## Plotting
    ##
    # AC
    fig = plt.figure()
    fig.canvas.set_window_title('testing multipletau')
    ax = fig.add_subplot(2, 1, 1)
    ax.set_xscale('log')
    plt.plot(x, y, "g-", label="input model")
    plt.plot(G[:, 0], G[:, 1], "r-", label="autocorrelate")
    if len(data) < 1e5:
        plt.plot(Gd[:, 0], Gd[:, 1], "b--", label="correlate (numpy)")
    plt.xlabel("lag channel")
    plt.ylabel("autocorrelation")
    plt.legend(loc=0, fontsize='small')
    plt.ylim(-amp*.2, amp*1.2)

    ## CC
    ax = fig.add_subplot(2, 1, 2)
    ax.set_xscale('log')
    plt.plot(xcc, ycc, "g-", label="input model")
    plt.plot(Gccforw[:, 0], Gccforw[:, 1], "r-", label="forward")
    if len(data) < 1e5:
        plt.plot(Gdccforw[:, 0], Gdccforw[:, 1], "b--",
                 label="forward (numpy)")
    plt.plot(Gccback[:, 0], Gccback[:, 1], "r--", label="backward")
    plt.xlabel("lag channel")
    plt.ylabel("cross-correlation")
    plt.legend(loc=0, fontsize='small')
    plt.ylim(-ampcc*.2, ampcc*1.2)

    plt.tight_layout()

    savename = __file__[:-3]+".png"
    if os.path.exists(savename):
        savename = __file__[:-3]+time.strftime("_%Y-%m-%d_%H-%M-%S.png")
    plt.savefig(savename)
    print("Saved output to", savename)


if __name__ == '__main__':
    # move mpl import to main so travis automated doc build does not complain
    from matplotlib import pylab as plt
    compare_corr()
multipletau-0.1.5/examples/noise_generator.py000066400000000000000000000065701263310464700214530ustar00rootroot00000000000000
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
This module contains methods for correlated noise generation.
"""
from __future__ import division
from __future__ import print_function

import numpy as np

__all__ = ["noise_exponential", "noise_cross_exponential"]


def noise_exponential(N, tau=20, variance=1, deltat=1):
    """
    Generate exponentially correlated noise.

    Parameters
    ----------
    N : integer
        Total number of samples
    tau : float
        Correlation time of the exponential in `deltat`
    variance : float
        Variance of the noise
    deltat : float
        Bin size of output array, defines the time scale of `tau`

    Returns
    -------
    a : ndarray
        Exponentially correlated noise.
""" # time constant (inverse of correlationtime tau) g = deltat/tau # variance s0 = variance # normalization factor (memory of the trace) exp_g = np.exp(-g) one_exp_g = 1-exp_g z_norm_factor = np.sqrt(1-np.exp(-2*g))/one_exp_g # create random number array # generates random numbers in interval [0,1) randarray = np.random.random(N) # make numbers random in interval [-1,1) randarray = 2*(randarray-0.5) # simulate exponential random behavior a = np.zeros(N) a[0] = one_exp_g*randarray[0] b = 1* a for i in np.arange(N-1)+1: a[i] = exp_g*a[i-1] + one_exp_g*randarray[i] # Solving the equation iteratively leads to this equation: #j = np.arange(i) #a[i] = a[0]*exp_g**(i) + \ # one_exp_g)*np.sum(exp_g**(i-1-j)*randarray[1:i+1]) a = a * z_norm_factor*s0 return a def noise_cross_exponential(N, tau=20, variance=1, deltat=1): """ Generate exponentially cross-correlated noise. Parameters ---------- N : integer Total number of samples tau : float Correlation time of the exponential in `deltat` variance : float Variance of the noise deltat : float Bin size of output array, defines the time scale of `tau` Returns ------- a, randarray : ndarrays Array `a` has positive exponential correlation to the 'truly' random array `randarray`. 
""" # length of mean0 trace N_steps = N # time constant (inverse of correlationtime tau) g = deltat/tau # variance s0 = variance # normalization factor (memory of the trace) exp_g = np.exp(-g) one_exp_g = 1-exp_g z_norm_factor = np.sqrt(1-np.exp(-2*g))/one_exp_g # create random number array # generates random numbers in interval [0,1) randarray = np.random.random(N) # make numbers random in interval [-1,1) randarray = 2*(randarray-0.5) # simulate exponential random behavior a = np.zeros(N) a[0] = one_exp_g*randarray[0] b = np.zeros(N) b[0] = one_exp_g*randarray[0] # slow #for i in np.arange(N-1)+1: # for j in np.arange(i-1): # a[i] += exp_g**j*randarray[i-j] # a[i] += one_exp_g*randarray[i] # faster j = np.arange(N+5) for i in np.arange(N-1)+1: a[i] += np.sum(exp_g**j[2:i+1] * randarray[2:i+1][::-1]) a[i] += one_exp_g*randarray[i] a *= z_norm_factor*s0 randarray = randarray * z_norm_factor*s0 return a, randarray multipletau-0.1.5/multipletau/000077500000000000000000000000001263310464700164355ustar00rootroot00000000000000multipletau-0.1.5/multipletau/__init__.py000066400000000000000000000042251263310464700205510ustar00rootroot00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- u""" This package provides a multiple-τ algorithm for Python 2.7 and Python 3.x and requires the package :py:mod:`numpy`. Multipe-τ correlation is computed on a logarithmic scale (less data points are computed) and is thus much faster than conventional correlation on a linear scale such as :py:func:`numpy.correlate`. Recommended literature ---------------------- - Klaus Schaetzel and Rainer Peters; *Noise on multiple-tau photon correlation data*. Proc. SPIE 1430, Photon Correlation Spectroscopy: Multicomponent Systems, 109 (June 1, 1991); http://doi.org/10.1117/12.44160 - Thorsten Wohland, Rudolf Rigler, and Horst Vogel; *The Standard Deviation in Fluorescence Correlation Spectroscopy*. 
  Biophysical Journal, 80 (June 1, 2001);
  http://dx.doi.org/10.1016/S0006-3495(01)76264-9

Obtaining multipletau
---------------------
If you have Python and :py:mod:`numpy` installed, simply run

    pip install multipletau

The source code of multipletau is available at
https://github.com/FCS-analysis/multipletau.

Citing multipletau
------------------
The multipletau package should be cited like this (replace "x.x.x"
with the actual version of multipletau that you used and
"DD Month YYYY" with a matching date).

.. topic:: cite

    Paul Müller (2012) *Python multiple-tau algorithm* (Version x.x.x)
    [Computer program]. Available at
    https://pypi.python.org/pypi/multipletau/ (Accessed DD Month YYYY)

You can find out what version you are using by typing
(in a Python console):

    >>> import multipletau
    >>> multipletau.__version__
    '0.1.5'

Usage
-----
The package is straightforward to use. Here is a quick example:

    >>> import numpy as np
    >>> import multipletau
    >>> a = np.linspace(2,5,42)
    >>> v = np.linspace(1,6,42)
    >>> multipletau.correlate(a, v, m=2)
    array([[   1.        ,  549.87804878],
           [   2.        ,  530.37477692],
           [   4.        ,  491.85812017],
           [   8.        ,  386.39500297]])

"""
from ._multipletau import *

from ._version import version as __version__

__author__ = u"Paul Müller"
__license__ = "BSD (3 clause)"
multipletau-0.1.5/multipletau/_multipletau.py000077500000000000000000000361021263310464700215200ustar00rootroot00000000000000
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
A multiple-τ algorithm for Python 2.7 and 3.x.

Copyright (c) 2014 Paul Müller

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in
   the documentation and/or other materials provided with the
   distribution.

3. Neither the name of multipletau nor the names of its contributors
   may be used to endorse or promote products derived from this
   software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL INFRAE OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
"""
from __future__ import division

import numpy as np
import warnings

__all__ = ["autocorrelate", "correlate", "correlate_numpy"]


def autocorrelate(a, m=16, deltat=1, normalize=False, copy=True,
                  dtype=None):
    """
    Autocorrelation of a 1-dimensional sequence on a log2-scale.

    This computes the correlation according to
    :py:func:`numpy.correlate` for positive :math:`k` on a base 2
    logarithmic scale.

        numpy.correlate(a, a, mode="full")[len(a)-1:]

        :math:`z_k = \Sigma_n a_n a_{n+k}`

    Note that only the correlation in the positive direction is
    computed.

    Parameters
    ----------
    a : array_like
        input sequence of real numbers
    m : even integer
        defines the number of points on one level, must be an even
        integer
    deltat : float
        distance between bins
    normalize : bool
        normalize the result to the square of the average input
        signal and the factor `M-k`.
    copy : bool
        copy input array, set to False to save memory
    dtype : dtype, optional
        The type of the returned array and of the accumulator in which
        the elements are summed. By default, the dtype of `a` is used.

    Returns
    -------
    autocorrelation : ndarray
        Nx2 array containing lag time and autocorrelation

    Notes
    -----
    The algorithm computes the correlation with the convention of the
    curve decaying to zero.

    For experiments such as fluorescence correlation spectroscopy,
    the signal can be normalized to `M-k` by invoking:

        normalize = True

    For emulating the numpy.correlate behavior on a logarithmic scale
    (default behavior) use:

        normalize = False

    Examples
    --------
    >>> from numpy import dtype
    >>> from multipletau import autocorrelate
    >>> autocorrelate(range(42), m=2, dtype=dtype(float))
    array([[  1.00000000e+00,   2.29600000e+04],
           [  2.00000000e+00,   2.21000000e+04],
           [  4.00000000e+00,   2.03775000e+04],
           [  8.00000000e+00,   1.50612000e+04]])
    """
    traceavg = np.average(a)
    if normalize and traceavg == 0:
        raise ZeroDivisionError("Normalization not possible. The " +
                                "average of the input *binned_array* " +
                                "is zero.")

    trace = np.array(a, dtype=dtype, copy=copy)
    dtype = trace.dtype

    if dtype.kind in ["b", "i", "u"]:
        warnings.warn("Converting input data type ({}) to float.".
                      format(dtype))
        dtype = np.dtype(float)
        trace = np.array(a, dtype=dtype, copy=copy)

    # Complex data
    if dtype.kind == "c":
        raise NotImplementedError(
            "Please use `multipletau.correlate` for complex data.")

    # Check parameters
    if np.around(m / 2) != m / 2:
        mold = 1 * m
        m = int((np.around(m / 2) + 1) * 2)
        warnings.warn("Invalid value of m={}. Using m={} instead"
                      .format(mold, m))
    else:
        m = int(m)

    N = N0 = len(trace)

    # Find out the length of the correlation function.
    # The integer k defines how many times we can average over
    # two neighboring array elements in order to obtain an array of
    # length just larger than m.
    k = int(np.floor(np.log2(N / m)))

    # In the base2 multiple-tau scheme, the length of the correlation
    # array is (only taking into account values that are computed from
    # traces that are just larger than m):
    lenG = np.int(np.floor(m + k * m / 2))
    G = np.zeros((lenG, 2), dtype=dtype)
    normstat = np.zeros(lenG, dtype=dtype)
    normnump = np.zeros(lenG, dtype=dtype)

    # We use the fluctuation of the signal around the mean
    if normalize:
        trace -= traceavg

    if N < 2 * m:
        # Otherwise the following for-loop will fail:
        raise ValueError("len(binned_array) must be larger than 2m.")

    # Calculate autocorrelation function for first m bins
    # Discrete convolution of m elements
    for n in range(1, m + 1):
        G[n - 1, 0] = deltat * n
        # This is the computationally intensive step
        G[n - 1, 1] = np.sum(trace[:N - n] * trace[n:], dtype=dtype)
        normstat[n - 1] = N - n
        normnump[n - 1] = N

    # Now that we calculated the first m elements of G, let us
    # go on with the next m/2 elements.

    # Check if len(trace) is even:
    if N % 2 == 1:
        N -= 1
    # Add up every second element
    trace = (trace[:N:2] + trace[1:N + 1:2]) / 2
    N /= 2

    # Start iteration for each m/2 values
    for step in range(1, k + 1):
        # Get the next m/2 values via correlation of the trace
        for n in range(1, int(m / 2) + 1):
            idx = int(m + n - 1 + (step - 1) * m / 2)
            if len(trace[:N - (n + m / 2)]) == 0:
                # This is a shortcut that stops the iteration once the
                # length of the trace is too small to compute a
                # correlation. The actual length of the correlation
                # function does not only depend on k; we also must be
                # able to perform the sum with respect to k for all
                # elements. For small N, the sum over zero elements
                # would be computed here.
                #
                # One could make this for-loop go up to maxval, where
                #     maxval1 = int(m/2)
                #     maxval2 = int(N-m/2-1)
                #     maxval = min(maxval1, maxval2)
                # However, we then would also need to find out which
                # element in G is the last element...
                G = G[:idx - 1]
                normstat = normstat[:idx - 1]
                normnump = normnump[:idx - 1]
                # Note that this break only breaks out of the current
                # for loop. However, we are already in the last loop
                # of the step-for-loop. That is because we calculated
                # k in advance.
                break
            else:
                G[idx, 0] = deltat * (n + m // 2) * 2**step
                # This is the computationally intensive step
                G[idx, 1] = np.sum(trace[:N - (n + m // 2)] *
                                   trace[(n + m // 2):],
                                   dtype=dtype)
                normstat[idx] = N - (n + m // 2)
                normnump[idx] = N
        # Check if len(trace) is even:
        if N % 2 == 1:
            N -= 1
        # Add up every second element
        trace = (trace[:N:2] + trace[1:N + 1:2]) / 2
        N //= 2

    if normalize:
        G[:, 1] /= traceavg**2 * normstat
    else:
        G[:, 1] *= N0 / normnump

    return G


def correlate(a, v, m=16, deltat=1, normalize=False, copy=True, dtype=None):
    """
    Cross-correlation of two 1-dimensional sequences on a log2-scale.

    This computes the cross-correlation according to
    :py:func:`numpy.correlate` for positive :math:`k` on a base 2
    logarithmic scale.

        numpy.correlate(a, v, mode="full")[len(a)-1:]

    :math:`z_k = \Sigma_n a_{n+k} v_n^*`

    Note that only the correlation in the positive direction is
    computed.

    Parameters
    ----------
    a, v : array_like
        input sequences with equal length
    m : even integer
        defines the number of points on one level, must be an
        even integer
    deltat : float
        distance between bins
    normalize : bool
        normalize the result to the square of the average input
        signal and the factor `M-k`.
    copy : bool
        copy input array, set to False to save memory
    dtype : dtype, optional
        The type of the returned array and of the accumulator in which
        the elements are summed. By default, the dtype of `a` is used.

    Returns
    -------
    crosscorrelation : ndarray
        Nx2 array containing lag time and cross-correlation

    Notes
    -----
    The algorithm computes the correlation with the convention of the
    curve decaying to zero.

    For experiments like e.g.
    fluorescence correlation spectroscopy, the signal can be
    normalized to `M-k` by invoking:

        normalize = True

    For emulating the numpy.correlate behavior on a logarithmic
    scale (default behavior) use:

        normalize = False

    Examples
    --------
    >>> from numpy import dtype
    >>> from multipletau import correlate
    >>> correlate(range(42), range(1,43), m=2, dtype=dtype(float))
    array([[  1.00000000e+00,   2.38210000e+04],
           [  2.00000000e+00,   2.29600000e+04],
           [  4.00000000e+00,   2.12325000e+04],
           [  8.00000000e+00,   1.58508000e+04]])
    """
    # See `autocorrelate` for better documented code.
    traceavg1 = np.average(v)
    traceavg2 = np.average(a)
    if normalize and traceavg1 * traceavg2 == 0:
        raise ZeroDivisionError("Normalization not possible. The " +
                                "average of the input *binned_array* " +
                                "is zero.")

    trace1 = np.array(v, dtype=dtype, copy=copy)
    dtype = trace1.dtype

    if dtype.kind in ["b", "i", "u"]:
        warnings.warn(
            "Converting input data type ({}) to float.".format(dtype))
        dtype = np.dtype(float)
        trace1 = np.array(v, dtype=dtype, copy=copy)

    # Prevent traces from overwriting each other
    if a is v:
        # Force copying trace 2
        copy = True

    trace2 = np.array(a, dtype=dtype, copy=copy)

    # Complex data
    if dtype.kind == "c":
        trace1.imag *= -1

    # Check parameters
    if np.around(m / 2) != m / 2:
        mold = 1 * m
        m = int((np.around(m / 2) + 1) * 2)
        warnings.warn("Invalid value of m={}. Using m={} instead"
                      .format(mold, m))
    else:
        m = int(m)

    if len(a) != len(v):
        raise ValueError("Input arrays must be of equal length.")

    N = N0 = len(trace1)

    # Find out the length of the correlation function.
    # The integer k defines how many times we can average over
    # two neighboring array elements in order to obtain an array of
    # length just larger than m.
    k = int(np.floor(np.log2(N / m)))

    # In the base2 multiple-tau scheme, the length of the correlation
    # array is (only taking into account values that are computed from
    # traces that are just larger than m):
    lenG = int(np.floor(m + k * m / 2))

    G = np.zeros((lenG, 2), dtype=dtype)
    normstat = np.zeros(lenG, dtype=dtype)
    normnump = np.zeros(lenG, dtype=dtype)

    # We use the fluctuation of the signal around the mean
    if normalize:
        trace1 -= traceavg1
        trace2 -= traceavg2

    if N < 2 * m:
        # Otherwise the following for-loop will fail:
        raise ValueError("len(binned_array) must be larger than 2m.")

    # Calculate autocorrelation function for first m bins
    for n in range(1, m + 1):
        G[n - 1, 0] = deltat * n
        G[n - 1, 1] = np.sum(trace1[:N - n] * trace2[n:])
        normstat[n - 1] = N - n
        normnump[n - 1] = N

    # Check if len(trace) is even:
    if N % 2 == 1:
        N -= 1
    # Add up every second element
    trace1 = (trace1[:N:2] + trace1[1:N + 1:2]) / 2
    trace2 = (trace2[:N:2] + trace2[1:N + 1:2]) / 2
    N //= 2

    for step in range(1, k + 1):
        # Get the next m/2 values of the trace
        for n in range(1, m // 2 + 1):
            idx = int(m + n - 1 + (step - 1) * m / 2)
            if len(trace1[:N - (n + m // 2)]) == 0:
                # Abort
                G = G[:idx - 1]
                normstat = normstat[:idx - 1]
                normnump = normnump[:idx - 1]
                break
            else:
                G[idx, 0] = deltat * (n + m // 2) * 2**step
                G[idx, 1] = np.sum(trace1[:N - (n + m // 2)] *
                                   trace2[(n + m // 2):])
                normstat[idx] = N - (n + m // 2)
                normnump[idx] = N

        # Check if len(trace) is even:
        if N % 2 == 1:
            N -= 1
        # Add up every second element
        trace1 = (trace1[:N:2] + trace1[1:N + 1:2]) / 2
        trace2 = (trace2[:N:2] + trace2[1:N + 1:2]) / 2
        N //= 2

    if normalize:
        G[:, 1] /= traceavg1 * traceavg2 * normstat
    else:
        G[:, 1] *= N0 / normnump

    return G


def correlate_numpy(a, v, deltat=1, normalize=False, dtype=None, copy=True):
    """
    Convenience function that wraps around numpy.correlate and
    returns the data as multipletau.correlate does.
    Parameters
    ----------
    a, v : array_like
        input sequences
    deltat : float
        distance between bins
    normalize : bool
        normalize the result to the square of the average input signal
        and the factor (M-k). The resulting curve follows the
        convention of decaying to zero for large lag times.
    copy : bool
        copy input array, set to False to save memory
    dtype : dtype, optional
        The type of the returned array and of the accumulator in which
        the elements are summed. By default, the dtype of `a` is used.

    Returns
    -------
    crosscorrelation : ndarray
        Nx2 array containing lag time and cross-correlation
    """
    avg = np.average(a)
    vvg = np.average(v)

    if dtype is None:
        dtype = a.dtype

    if len(a) != len(v):
        raise ValueError("Arrays must be of same length.")

    ab = np.array(a, dtype=dtype, copy=copy)
    vb = np.array(v, dtype=dtype, copy=copy)

    Gd = np.correlate(ab - avg, vb - vvg, mode="full")[len(ab) - 1:]

    if normalize:
        N = len(Gd)
        m = N - np.arange(N)
        Gd /= m * avg * vvg

    G = np.zeros((len(Gd), 2))
    G[:, 1] = Gd
    G[:, 0] = np.arange(len(Gd)) * deltat

    return G

multipletau-0.1.5/multipletau/_version.py
#!/usr/bin/env python
version = "0.1.5"

multipletau-0.1.5/setup.cfg
[egg_info]
tag_build =
tag_date = 0
tag_svn_revision = 0

multipletau-0.1.5/setup.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# To create a distribution package for pip or easy-install:
# python setup.py sdist
from os.path import join, dirname, realpath
from setuptools import setup, find_packages, Command
import subprocess as sp
import sys
from warnings import warn


author = u"Paul Müller"
authors = [author]
description = 'A multiple-tau algorithm for Python/NumPy.'
name = 'multipletau'
year = "2013"

sys.path.insert(0, realpath(dirname(__file__)) + "/" + name)
try:
    from _version import version
except ImportError:
    version = "unknown"


class PyDocGitHub(Command):
    """ Upload the docs to GitHub gh-pages branch
    """
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        errno = sp.call([sys.executable, 'doc/commit_gh-pages.py'])
        raise SystemExit(errno)


class PyTest(Command):
    """ Perform pytests
    """
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        errno = sp.call([sys.executable, 'tests/runtests.py'])
        raise SystemExit(errno)


if __name__ == "__main__":
    setup(
        name=name,
        author=author,
        author_email='paul.mueller@biotec.tu-dresden.de',
        url='https://github.com/FCS-analysis/multipletau',
        version=version,
        packages=[name],
        package_dir={name: name},
        license="BSD (3 clause)",
        description=description,
        long_description=open(join(dirname(__file__), 'README.txt')).read(),
        install_requires=["NumPy >= 1.5.1"],
        keywords=["multiple", "tau", "FCS", "correlation", "spectroscopy",
                  "fluorescence"],
        extras_require={'doc': ['sphinx']},
        classifiers=[
            'Operating System :: OS Independent',
            'Programming Language :: Python :: 2.7',
            'Programming Language :: Python :: 3.2',
            'Programming Language :: Python :: 3.3',
            'Topic :: Scientific/Engineering :: Visualization',
            'Intended Audience :: Science/Research'
        ],
        platforms=['ALL'],
        cmdclass={'test': PyTest,
                  'commit_doc': PyDocGitHub,
                  },
    )

multipletau-0.1.5/tests/README.md

### Test Scripts

This will run all tests:

    python runtests.py

### Running single tests

Directly execute the scripts, e.g.
    python test_basic.py

multipletau-0.1.5/tests/test_basic.py
#!/usr/bin/python
# -*- coding: utf-8 -*-
""" basic tests also available in the function docs
"""
import numpy as np
from os.path import abspath, dirname
import sys

sys.path.insert(0, dirname(dirname(abspath(__file__))))
from multipletau import autocorrelate, correlate


def test_ac():
    ist = autocorrelate(range(42), m=2, dtype=np.dtype(float))
    soll = np.array([[1.00000000e+00, 2.29600000e+04],
                     [2.00000000e+00, 2.21000000e+04],
                     [4.00000000e+00, 2.03775000e+04],
                     [8.00000000e+00, 1.50612000e+04]])
    assert np.allclose(soll, ist)


def test_cc():
    ist = correlate(range(42), range(1, 43), m=2, dtype=np.dtype(float))
    soll = np.array([[1.00000000e+00, 2.38210000e+04],
                     [2.00000000e+00, 2.29600000e+04],
                     [4.00000000e+00, 2.12325000e+04],
                     [8.00000000e+00, 1.58508000e+04]])
    assert np.allclose(soll, ist)


if __name__ == "__main__":
    test_ac()
    test_cc()
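The `M-k` normalization that `correlate_numpy` performs can be illustrated with a self-contained sketch. This is a minimal re-statement of the same convention, assuming only NumPy; the helper name `naive_correlate` is made up for this example and is not part of the multipletau API:

```python
import numpy as np


def naive_correlate(a, v, deltat=1, normalize=False):
    """Linear-scale reference: correlate the mean-subtracted traces,
    then divide by (M - k) * <a> * <v> so the normalized curve decays
    to zero for large lag times (the convention used by multipletau).
    """
    a = np.asarray(a, dtype=float)
    v = np.asarray(v, dtype=float)
    avg, vvg = a.mean(), v.mean()
    # keep only non-negative lag times, as multipletau.correlate does
    Gd = np.correlate(a - avg, v - vvg, mode="full")[len(a) - 1:]
    if normalize:
        mk = len(Gd) - np.arange(len(Gd))  # overlap count M - k per lag
        Gd /= mk * avg * vvg
    G = np.zeros((len(Gd), 2))
    G[:, 0] = np.arange(len(Gd)) * deltat
    G[:, 1] = Gd
    return G


curve = naive_correlate(np.arange(1, 43), np.arange(1, 43), normalize=True)
```

With `normalize=True`, the zero-lag value of the autocorrelation equals the variance of the trace divided by the squared mean, which makes the amplitude comparable across signals of different intensity.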