pydl-0.7.0/PKG-INFO

Metadata-Version: 1.2
Name: pydl
Version: 0.7.0
Summary: Astropy affiliated package
Home-page: http://github.com/weaverba137/pydl
Author: Benjamin Alan Weaver
Author-email: baweaver@lbl.gov
License: BSD
Description:
    ====
    PyDL
    ====

    .. image:: http://img.shields.io/badge/powered%20by-AstroPy-orange.svg?style=flat
        :target: http://www.astropy.org
        :alt: Powered by Astropy Badge

    .. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.1095151.svg
        :target: https://doi.org/10.5281/zenodo.1095151
        :alt: DOI: 10.5281/zenodo.1095151

    .. image:: https://img.shields.io/pypi/v/pydl.svg
        :target: https://pypi.python.org/pypi/pydl
        :alt: PyPI Badge

    Description
    -----------

    This package consists of Python_ replacements for functions that are part
    of the `IDL®`_ built-in library or part of astronomical `IDL®`_ libraries.
    The emphasis is on reproducing results of the astronomical library
    functions.  Only the bare minimum of `IDL®`_ built-in functions is
    implemented to support this.

    There are four astronomical libraries targeted:

    * idlutils_ : a general suite of tools heavily used by SDSS_.
    * `Goddard utilities`_ : the `IDL®`_ Astronomy User's Library, maintained
      by Wayne Landsman and distributed with idlutils_.
    * idlspec2d_ : tools for working with SDSS_, BOSS_ and eBOSS_
      spectroscopic data.
    * photoop_ : tools for working with SDSS_ imaging data.

    This package is affiliated with the astropy_ project and is registered
    with PyPI_.

    Full Documentation
    ------------------

    Please visit `PyDL on Read the Docs`_

    .. image:: https://readthedocs.org/projects/pydl/badge/?version=latest
        :target: http://pydl.readthedocs.org/en/latest/
        :alt: Documentation Status

    History
    -------

    This package was initially developed on the SDSS-III_ `svn repository`_.
    It was moved to the new GitHub_ repository on 2013-03-06.  The present
    location of the repository is http://github.com/weaverba137/pydl .

    Travis Build Status
    -------------------

    .. image:: https://img.shields.io/travis/weaverba137/pydl.svg
        :target: https://travis-ci.org/weaverba137/pydl
        :alt: Travis Build Status

    Test Coverage Status
    --------------------

    .. image:: https://coveralls.io/repos/weaverba137/pydl/badge.svg?branch=master&service=github
        :target: https://coveralls.io/github/weaverba137/pydl?branch=master
        :alt: Test Coverage Status

    License
    -------

    .. image:: https://img.shields.io/pypi/l/pydl.svg
        :target: https://pypi.python.org/pypi/pydl
        :alt: License

    PyDL is free software licensed under a 3-clause BSD-style license.
    For details see the ``licenses/LICENSE.rst`` file.

    Legal
    -----

    * IDL is a registered trademark of `Harris Geospatial Solutions`_.

    .. _Python: http://python.org
    .. _`IDL®`: http://www.harrisgeospatial.com/SoftwareTechnology/IDL.aspx
    .. _idlutils: https://www.sdss.org/dr14/software/idlutils/
    .. _SDSS: https://www.sdss.org
    .. _`Goddard utilities`: http://idlastro.gsfc.nasa.gov/
    .. _idlspec2d: https://svn.sdss.org/public/repo/eboss/idlspec2d/trunk/
    .. _BOSS: https://www.sdss.org/surveys/boss/
    .. _eBOSS: https://www.sdss.org/surveys/eboss/
    .. _photoop: https://svn.sdss.org/public/repo/sdss/photoop/trunk/
    .. _astropy: http://www.astropy.org
    .. _PyPI: https://pypi.python.org/pypi/pydl/
    .. _`PyDL on Read the Docs`: https://pydl.readthedocs.io/en/latest/
    .. _SDSS-III: http://www.sdss3.org
    .. _`svn repository`: https://www.sdss.org/dr14/software/products/
    .. _GitHub: https://github.com
    ..
_`Harris Geospatial Solutions`: http://www.harrisgeospatial.com/ Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: Console Classifier: Intended Audience :: Science/Research Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Scientific/Engineering :: Physics Classifier: Topic :: Scientific/Engineering :: Astronomy Requires-Python: >=2.7 pydl-0.7.0/pydl/0000755000076500000240000000000013434104632014052 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/conftest.py0000644000076500000240000000505213273057371016263 0ustar weaverstaff00000000000000# This file is used to configure the behavior of pytest when using the Astropy # test infrastructure. from astropy.version import version as astropy_version if astropy_version < '3.0': # With older versions of Astropy, we actually need to import the pytest # plugins themselves in order to make them discoverable by pytest. from astropy.tests.pytest_plugins import * else: # As of Astropy 3.0, the pytest plugins provided by Astropy are # automatically made available when Astropy is installed. This means it's # not necessary to import them here, but we still need to import global # variables that are used for configuration. from astropy.tests.plugins.display import PYTEST_HEADER_MODULES, TESTED_VERSIONS from astropy.tests.helper import enable_deprecations_as_exceptions ## Uncomment the following line to treat all DeprecationWarnings as ## exceptions. For Astropy v2.0 or later, there are 2 additional keywords, ## as follow (although default should work for most cases). ## To ignore some packages that produce deprecation warnings on import ## (in addition to 'compiler', 'scipy', 'pygments', 'ipykernel', and ## 'setuptools'), add: ## modules_to_ignore_on_import=['module_1', 'module_2'] ## To ignore some specific deprecation warning messages for Python version ## MAJOR.MINOR or later, add: ## warnings_to_ignore_by_pyver={(MAJOR, MINOR): ['Message to ignore']} # enable_deprecations_as_exceptions() ## Uncomment and customize the following lines to add/remove entries from ## the list of packages for which version numbers are displayed when running ## the tests. Making it pass for KeyError is essential in some cases when ## the package uses other astropy affiliated packages. try: PYTEST_HEADER_MODULES['Astropy'] = 'astropy' PYTEST_HEADER_MODULES['PyDL'] = 'pydl' try: del PYTEST_HEADER_MODULES['h5py'] except KeyError: pass try: del PYTEST_HEADER_MODULES['Pandas'] except KeyError: pass except (NameError, KeyError): # NameError is needed to support Astropy < 1.0 pass ## Uncomment the following lines to display the version number of the ## package rather than the version number of Astropy in the top line when ## running the tests. 
import os ## This is to figure out the package version, rather than ## using Astropy's try: from .version import version except ImportError: version = 'dev' try: packagename = os.path.basename(os.path.dirname(__file__)) TESTED_VERSIONS[packagename] = version except NameError: # Needed to support Astropy <= 1.0.0 pass pydl-0.7.0/pydl/version.py0000644000076500000240000001625213434104630016115 0ustar weaverstaff00000000000000# Autogenerated by Astropy-affiliated package pydl's setup.py on 2019-02-22 23:45:28 UTC from __future__ import unicode_literals import datetime import locale import os import subprocess import warnings def _decode_stdio(stream): try: stdio_encoding = locale.getdefaultlocale()[1] or 'utf-8' except ValueError: stdio_encoding = 'utf-8' try: text = stream.decode(stdio_encoding) except UnicodeDecodeError: # Final fallback text = stream.decode('latin1') return text def update_git_devstr(version, path=None): """ Updates the git revision string if and only if the path is being imported directly from a git working copy. This ensures that the revision number in the version string is accurate. """ try: # Quick way to determine if we're in git or not - returns '' if not devstr = get_git_devstr(sha=True, show_warning=False, path=path) except OSError: return version if not devstr: # Probably not in git so just pass silently return version if 'dev' in version: # update to the current git revision version_base = version.split('.dev', 1)[0] devstr = get_git_devstr(sha=False, show_warning=False, path=path) return version_base + '.dev' + devstr else: # otherwise it's already the true/release version return version def get_git_devstr(sha=False, show_warning=True, path=None): """ Determines the number of revisions in this repository. Parameters ---------- sha : bool If True, the full SHA1 hash will be returned. Otherwise, the total count of commits in the repository will be used as a "revision number". show_warning : bool If True, issue a warning if git returns an error code, otherwise errors pass silently. path : str or None If a string, specifies the directory to look in to find the git repository. If `None`, the current working directory is used, and must be the root of the git repository. If given a filename it uses the directory containing that file. Returns ------- devversion : str Either a string with the revision number (if `sha` is False), the SHA1 hash of the current commit (if `sha` is True), or an empty string if git version info could not be identified. """ if path is None: path = os.getcwd() if not os.path.isdir(path): path = os.path.abspath(os.path.dirname(path)) if sha: # Faster for getting just the hash of HEAD cmd = ['rev-parse', 'HEAD'] else: cmd = ['rev-list', '--count', 'HEAD'] def run_git(cmd): try: p = subprocess.Popen(['git'] + cmd, cwd=path, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE) stdout, stderr = p.communicate() except OSError as e: if show_warning: warnings.warn('Error running git: ' + str(e)) return (None, b'', b'') if p.returncode == 128: if show_warning: warnings.warn('No git repository present at {0!r}! 
Using ' 'default dev version.'.format(path)) return (p.returncode, b'', b'') if p.returncode == 129: if show_warning: warnings.warn('Your git looks old (does it support {0}?); ' 'consider upgrading to v1.7.2 or ' 'later.'.format(cmd[0])) return (p.returncode, stdout, stderr) elif p.returncode != 0: if show_warning: warnings.warn('Git failed while determining revision ' 'count: {0}'.format(_decode_stdio(stderr))) return (p.returncode, stdout, stderr) return p.returncode, stdout, stderr returncode, stdout, stderr = run_git(cmd) if not sha and returncode == 128: # git returns 128 if the command is not run from within a git # repository tree. In this case, a warning is produced above but we # return the default dev version of '0'. return '0' elif not sha and returncode == 129: # git returns 129 if a command option failed to parse; in # particular this could happen in git versions older than 1.7.2 # where the --count option is not supported # Also use --abbrev-commit and --abbrev=0 to display the minimum # number of characters needed per-commit (rather than the full hash) cmd = ['rev-list', '--abbrev-commit', '--abbrev=0', 'HEAD'] returncode, stdout, stderr = run_git(cmd) # Fall back on the old method of getting all revisions and counting # the lines if returncode == 0: return str(stdout.count(b'\n')) else: return '' elif sha: return _decode_stdio(stdout)[:40] else: return _decode_stdio(stdout).strip() # This function is tested but it is only ever executed within a subprocess when # creating a fake package, so it doesn't get picked up by coverage metrics. def _get_repo_path(pathname, levels=None): # pragma: no cover """ Given a file or directory name, determine the root of the git repository this path is under. If given, this won't look any higher than ``levels`` (that is, if ``levels=0`` then the given path must be the root of the git repository and is returned if so. Returns `None` if the given path could not be determined to belong to a git repo. """ if os.path.isfile(pathname): current_dir = os.path.abspath(os.path.dirname(pathname)) elif os.path.isdir(pathname): current_dir = os.path.abspath(pathname) else: return None current_level = 0 while levels is None or current_level <= levels: if os.path.exists(os.path.join(current_dir, '.git')): return current_dir current_level += 1 if current_dir == os.path.dirname(current_dir): break current_dir = os.path.dirname(current_dir) return None _packagename = "pydl" _last_generated_version = "0.7.0" _last_githash = "8e37ea10aee408566a3f4812e3ab4f9f73d51fe3" # Determine where the source code for this module # lives. If __file__ is not a filesystem path then # it is assumed not to live in a git repo at all. 
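# Editor's note (illustration only; the paths below are hypothetical): for a
# checkout laid out as
#
#     /home/user/pydl/pydl/version.py    <- this file
#     /home/user/pydl/.git/              <- repository metadata
#
# _get_repo_path(__file__, levels=1) starts in the directory containing this
# file and walks up at most one level looking for a '.git' entry, returning
# '/home/user/pydl'.  In that case the branch below refreshes `version` and
# `githash` from git; outside a checkout the generated values are kept.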
if _get_repo_path(__file__, levels=len(_packagename.split('.'))): version = update_git_devstr(_last_generated_version, path=__file__) githash = get_git_devstr(sha=True, show_warning=False, path=__file__) or _last_githash else: # The file does not appear to live in a git repo so don't bother # invoking git version = _last_generated_version githash = _last_githash major = 0 minor = 7 bugfix = 0 release = True timestamp = datetime.datetime(2019, 2, 22, 23, 45, 28) debug = False astropy_helpers_version = "2.0.8" try: from ._compiler import compiler except ImportError: compiler = "unknown" try: from .cython_version import cython_version except ImportError: cython_version = "unknown" pydl-0.7.0/pydl/pcomp.py0000644000076500000240000000611513301650031015535 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.utils import lazyproperty class pcomp(object): """Replicates the IDL ``PCOMP()`` function. The attributes of this class are all read-only properties, implemented with :class:`~astropy.utils.decorators.lazyproperty`. Parameters ---------- x : array-like A 2-D array with :math:`N` rows and :math:`M` columns. standardize : :class:`bool`, optional If set to ``True``, the input data will have its mean subtracted off and will be scaled to unit variance. covariance : :class:`bool`, optional. If set to ``True``, the covariance matrix of the data will be used for the computation. Otherwise the correlation matrix will be used. Notes ----- References ---------- http://www.harrisgeospatial.com/docs/pcomp.html Examples -------- """ def __init__(self, x, standardize=False, covariance=False): from scipy.linalg import eigh if x.ndim != 2: raise ValueError('Input array must be two-dimensional') no, nv = x.shape self._nv = nv if standardize: xstd = x - np.tile(x.mean(0), no).reshape(x.shape) s = np.tile(xstd.std(0), no).reshape(x.shape) self._array = xstd/s self._xstd = xstd else: self._array = x self._xstd = None self._standardize = standardize if covariance: self._c = np.cov(self._array, rowvar=0) else: self._c = np.corrcoef(self._array, rowvar=0) self._covariance = covariance # # eigh is used for symmetric matrices # evals, evecs = eigh(self._c) # # Sort eigenvalues in descending order # ie = evals.argsort()[::-1] self._evals = evals[ie] self._evecs = evecs[:, ie] # # If necessary, add code to fix the signs of the eigenvectors. # http://www3.interscience.wiley.com/journal/117912150/abstract # return @lazyproperty def coefficients(self): """(:class:`~numpy.ndarray`) The principal components. These are the coefficients of `derived`. Basically, they are a re-scaling of the eigenvectors. """ return self._evecs * np.tile(np.sqrt(self._evals), self._nv).reshape( self._nv, self._nv) @lazyproperty def derived(self): """(:class:`~numpy.ndarray`) The derived variables. """ derived_data = np.dot(self._array, self.coefficients) if self._standardize: derived_data += self._xstd return derived_data @lazyproperty def variance(self): """(:class:`~numpy.ndarray`) The variances of each derived variable. """ return self._evals/self._c.trace() @lazyproperty def eigenvalues(self): """(:class:`~numpy.ndarray`) The eigenvalues. There is one eigenvalue for each principal component. 
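        A hedged usage sketch for the class as a whole (added here because the
        class-level ``Examples`` section above was left empty; the data are
        random and purely illustrative)::

            import numpy as np
            x = np.random.random((20, 4))
            pc = pcomp(x, standardize=True)
            pc.derived.shape       # (20, 4)
            pc.coefficients.shape  # (4, 4)
            pc.eigenvalues.size    # 4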
""" return self._evals pydl-0.7.0/pydl/pydlspec2d/0000755000076500000240000000000013434104632016123 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlspec2d/spec2d.py0000644000076500000240000004440613434104050017657 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the spec2d directory in idlspec2d. """ from warnings import warn import numpy as np from scipy.special import erf from astropy import log from astropy.io import ascii from astropy.utils.data import get_pkg_data_filename from . import Pydlspec2dException, Pydlspec2dUserWarning from .. import smooth from ..pydlutils.bspline import iterfit from ..pydlutils.image import djs_maskinterp from ..pydlutils.math import djs_median from ..pydlutils.sdss import sdss_flagval from ..pydlutils.trace import traceset2xy, xy2traceset from ..goddard.astro import vactoair def aesthetics(flux, invvar, method='traditional'): """Add nice values to a spectrum where it is masked. Parameters ---------- flux : :class:`numpy.ndarray` The spectrum to clean up. invvar : :class:`numpy.ndarray` Inverse variance of the spectrum. method : { 'traditional', 'noconst', 'mean', 'damp', 'nothing' }, optional Apply this method to clean up the spectrum. Default is 'traditional'. Returns ------- :class:`numpy.ndarray` A cleaned-up spectrum. """ badpts = invvar == 0 if badpts.any(): if method == 'traditional': newflux = djs_maskinterp(flux, invvar == 0, const=True) elif method == 'noconst': newflux = djs_maskinterp(flux, invvar == 0) elif method == 'mean': newflux = flux.copy() goodpts = invvar > 0 newflux[~goodpts] = newflux[goodpts].mean() elif method == 'damp': l = 250 # damping length in pixels goodpts = invvar.nonzero()[0] nflux = flux.size mingood = goodpts.min() maxgood = goodpts.max() newflux = djs_maskinterp(flux, invvar == 0, const=True) pixels = np.arange(nflux, dtype='f') if mingood > 0: damp1 = float(min(mingood, l)) newflux *= 0.5*(1.0+erf((pixels-mingood)/damp1)) if maxgood < (nflux - 1): damp2 = float(min(maxgood, l)) newflux *= 0.5*(1.0+erf((maxgood-pixels)/damp2)) elif method == 'nothing': newflux = flux.copy() else: raise Pydlspec2dException("Unknown method: {0}".format(method)) return newflux else: return flux def combine1fiber(inloglam, objflux, newloglam, objivar=None, verbose=False, **kwargs): """Combine several spectra of the same object, or resample a single spectrum. Parameters ---------- inloglam : :class:`numpy.ndarray` Vector of log wavelength. objflux : :class:`numpy.ndarray` Input flux. newloglam : :class:`numpy.ndarray` Output wavelength pixels, vector of log wavelength. objivar : :class:`numpy.ndarray`, optional Inverse variance of the flux. verbose : :class:`bool`, optional If ``True``, set log level to DEBUG. Returns ------- :func:`tuple` of :class:`numpy.ndarray` The resulting flux and inverse variance. Raises ------ :exc:`ValueError` If input dimensions don't match. """ # # Log # # log.enable_warnings_logging() if verbose: log.setLevel('DEBUG') # # Check that dimensions of inputs are valid. 
# npix = inloglam.size nfinalpix = len(newloglam) if objflux.shape != inloglam.shape: raise ValueError('Dimensions of inloglam and objflux do not agree.') if objivar is not None: if objivar.shape != inloglam.shape: raise ValueError('Dimensions of inloglam and objivar do not agree.') if 'finalmask' in kwargs: if kwargs['finalmask'].shape != inloglam.shape: raise ValueError('Dimensions of inloglam and finalmask do not agree.') if 'indisp' in kwargs: if kwargs['indisp'].shape != inloglam.shape: raise ValueError('Dimensions of inloglam and indisp do not agree.') # # Set defaults # EPS = np.finfo(np.float32).eps if 'binsz' in kwargs: binsz = kwargs['binsz'] else: if inloglam.ndim == 2: binsz = inloglam[0, 1] - inloglam[0, 0] else: binsz = inloglam[1] - inloglam[0] if 'nord' in kwargs: nord = kwargs['nord'] else: nord = 3 if 'bkptbin' in kwargs: bkptbin = kwargs['bkptbin'] else: bkptbin = 1.2 * binsz if 'maxsep' in kwargs: maxsep = kwargs['maxsep'] else: maxsep = 2.0 * binsz if inloglam.ndim == 1: # # Set specnum = 0 for all elements # nspec = 1 specnum = np.zeros(inloglam.shape, dtype=inloglam.dtype) else: nspec, ncol = inloglam.shape specnum = np.tile(np.arange(nspec), ncol).reshape(ncol, nspec).transpose() # # Use fullcombmask for modifying the pixel masks in the original input files. # fullcombmask = np.zeros(npix) newflux = np.zeros(nfinalpix, dtype=inloglam.dtype) newmask = np.zeros(nfinalpix, dtype='i4') newivar = np.zeros(nfinalpix, dtype=inloglam.dtype) newdisp = np.zeros(nfinalpix, dtype=inloglam.dtype) newsky = np.zeros(nfinalpix, dtype=inloglam.dtype) newdispweight = np.zeros(nfinalpix, dtype=inloglam.dtype) if objivar is None: nonzero = np.arange(npix, dtype='i4') ngood = npix else: nonzero = (objivar.ravel() > 0).nonzero()[0] ngood = nonzero.size # # ormask is needed to create andmask # andmask = np.zeros(nfinalpix, dtype='i4') ormask = np.zeros(nfinalpix, dtype='i4') if ngood == 0: # # In this case of no good points, set the nodata bit everywhere. # Also if noplug is set in the first input bit-mask, assume it # should be set everywhere in the output bit masks. No other bits # are set. # warn('No good points!', Pydlspec2dUserWarning) bitval = sdss_flagval('SPPIXMASK', 'NODATA') if 'finalmask' in kwargs: bitval |= (sdss_flagval('SPPIXMASK', 'NOPLUG') * (finalmask[0] & sdss_flagval('SPPIXMASK', 'NODATA'))) andmask = andmask | bitval ormask = ormask | bitval return (newflux, newivar) else: # # Now let's break sorted wavelengths into groups where pixel # separations are larger than maxsep. # inloglam_r = inloglam.ravel() isort = nonzero[inloglam_r[nonzero].argsort()] wavesort = inloglam_r[isort] padwave = np.insert(wavesort, 0, wavesort.min() - 2.0*maxsep) padwave = np.append(padwave, wavesort.max() + 2.0*maxsep) ig1 = ((padwave[1:ngood+1]-padwave[0:ngood]) > maxsep).nonzero()[0] ig2 = ((padwave[2:ngood+2]-padwave[1:ngood+1]) > maxsep).nonzero()[0] if ig1.size != ig2.size: raise ValueError('Grouping tricks did not work!') # # Avoid flux-dependent bias when combining multiple spectra. # This call to djs_median contains a width that is both floating-point # and even, which is very strange. # if objivar is not None and objivar.ndim > 1: saved_objivar = objivar for spec in range(nspec): igood = (objivar[spec, :] > 0).nonzero()[0] if igood.size > 0: # objivar[spec, igood] = djs_median(saved_objivar[spec, igood], width=100.) 
objivar[spec, igood] = djs_median(saved_objivar[spec, igood], width=101) else: saved_objivar = None for igrp in range(ig1.size): ss = isort[ig1[igrp]:ig2[igrp]+1] if ss.size > 2: if objivar is None: # # Fit without variance # sset, bmask = iterfit(inloglam_r[ss], objflux.ravel()[ss], nord=nord, groupbadpix=True, requiren=1, bkspace=bkptbin, silent=True) else: # # Fit with variance # sset, bmask = iterfit(inloglam_r[ss], objflux.ravel()[ss], invvar=objivar.ravel()[ss], nord=nord, groupbadpix=True, requiren=1, bkspace=bkptbin, silent=True) if np.sum(np.absolute(sset.coeff)) == 0: sset = None bmask = np.zeros(len(ss)) warn('All B-spline coefficients have been set to zero!', Pydlspec2dUserWarning) else: bmask = np.zeros(len(ss)) sset = None warn('Not enough data for B-spline fit!', Pydlspec2dUserWarning) inside = ((newloglam >= (inloglam_r[ss]).min()-EPS) & (newloglam <= (inloglam_r[ss]).max()+EPS)).nonzero()[0] # # It is possible for numinside to be zero, if the input data points # span an extremely small wavelength range, within which there are # no output wavelengths. # if sset is not None and len(inside) > 0: newflux[inside], bvalumask = sset.value(newloglam[inside]) if bvalumask.any(): newmask[inside[bvalumask]] = 1 log.debug('Masked {0:d} of {1:d} pixels.'.format((1-bmask).sum(), bmask.size)) # # Determine which pixels should be masked based upon the spline # fit. Set the combinerej bit. # ireplace = ~bmask if ireplace.any(): # # The following would replace the original flux values of # masked pixels with b-spline evaluations. # # objflux[ss[ireplace]] = sset.value(inloglam[ss[ireplace]]) # # Set the inverse variance of these pixels to zero. # if objivar is not None: objivar.ravel()[ss[ireplace]] = 0.0 log.debug('Replaced {0:d} pixels in objivar.'.format(len(ss[ireplace]))) if 'finalmask' in kwargs: finalmask[ss[ireplace]] = (finalmask[ss[ireplace]] | sdss_flagval('SPPIXMASK', 'COMBINEREJ')) fullcombmask[ss] = bmask # # Restore objivar # if saved_objivar is not None: objivar = saved_objivar * (objivar > 0) # # Combine inverse variance and pixel masks. # # Start with all bits set in andmask # andmask[:] = -1 for j in range(int(specnum.max())+1): these = (specnum.ravel() == j).nonzero()[0] if these.any(): inbetween = ((newloglam >= inloglam_r[these].min()) & (newloglam <= inloglam_r[these].max())) if inbetween.any(): jnbetween = inbetween.nonzero()[0] # # Conserve inverse variance by doing a linear interpolation # on that quantity. # result = np.interp(newloglam[jnbetween], inloglam_r[these], (objivar.ravel()[these] * fullcombmask[these])) # # Grow the fullcombmask below to reject any new sampling # containing even a partial masked pixel. # smask = np.interp(newloglam[jnbetween], inloglam_r[these], fullcombmask[these].astype(inloglam.dtype)) result *= smask >= (1.0 - EPS) newivar[jnbetween] += result*newmask[jnbetween] lowside = np.floor((inloglam_r[these]-newloglam[0])/binsz).astype('i4') highside = lowside + 1 if 'finalmask' in kwargs: andmask[lowside] &= finalmask[these] andmask[highside] &= finalmask[these] ormask[lowside] |= finalmask[these] ormask[highside] |= finalmask[these] # # Combine the dispersions + skies in the dumbest way possible # [sic]. 
# if 'indisp' in kwargs: newdispweight[jnbetween] += result newdisp[jnbetween] += (result * np.interp(newloglam[jnbetween], inloglam_r[these], indisp.ravel()[these])) newsky[jnbetween] += (result * np.interp(newloglam[jnbetween], inloglam_r[these], skyflux.ravel()[these])) if 'indisp' in kwargs: newdisp /= newdispweight + (newdispweight == 0) newsky /= newdispweight + (newdispweight == 0) # # Grow regions where 3 or more pixels are rejected together ??? # foo = smooth(newivar, 3) badregion = np.absolute(foo) < EPS # badregion = foo == 0.0 if badregion.any(): warn('Growing bad pixel region, {0:d} pixels found.'.format(badregion.sum()), Pydlspec2dUserWarning) ibad = badregion.nonzero()[0] lowerregion = np.where(ibad-2 < 0, 0, ibad-2) upperregion = np.where(ibad+2 > nfinalpix-1, nfinalpix-1, ibad+2) newivar[lowerregion] = 0.0 newivar[upperregion] = 0.0 # # Replace NaNs in combined spectra; this should really never happen. # inff = ((~np.isfinite(newflux)) | (~np.isfinite(newivar))) if inff.any(): warn('{0:d} NaNs in combined spectra.'.format(inff.sum()), Pydlspec2dUserWarning) newflux[inff] = 0.0 newivar[inff] = 0.0 # # Interpolate over masked pixels, just for aesthetic purposoes. # goodpts = newivar > 0 if 'aesthetics' in kwargs: amethod = kwargs['aesthetics'] else: amethod = 'traditional' newflux = aesthetics(newflux, newivar, method=amethod) # if 'interpolate' in kwargs: # newflux = pydlutils.image.djs_maskinterp(newflux,~goodpts,const=True) # else: # newflux[~goodpts] = newflux[goodpts].mean() if goodpts.any(): minglam = newloglam[goodpts].min() maxglam = newloglam[goodpts].max() ibad = ((newloglam < minglam) | (newloglam > maxglam)) if ibad.any(): ormask[ibad] |= sdss_flagval('SPPIXMASK', 'NODATA') andmask[ibad] |= sdss_flagval('SPPIXMASK', 'NODATA') # # Replace values of -1 in the andmask with 0. # andmask *= (andmask != -1) return (newflux, newivar) def filter_thru(flux, waveimg=None, wset=None, mask=None, filter_prefix='sdss_jun2001', toair=False): """Compute throughput in SDSS filters. Parameters ---------- flux : array-like Spectral flux. waveimg : array-like, optional Full wavelength solution, with the same shape as `flux`. wset : :class:`~pydl.pydlutils.trace.TraceSet`, optional A trace set containing the wavelength solution. Must be specified if `waveimg` is not specified. mask : array-like, optional Interpolate over pixels where `mask` is non-zero. filter_prefix : :class:`str`, optional Specifies a set of filter curves. toair : :class:`bool`, optional If ``True``, convert the wavelengths to air from vacuum before computing. Returns ------- array-like Integrated flux in the filter bands. Raises ------ :exc:`ValueError` If neither `waveimg` nor `wset` are set. 
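    Notes
    -----
    A hedged usage sketch (array names are illustrative): for `flux` with
    shape ``(nTrace, nx)`` and a matching wavelength image `waveimg`, the
    integrated ugriz fluxes can be obtained with::

        res = filter_thru(flux, waveimg=waveimg, toair=True)
        # res has shape (nTrace, 5), one column per SDSS filter.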
""" nTrace, nx = flux.shape if filter_prefix != 'sdss_jun2001': raise ValueError("Filters other than {0} are not available!".format('sdss_jun2001')) ffiles = [_get_pkg_filename_compat('data/filters/{0}_{1}_atm.dat'.format(filter_prefix, f), package='pydl.pydlutils') for f in 'ugriz'] if waveimg is None and wset is None: raise ValueError("Either waveimg or wset must be specified!") if waveimg is None: pixnorm, logwave = traceset2xy(wset) waveimg = 10**logwave if toair: newwaveimg = vactoair(waveimg) else: newwaveimg = waveimg logwave = np.log10(newwaveimg) diffx = np.outer(np.ones((nTrace,), dtype=flux.dtype), np.arange(nx-1, dtype=flux.dtype)) diffy = logwave[:, 1:] - logwave[:, 0:nx-1] diffset = xy2traceset(diffx, diffy, ncoeff=4, xmin=0, xmax=nx-1) pixnorm, logdiff = traceset2xy(diffset) logdiff = np.absolute(logdiff) if mask is not None: flux_interp = djs_maskinterp(flux, mask, iaxis=0) res = np.zeros((nTrace, len(ffiles)), dtype=flux.dtype) for i, f in enumerate(ffiles): filter_data = ascii.read(f, comment='#.*', names=('lam', 'respt', 'resbig', 'resnoa', 'xatm')) filtimg = logdiff * np.interp(newwaveimg.flatten(), filter_data['lam'].data, filter_data['respt'].data).reshape(logdiff.shape) if mask is not None: res[:, i] = (flux_interp * filtimg).sum(1) else: res[:, i] = (flux * filtimg).sum(1) sumfilt = filtimg.sum(1) res[:, i] = res[:, i] / (sumfilt + (sumfilt <= 0).astype(sumfilt.dtype)) return res def _get_pkg_filename_compat(filename, package): """Astropy 1.0.x/LTS does not accept the 'package' argument. """ try: f = get_pkg_data_filename(filename, package=package) except TypeError: from pkg_resources import resource_filename f = resource_filename(package, filename) return f pydl-0.7.0/pydl/pydlspec2d/spec1d.py0000644000076500000240000022020413434104050017646 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the spec1d directory in idlspec2d. """ import glob import os import time from warnings import warn import numpy as np from numpy.linalg import solve import matplotlib matplotlib.use('Agg') matplotlib.rcParams['figure.figsize'] = (16.0, 12.0) import matplotlib.pyplot as plt from matplotlib.font_manager import fontManager, FontProperties from astropy import log from astropy.io import ascii, fits from six import integer_types from . import Pydlspec2dException, Pydlspec2dUserWarning # # Used by findspec # findspec_cache = None class HMF(object): """Class used to manage data for Heteroscedastic Matrix Factorization (HMF). This is a replacement for :func:`~pydl.pydlspec2d.spec1d.pca_solve`. It can be called with:: hmf = HMF(spectra, invvar) output = hmf.solve() The input spectra should be pre-processed through :func:`~pydl.pydlspec2d.spec2d.combine1fiber`. Parameters ---------- spectra : array-like The input spectral flux, assumed to have a common wavelength and redshift system. invvar : array-like The inverse variance of the spectral flux. K : :class:`int`, optional The number of dimensions of the factorization (default 4). n_iter : :class:`int`, optional Number of iterations. seed : :class:`int`, optional. If set, pass this value to :func:`numpy.random.seed`. nonnegative : :class:`bool`, optional Set this to ``True`` to perform nonnegative HMF. epsilon : :class:`float`, optional Regularization parameter. Set to any non-negative float value to turn it on. verbose : :class:`bool`, optional If ``True``, print extra information. 
Notes ----- See [1]_ and [2]_ for the original derivation of this method. The HMF iteration is initialized using :func:`~scipy.cluster.vq.kmeans`, which itself uses random numbers to initialize its state. If you need to ensure reproducibility, call :func:`numpy.random.seed` before initializing HMF. The current algorithm cannot handle input data that contain *columns* of zeros. Columns of this type need to be *carefully* removed from the input data. This could also result in the output data having a different size compared to the input data. References ---------- .. [1] `Tsalmantza, P., Decarli, R., Dotti, M., Hogg, D. W., 2011 ApJ 738, 20 `_ .. [2] `Tsalmantza, P., Hogg, D. W., 2012 ApJ 753, 122 `_ """ def __init__(self, spectra, invvar, K=4, n_iter=None, seed=None, nonnegative=False, epsilon=None, verbose=False): self.spectra = spectra self.invvar = invvar self.K = K if n_iter is None: if nonnegative: self.n_iter = 2048 else: self.n_iter = 20 else: self.n_iter = int(n_iter) self.seed = seed self.nonnegative = nonnegative self.epsilon = epsilon self.verbose = verbose self.a = None self.g = None if self.verbose: log.setLevel('DEBUG') return def solve(self): """Process the inputs. Returns ------- :class:`dict` The HMF solution. """ if len(self.spectra.shape) == 1: nobj = 1 npix = self.spectra.shape[0] else: nobj, npix = self.spectra.shape log.info("Building HMF from %d object spectra.", nobj) fluxdict = dict() # # If there is only one object spectrum, then all we can do is return it. # if nobj == 1: fluxdict['flux'] = self.spectra.astype('f') return fluxdict a, g = self.iterate() fluxdict['acoeff'] = a fluxdict['flux'] = g return fluxdict def model(self): """Compute the model. """ return np.dot(self.a, self.g) def resid(self): """Compute residuals. """ return self.spectra - self.model() def chi(self): """Compute :math:`\chi`, the scaled residual. """ return self.resid() * np.sqrt(self.invvar) def penalty(self): """Compute penalty for non-smoothness. """ if self.epsilon is None: return 0.0 return self.epsilon * np.sum(np.diff(self.g)**2) def badness(self): """Compute :math:`\chi^2`, including possible non-smoothness penalty. """ return np.sum(self.chi()**2) + self.penalty() def normbase(self): """Apply standard component normalization. """ return np.sqrt((self.g**2).mean(1)) def astep(self): """Update for coefficients at fixed component spectra. """ N, M = self.spectra.shape K, M = self.g.shape a = np.zeros((N, K), dtype=self.g.dtype) for i in range(N): Gi = np.zeros((K, K), dtype=self.g.dtype) for k in range(K): for kp in range(k, K): Gi[k, kp] = np.sum(self.g[k, :] * self.g[kp, :] * self.invvar[i, :]) if kp > k: Gi[kp, k] = Gi[k, kp] Fi = np.dot(self.g, self.spectra[i, :]*self.invvar[i, :]) a[i, :] = solve(Gi, Fi) return a def gstep(self): """Update for component spectra at fixed coefficients. 
""" N, M = self.spectra.shape N, K = self.a.shape g = np.zeros((K, M), dtype=self.a.dtype) e = np.zeros(self.g.shape, dtype=self.g.dtype) d = np.zeros((K, K, M), dtype=self.a.dtype) if self.epsilon is not None and self.epsilon > 0: foo = self.epsilon * np.eye(K, dtype=self.a.dtype) for l in range(M): d[:, :, l] = foo if l > 0 and l < M-1: d[:, :, l] *= 2 # d[:, :, 0] = foo # d[:, :, 1:M-1] = 2*foo # d[:, :, M-1] = foo e[:, 0] = self.epsilon*self.g[:, 1] e[:, 1:M-1] = self.epsilon*(self.g[:, 0:M-2] + self.g[:, 2:M]) e[:, M-1] = self.epsilon*self.g[:, M-2] for j in range(M): Aj = np.zeros((K, K), dtype=self.a.dtype) for k in range(K): for kp in range(k, K): Aj[k, kp] = np.sum(self.a[:, k] * self.a[:, kp] * self.invvar[:, j]) if kp > k: Aj[kp, k] = Aj[k, kp] Aj += d[:, :, j] Fj = (np.dot(self.a.T, self.spectra[:, j]*self.invvar[:, j]) + e[:, j]) g[:, j] = solve(Aj, Fj) return g def astepnn(self): """Non-negative update for coefficients at fixed component spectra. """ numerator = np.dot(self.spectra*self.invvar, self.g.T) denominator = np.dot(np.dot(self.a, self.g)*self.invvar, self.g.T) return self.a*(numerator/denominator) def gstepnn(self): """Non-negative update for component spectra at fixed coefficients. """ K, M = self.g.shape numerator = np.dot(self.a.T, (self.spectra*self.invvar)) if self.epsilon is not None and self.epsilon > 0: e = np.zeros(self.g.shape, dtype=self.g.dtype) e[:, 0] = self.epsilon*self.g[:, 1] e[:, 1:M-1] = self.epsilon*(self.g[:, 0:M-2] + self.g[:, 2:M]) e[:, M-1] = self.epsilon*self.g[:, M-2] numerator += e denominator = np.dot(self.a.T, np.dot(self.a, self.g)*self.invvar) if self.epsilon is not None and self.epsilon > 0: d = self.epsilon*self.g.copy() d[:, 1:M-1] *= 2 denominator += d return self.g*(numerator/denominator) def reorder(self): """Reorder and rotate basis analogous to PCA. """ from numpy.linalg import eigh l, U = eigh(np.dot(self.a.T, self.a)) return (np.dot(self.a, U), np.dot(U.T, self.g)) def iterate(self): """Handle the HMF iteration. Returns ------- :func:`tuple` of :class:`numpy.ndarray` The fitting coefficients and fitted functions, respectively. """ from scipy.cluster.vq import kmeans, whiten from ..pydlutils.math import find_contiguous N, M = self.spectra.shape # # Make spectra non-negative # if self.nonnegative: self.spectra[self.spectra < 0] = 0 self.invvar[self.spectra < 0] = 0 # # Detect and fix very bad columns. 
# si = self.spectra * self.invvar if (self.spectra.sum(0) == 0).any(): log.warn("Columns of zeros detected in spectra!") if (self.invvar.sum(0) == 0).any(): log.warn("Columns of zeros detected in invvar!") if (si.sum(0) == 0).any(): log.warn("Columns of zeros detected in spectra*invvar!") zerocol = ((self.spectra.sum(0) == 0) | (self.invvar.sum(0) == 0) | (si.sum(0) == 0)) n_zero = zerocol.sum() if n_zero > 0: log.warn("Found %d bad columns in input data!", n_zero) # # Find the largest set of contiguous pixels # goodcol = find_contiguous(~zerocol) self.spectra = self.spectra[:, goodcol] self.invvar = self.invvar[:, goodcol] # si = si[:, goodcol] # newloglam = fullloglam[goodcol] # # Initialize g matrix with kmeans # if self.seed is not None: np.random.seed(self.seed) whitespectra = whiten(self.spectra) log.debug(whitespectra[0:3, 0:3]) self.g, foo = kmeans(whitespectra, self.K) self.g /= np.repeat(self.normbase(), M).reshape(self.g.shape) log.debug(self.g[0:3, 0:3]) # # Initialize a matrix # self.a = np.outer(np.sqrt((self.spectra**2).mean(1)), np.repeat(1.0/self.K, self.K)) if self.nonnegative: for k in range(128): self.a = self.astepnn() # # Iterate! # t0 = time.time() for m in range(self.n_iter): log.info("Starting iteration #%4d.", m+1) if self.nonnegative: self.a = self.astepnn() self.g = self.gstepnn() else: self.a = self.astep() self.g = self.gstep() self.a, self.g = self.reorder() norm = self.normbase() self.g /= np.repeat(norm, M).reshape(self.g.shape) self.a = (self.a.T*np.repeat(norm, N).reshape(self.K, N)).T log.debug(self.a[0:3, 0:3]) log.debug(self.g[0:3, 0:3]) log.debug("Chi**2 after iteration #%4d = %f.", m+1, self.badness()) log.info("The elapsed time for iteration #%4d is %6.2f s.", m+1, time.time()-t0) return (self.a, self.g) def findspec(*args, **kwargs): """Find SDSS/BOSS spectra that match a given RA, Dec. Parameters ---------- ra, dec : array-like, optional If set, the first two positional arguments will be interpreted as RA, Dec. best : :class:`bool`, optional If set, return only the best match for each input RA, Dec. infile : :class:`str`, optional If set, read RA, Dec data from this file. outfile : :class:`str`, optional If set, print match data to this file. print : :class:`bool`, optional If set, print the match data to the console. run1d : :class:`str`, optional Override the value of :envvar:`RUN1D`. run2d : :class:`str`, optional Override the value of :envvar:`RUN2D`. sdss : :class:`bool`, optional If set, search for SDSS-I/II spectra instead of BOSS spectra. searchrad : :class:`float`, optional Search for spectra in this radius around given RA, Dec. Default is 3 arcsec. topdir : :class:`str`, optional If set, override the value of :envvar:`SPECTRO_REDUX` or :envvar:`BOSS_SPECTRO_REDUX`. Returns ------- :class:`dict` A dictionary containing plate, MJD, fiber, etc. """ from .. 
import uniq from ..pydlutils.misc import struct_print from ..pydlutils.spheregroup import spherematch global findspec_cache # # Set up default values # if 'sdss' in kwargs: if 'topdir' in kwargs: topdir = kwargs['topdir'] else: topdir = os.environ['SPECTRO_REDUX'] if 'run2d' in kwargs: run2d = str(kwargs['run2d']) else: run2d = '26' run1d = '' else: if 'topdir' in kwargs: topdir = kwargs['topdir'] else: topdir = os.environ['BOSS_SPECTRO_REDUX'] if 'run2d' in kwargs: run2d = str(kwargs['run2d']) else: run2d = os.environ['RUN2D'] if 'run1d' in kwargs: run1d = str(kwargs['run1d']) else: run1d = os.environ['RUN1D'] if findspec_cache is None: findspec_cache = {'lasttopdir': topdir, 'plist': None} if (findspec_cache['plist'] is None or topdir != findspec_cache['lasttopdir']): findspec_cache['lasttopdir'] = topdir platelist_file = os.path.join(topdir, "platelist.fits") plates_files = glob.glob(os.path.join(topdir, "plates-*.fits")) plist = None if os.path.exists(platelist_file): platelist = fits.open(platelist_file) plist = platelist[1].data platelist.close() if len(plates_files) > 0: plates = fits.open(plates_files[0]) plist = plates[1].data plates.close() if plist is None: raise Pydlspec2dException("Plate list (platelist.fits or plates-*.fits) not found in {0}.".format(topdir)) else: findspec_cache['plist'] = plist qdone = plist.field('STATUS1D') == 'Done' qdone2d = plist.field('RUN2D').strip() == run2d if run1d == '': qdone1d = np.ones(plist.size, dtype='bool') else: qdone1d = plist.field('RUN1D').strip() == run1d qfinal = qdone & qdone2d & qdone1d if not qfinal.any(): warn("No reduced plates!", Pydlspec2dUserWarning) return None idone = np.arange(plist.size)[qfinal] # # If there are positional arguments, interpret these as RA, Dec # if len(args) == 2: ra = args[0] dec = args[1] # # Read RA, Dec from infile if set # if 'infile' in kwargs: infile_data = ascii.read(kwargs['infile'], names=['ra', 'dec']) ra = infile_data["ra"].data dec = infile_data["dec"].data if 'searchrad' in kwargs: searchrad = float(kwargs['searchrad']) else: searchrad = 3.0/3600.0 # # Create output structure # slist_type = np.dtype([('PLATE', 'i4'), ('MJD', 'i4'), ('FIBERID', 'i4'), ('RA', 'f8'), ('DEC', 'f8'), ('MATCHRAD', 'f8')]) # # Match all plates with objects # imatch1, itmp, dist12 = spherematch(ra, dec, plist[qfinal].field('RACEN'), plist[qfinal].field('DECCEN'), searchrad+1.55, maxmatch=0) if imatch1.size == 0: warn("No matching plates found.", Pydlspec2dUserWarning) return None imatch2 = idone[itmp] # # Read all relevant plates # try: n_total = plist.field('N_TOTAL') except KeyError: n_total = np.zeros(plist.size, dtype='i4') + 640 iplate = imatch2[uniq(imatch2, imatch2.argsort())] i0 = 0 plugmap = np.zeros(n_total[iplate].sum(), dtype=[('PLATE', 'i4'), ('MJD', 'i4'), ('FIBERID', 'i4'), ('RA', 'd'), ('DEC', 'd')]) for i in range(iplate.size): spplate = readspec(plist[iplate[i]].field('PLATE'), mjd=plist[iplate[i]].field('MJD'), topdir=topdir, run2d=run2d, run1d=run1d) index_to = i0 + np.arange(n_total[iplate[i]], dtype='i4') plugmap['PLATE'][index_to] = plist[iplate[i]].field('PLATE') plugmap['MJD'][index_to] = plist[iplate[i]].field('MJD') plugmap['FIBERID'][index_to] = spplate['plugmap']['FIBERID'] plugmap['RA'][index_to] = spplate['plugmap']['RA'] plugmap['DEC'][index_to] = spplate['plugmap']['DEC'] i0 += n_total[iplate[i]] i1, i2, d12 = spherematch(ra, dec, plugmap['RA'], plugmap['DEC'], searchrad, maxmatch=0) if i1.size == 0: warn('No matching objects found.', Pydlspec2dUserWarning) return None if 'best' in 
kwargs: # # Return only best match per object # slist = np.zeros(ra.size, dtype=slist_type) spplate = readspec(plugmap[i2]['PLATE'], plugmap[i2]['FIBERID'], mjd=plugmap[i2]['MJD'], topdir=topdir, run2d=run2d, run1d=run1d) sn = spplate['zans']['SN_MEDIAN'] isort = (i1 + np.where(sn > 0, sn, 0)/(sn+1.0).max()).argsort() i1 = i1[isort] i2 = i2[isort] d12 = d12[isort] iuniq = uniq(i1) slist[i1[iuniq]]['PLATE'] = plugmap[i2[iuniq]]['PLATE'] slist[i1[iuniq]]['MJD'] = plugmap[i2[iuniq]]['MJD'] slist[i1[iuniq]]['FIBERID'] = plugmap[i2[iuniq]]['FIBERID'] slist[i1[iuniq]]['RA'] = plugmap[i2[iuniq]]['RA'] slist[i1[iuniq]]['DEC'] = plugmap[i2[iuniq]]['DEC'] slist[i1[iuniq]]['MATCHRAD'] = d12[iuniq] else: # # Return all matches # slist = np.zeros(i1.size, dtype=slist_type) slist['PLATE'] = plugmap[i2]['PLATE'] slist['MJD'] = plugmap[i2]['MJD'] slist['FIBERID'] = plugmap[i2]['FIBERID'] slist['RA'] = plugmap[i2]['RA'] slist['DEC'] = plugmap[i2]['DEC'] slist['MATCHRAD'] = d12 # # Print to terminal or output file # if 'print' in kwargs: foo = struct_print(slist) if 'outfile' in kwargs: foo = struct_print(slist, filename=outfile) return slist def latest_mjd(plate, **kwargs): """Find the most recent MJD associated with a plate. Parameters ---------- plate : :class:`int` or :class:`numpy.ndarray` The plate(s) to examine. Returns ------- :class:`numpy.ndarray` An array of MJD values for each plate. """ import re if isinstance(plate, integer_types) or plate.shape == (): platevec = np.array([plate], dtype='i4') else: platevec = plate mjd = np.zeros(len(platevec), dtype='i4') mjdre = re.compile(r'spPlate-[0-9]{4}-([0-9]{5}).fits') unique_plates = np.unique(platevec) paths = spec_path(unique_plates, **kwargs) for p, q in zip(paths, unique_plates): plateglob = "{0}/spPlate-{1:04d}-*.fits".format(p, q) bigmjd = 0 for f in glob.glob(plateglob): thismjd = int(mjdre.search(f).groups()[0]) if thismjd > bigmjd: bigmjd = thismjd mjd[platevec == q] = bigmjd return mjd def number_of_fibers(plate, **kwargs): """Returns the total number of fibers per plate. Parameters ---------- plate : :class:`int` or :class:`numpy.ndarray` The plate(s) to examine. Returns ------- :class:`numpy.ndarray` The number of fibers on each plate. """ # # Get mjd values # if isinstance(plate, integer_types) or plate.shape == (): platevec = np.array([plate], dtype='i4') else: platevec = plate mjd = latest_mjd(plate, **kwargs) nfiber = np.zeros(mjd.size, dtype='i4') # # SDSS-I,II plates # nfiber[mjd < 55025] = 640 # # Short circuit if we're done. # if (nfiber == 640).all(): return nfiber # # Not all BOSS plates have 1000 fibers # if 'path' in kwargs: platelistpath = os.path.join(kwargs['path'], 'platelist.fits') else: platelistpath = os.path.join(os.environ['BOSS_SPECTRO_REDUX'], 'platelist.fits') platelist = fits.open(platelistpath) platentotal = platelist[1].data.field('N_TOTAL') plateplate = platelist[1].data.field('PLATE') platemjd = platelist[1].data.field('MJD') platerun2d = platelist[1].data.field('RUN2D') platerun1d = platelist[1].data.field('RUN1D') platelist.close() if 'run2d' in kwargs: run2d = kwargs['run2d'] else: run2d = os.environ['RUN2D'] if 'run1d' in kwargs: run1d = kwargs['run1d'] else: run1d = os.environ['RUN1D'] for k in range(mjd.size): nfiber[k] = platentotal[(plateplate == platevec[k]) & (platemjd == mjd[k]) & (platerun2d == run2d) & (platerun1d == run1d)] return nfiber def pca_solve(newflux, newivar, maxiter=0, niter=10, nkeep=3, nreturn=None, verbose=False): """Replacement for idlspec2d pca_solve.pro. 
Parameters ---------- newflux : array-like The input spectral flux, assumed to have a common wavelength and redshift system. newivar : array-like The inverse variance of the spectral flux. maxiter : :class:`int`, optional Stop PCA+reject iterations after this number. niter : :class:`int`, optional Stop PCA iterations after this number. nkeep : :class:`int`, optional Number of PCA components to keep. nreturn : :class:`int`, optional Number of PCA components to return, usually the same as `nkeep`. verbose : :class:`bool`, optional If ``True``, print extra information. Returns ------- :class:`dict` The PCA solution. """ from .. import pcomp from ..pydlutils.math import computechi2, djs_reject if verbose: log.setLevel('DEBUG') if nreturn is None: nreturn = nkeep if len(newflux.shape) == 1: nobj = 1 npix = newflux.shape[0] else: nobj, npix = newflux.shape log.info("Building PCA from %d object spectra.", nobj) nzi = newivar.nonzero() first_nonzero = (np.arange(nobj, dtype=nzi[0].dtype), np.array([nzi[1][nzi[0] == k].min() for k in range(nobj)])) # # Construct the synthetic weight vector, to be used when replacing the # low-S/N object pixels with the reconstructions. # synwvec = np.ones((npix,), dtype='d') for ipix in range(npix): indx = newivar[:, ipix] != 0 if indx.any(): synwvec[ipix] = newivar[indx, ipix].mean() fluxdict = dict() # # If there is only one object spectrum, then all we can do is return it. # if nobj == 1: fluxdict['flux'] = newflux.astype('f') return fluxdict # # Rejection iteration loop. # qdone = 0 iiter = 0 # # Begin with all points good. # outmask = None inmask = newivar != 0 ymodel = None # emevecs, emevals = pydlutils.empca(newflux, inmask) # fluxdict['emevecs'] = emevecs # fluxdict['emevals'] = emeveals while qdone == 0 and iiter <= maxiter: log.debug('starting djs_reject') outmask, qdone = djs_reject(newflux, ymodel, inmask=inmask, outmask=outmask, invvar=newivar) log.debug('finished with djs_reject') # # Iteratively do the PCA solution # filtflux = newflux.copy() acoeff = np.zeros((nobj, nkeep), dtype='d') t0 = time.time() for ipiter in range(niter): # # We want to get these values from the pcomp routine. # # eigenval = 1 # coeff = 1 flux0 = np.tile(filtflux[first_nonzero], npix).reshape(npix, nobj).transpose() # flux0 = np.tile(filtflux, npix).reshape(npix, nobj).transpose() totflux = np.absolute(filtflux - flux0).sum(1) goodobj = totflux > 0 if goodobj.all(): tmp = pcomp(filtflux.T) # , standardize=True) pres = tmp.derived eigenval = tmp.eigenvalues else: tmp = pcomp(filtflux[goodobj, :].T) # , standardize=True) pres = np.zeros((nobj, npix), dtype='d') pres[goodobj, :] = tmp.derived eigenval = np.zeros((nobj,), dtype='d') eigenval[goodobj] = tmp.eigenvalues maskivar = newivar * outmask sqivar = np.sqrt(maskivar) for iobj in range(nobj): out = computechi2(newflux[iobj, :], sqivar[iobj, :], pres[:, 0:nkeep]) filtflux[iobj, :] = (maskivar[iobj, :] * newflux[iobj, :] + synwvec*out.yfit) / (maskivar[iobj, :] + synwvec) acoeff[iobj, :] = out.acoeff log.info("The elapsed time for iteration #%2d is %6.2f s.", ipiter+1, time.time()-t0) # # Now set ymodel for rejecting points. 
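        # (Editor's note) ymodel is the rank-`nkeep` model of each object
        # spectrum, i.e. the fitted coefficients applied to the kept principal
        # components; djs_reject compares it against newflux on the next pass
        # through the outer rejection loop.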
# ymodel = np.dot(acoeff, pres[:, 0:nkeep].T) iiter += 1 if nobj == 1: usemask = outmask else: usemask = outmask.sum(0) fluxdict['usemask'] = usemask fluxdict['outmask'] = outmask fluxdict['flux'] = pres[:, 0:nreturn].transpose().astype('f') fluxdict['eigenval'] = eigenval[0:nreturn] fluxdict['acoeff'] = acoeff return fluxdict def plot_eig(filename, title='Unknown'): """Plot spectra from an eigenspectra/template file. Parameters ---------- filename : :class:`str` Name of a FITS file containing eigenspectra/templates. title : :class:`str`, optional Title to put on the plot. Raises ------ :exc:`ValueError` If an unknown template type was input in `filename`. """ # # Set title based on filename # if title == 'Unknown': if filename.find('Gal') > 0: title = 'Galaxies: Eigenspectra' elif filename.find('QSO') > 0: title = 'QSOs: Eigenspectra' elif filename.find('Star') > 0: title = 'Stars: Eigenspectra' elif filename.find('CVstar') > 0: title = 'CV Stars: Eigenspectra' else: raise ValueError('Unknown template type!') base, ext = filename.split('.') spectrum = fits.open(filename) newloglam0 = spectrum[0].header['COEFF0'] objdloglam = spectrum[0].header['COEFF1'] spectro_data = spectrum[0].data spectrum.close() (neig, ndata) = spectro_data.shape newloglam = np.arange(ndata) * objdloglam + newloglam0 lam = 10.0**newloglam fig = plt.figure(dpi=100) ax = fig.add_subplot(111) colorvec = ['k', 'r', 'g', 'b', 'm', 'c'] for l in range(neig): p = ax.plot(lam, spectro_data[l, :], colorvec[l % len(colorvec)]+'-', linewidth=1) ax.set_xlabel(r'Wavelength [$\AA$]') ax.set_ylabel('Flux [Arbitrary Units]') ax.set_title(title) # ax.set_xlim([3500.0,10000.0]) # ax.set_ylim([-400.0,500.0]) # fig.savefig(base+'.zoom.png') fig.savefig(base+'.png') plt.close(fig) return def readspec(platein, mjd=None, fiber=None, **kwargs): """Read SDSS/BOSS spec2d & spec1d files. Parameters ---------- platein : :class:`int` or :class:`numpy.ndarray` Plate number(s). mjd : :class:`int` or :class:`numpy.ndarray`, optional MJD numbers. If not provided, they will be calculated by :func:`latest_mjd`. fiber : array-like, optional Fibers to read. If not set, all fibers from all plates will be returned. topdir : :class:`str`, optional Override the value of :envvar:`BOSS_SPECTRO_REDUX`. run2d : :class:`str`, optional Override the value of :envvar:`RUN2D`. run1d : :class:`str`, optional Override the value of :envvar:`RUN1D`. path : :class:`str`, optional Override all path information with this directory name. align : :class:`bool`, optional If set, align all the spectra in wavelength. znum : :class:`int`, optional If set, return the znum-th best fit reshift fit, instead of the best. Returns ------- :class:`dict` A dictionary containing the data read. 
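    Notes
    -----
    A hedged usage sketch (the plate, MJD and fiber values are placeholders,
    and the environment variables described above must point at a real
    reduction)::

        spec = readspec(plate, mjd=mjd, fiber=[1, 2], align=True)
        flux = spec['flux']        # one row per requested fiber
        loglam = spec['loglam']    # matching log10(wavelength) grid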
""" try: nplate = len(platein) plate = platein except TypeError: nplate = 1 plate = np.array([platein], dtype='i4') if 'run2d' in kwargs: run2d = kwargs['run2d'] else: run2d = os.environ['RUN2D'] if 'run1d' in kwargs: run1d = kwargs['run1d'] else: run1d = os.environ['RUN1D'] if fiber is None: # # Read all fibers # nfibers = number_of_fibers(plate, **kwargs) total_fibers = nfibers.sum() platevec = np.zeros(total_fibers, dtype='i4') fibervec = np.zeros(total_fibers, dtype='i4') k = 0 for p in np.unique(plate): n = np.unique(nfibers[plate == p])[0] platevec[k:k+n] = p fibervec[k:k+n] = np.arange(n) + 1 k += n else: try: nfiber = len(fiber) except TypeError: nfiber = 1 if nplate > 1 and nfiber > 1 and nplate != nfiber: raise TypeError("Plate & Fiber must have the same length!") if nplate > 1: platevec = np.array(plate, dtype='i4') else: platevec = np.zeros(nfiber, dtype='i4') + plate if nfiber > 1: fibervec = np.array(fiber, dtype='i4') else: fibervec = np.zeros(nplate, dtype='i4') + fiber if 'mjd' is None: mjdvec = latest_mjd(platevec, **kwargs) else: try: nmjd = len(mjd) except TypeError: nmjd = 1 if nmjd != nplate: raise TypeError("Plate & MJD must have the same length!") mjdvec = np.zeros(nplate, dtype='i4') + mjd # # Now select unique plate-mjd combinations & read them # pmjd = ((np.array(platevec, dtype='u8') << 16) + np.array(mjdvec, dtype='u8')) # log.debug(pmjd) upmjd = np.unique(pmjd) zupmjd = list(zip(upmjd >> 16, upmjd & ((1 << 16) - 1))) # log.debug(zupmjd) spplate_data = dict() hdunames = ('flux', 'invvar', 'andmask', 'ormask', 'disp', 'plugmap', 'sky', 'loglam',) for thisplate, thismjd in zupmjd: # thisplate = int(p>>16) # thismjd = int(np.bitwise_and(p, (1<<16)-1)) pmjdindex = ((platevec == thisplate) & (mjdvec == thismjd)).nonzero()[0] thisfiber = fibervec[pmjdindex] # log.debug(type(thisplate), type(thismjd)) # log.debug(repr(thisfiber)) # log.debug(type(thisfiber)) pmjdstr = "{0:04d}-{1:05d}".format(int(thisplate), int(thismjd)) if 'path' in kwargs: sppath = [kwargs['path']] else: sppath = spec_path(thisplate, run2d=run2d) spfile = os.path.join(sppath[0], "spPlate-{0}.fits".format(pmjdstr)) log.info(spfile) spplate = fits.open(spfile) # # Get wavelength coefficients from primary header # npix = spplate[0].header['NAXIS1'] c0 = spplate[0].header['COEFF0'] c1 = spplate[0].header['COEFF1'] coeff0 = np.zeros(thisfiber.size, dtype='d') + c0 coeff1 = np.zeros(thisfiber.size, dtype='d') + c1 loglam0 = c0 + c1*np.arange(npix, dtype='d') loglam = np.resize(loglam0, (thisfiber.size, npix)) # # Read the data images # for k in range(len(hdunames)): if hdunames[k] == 'loglam': tmp = loglam else: try: tmp = spplate[k].data[thisfiber-1, :] except IndexError: tmp = spplate[k].data[thisfiber-1] if hdunames[k] not in spplate_data: if k == 0: allpmjdindex = pmjdindex allcoeff0 = coeff0 allcoeff1 = coeff1 # # Put the data into the return structure # if hdunames[k] == 'plugmap': spplate_data['plugmap'] = dict() for c in spplate[k].columns.names: spplate_data['plugmap'][c] = tmp[c] else: spplate_data[hdunames[k]] = tmp else: # # Append data # if k == 0: allpmjdindex = np.concatenate((allpmjdindex, pmjdindex)) if 'align' in kwargs: mincoeff0 = min(allcoeff0) if mincoeff0 == 0 and coeff0[0] > 0: allcoeff0 = coeff0[0] allcoeff1 = coeff1[1] if mincoeff0 > 0 and coeff0[0] == 0: coeff0 = mincoeff0 coeff1 = allcoeff1[0] ps = np.floor((coeff0[0] - mincoeff0)/coeff1[0] + 0.5) if ps > 0: coeff0 = coeff0 - ps*coeff1 else: allcoeff0 = allcoeff0 + ps*allcoeff1 else: ps = 0 allcoeff0 = 
np.concatenate((allcoeff0, coeff0)) allcoeff1 = np.concatenate((allcoeff1, coeff1)) if hdunames[k] == 'plugmap': for c in spplate[5].columns.names: spplate_data['plugmap'][c] = np.concatenate( (spplate_data['plugmap'][c], tmp[c])) else: spplate_data[hdunames[k]] = spec_append(spplate_data[hdunames[k]], tmp, pixshift=ps) spplate.close() # # Read photoPlate information, if available # photofile = os.path.join(sppath[0], "photoPlate-{0}.fits".format(pmjdstr)) if not os.path.exists(photofile): # # Hmm, maybe this is an SDSS-I,II plate # photofile = os.path.join(os.environ['SPECTRO_MATCH'], run2d, os.path.basename(os.environ['PHOTO_RESOLVE']), "{0:04d}".format(int(thisplate)), "photoPlate-{0}.fits".format(pmjdstr)) if os.path.exists(photofile): photop = fits.open(photofile) tmp = photop[1].data[thisfiber-1] if 'tsobj' not in spplate_data: spplate_data['tsobj'] = dict() for c in photop[1].columns.names: spplate_data['tsobj'][c] = tmp[c] else: for c in photop[1].columns.names: spplate_data['tsobj'][c] = np.concatenate( (spplate_data['tsobj'][c], tmp[c])) photop.close() # # Read redshift information, if available. # if 'znum' in kwargs: zfile = os.path.join(sppath[0], run1d, "spZall-{0}.fits".format(pmjdstr)) else: zfile = os.path.join(sppath[0], run1d, "spZbest-{0}.fits".format(pmjdstr)) if os.path.exists(zfile): spz = fits.open(zfile) if 'znum' in kwargs: nper = spz[0].header['DIMS0'] zfiber = (thisfiber-1)*nper + kwargs['znum'] - 1 else: zfiber = thisfiber tmp = spz[1].data[zfiber-1] if 'zans' not in spplate_data: spplate_data['zans'] = dict() for c in spz[1].columns.names: spplate_data['zans'][c] = tmp[c] else: for c in spz[1].columns.names: spplate_data['zans'][c] = np.concatenate( (spplate_data['zans'][c], tmp[c])) spz.close() # # Reorder the data. At this point allpmjdindex is an index for which # fiber[allpmjdindex] == spplate['plugmap']['FIBERID'], so we have to # reverse this mapping. # j = allpmjdindex.argsort() for k in spplate_data: if isinstance(spplate_data[k], dict): for c in spplate_data[k]: if spplate_data[k][c].ndim == 2: spplate_data[k][c] = spplate_data[k][c][j, :] else: spplate_data[k][c] = spplate_data[k][c][j] else: spplate_data[k] = spplate_data[k][j, :] allcoeff0 = allcoeff0[j] allcoeff1 = allcoeff1[j] # # If necessary, recompute the wavelengths # nfibers, npixmax = spplate_data['flux'].shape if 'align' in kwargs: loglam0 = allcoeff0[0] + allcoeff1[1]*np.arange(npixmax, dtype='d') spplate_data['loglam'] = np.resize(loglam0, (nfibers, npixmax)) return spplate_data def skymask(invvar, andmask, ormask=None, ngrow=2): """Mask regions where sky-subtraction errors are expected to dominate. Parameters ---------- invvar : :class:`numpy.ndarray` Inverse variance. andmask : :class:`numpy.ndarray` An "and" mask. For historical reasons, this input is ignored. ormask : :class:`numpy.ndarray`, optional An "or" mask. Although technically this is optional, if it is not supplied, this function will have no effect. ngrow : :class:`int`, optional Expand bad areas by this number of pixels. Returns ------- :class:`numpy.ndarray` The `invvar` multiplied by the bad areas. """ from ..pydlutils.sdss import sdss_flagval from .. 
import smooth nrows, npix = invvar.shape badmask = np.zeros(invvar.shape, dtype='i4') badskychi = sdss_flagval('SPPIXMASK', 'BADSKYCHI') redmonster = sdss_flagval('SPPIXMASK', 'REDMONSTER') # brightsky = sdss_flagval('SPPIXMASK', 'BRIGHTSKY') if ormask is not None: badmask = badmask | ((ormask & badskychi) != 0) badmask = badmask | ((ormask & redmonster) != 0) # badmask = badmask | ((andmask & brightsky) != 0) if ngrow > 0: width = 2*ngrow + 1 for k in range(nrows): badmask[k, :] = smooth(badmask[k, :]*width, width, True) > 0 return invvar * (1 - badmask) def spec_append(spec1, spec2, pixshift=0): """Append the array spec2 to the array spec1 & return a new array. If the dimension of these arrays is the same, then append as [spec1,spec2]. If not, increase the size of the smaller array & fill with zeros. Parameters ---------- spec1, spec2 : :class:`numpy.ndarray` Append `spec2` to `spec1`. pixshift : :class:`int`, optional If `pixshift` is set to a positive integer, `spec2` will be padded with `pixshift` zeros on the left side. If `pixshift` is set to a negative integer, `spec1` will be padded with ``abs(pixshift)`` zeros on the left side. If not set, all zeros will be padded on the right side. Returns ------- :class:`numpy.ndarray` A new array containing both `spec1` and `spec2`. """ nrows1, npix1 = spec1.shape nrows2, npix2 = spec2.shape nrows = nrows1+nrows2 nadd1 = 0 nadd2 = 0 if pixshift != 0: if pixshift < 0: nadd1 = -pixshift else: nadd2 = pixshift maxpix = max(npix1 + nadd1, npix2 + nadd2) spec3 = np.zeros((nrows, maxpix), dtype=spec1.dtype) spec3[0:nrows1, nadd1:nadd1+npix1] = spec1 spec3[nrows1:nrows, nadd2:nadd2+npix2] = spec2 return spec3 def spec_path(plate, path=None, topdir=None, run2d=None): """Return the directory containing spPlate files. Parameters ---------- plate : :class:`int` or :class:`numpy.ndarray` The plate(s) to examine. path : :class:`str`, optional If set, `path` becomes the full path for every plate. In other words, it completely short-circuits this function. topdir : :class:`str`, optional Used to override the value of :envvar:`BOSS_SPECTRO_REDUX`. run2d : :class:`str`, optional Used to override the value of :envvar:`RUN2D`. Returns ------- :class:`list` A list of directories, one for each plate. Raises ------ :exc:`KeyError` If environment variables are not supplied. """ if isinstance(plate, integer_types) or plate.shape == (): platevec = np.array([plate], dtype='i4') else: platevec = plate if path is None: if run2d is None: run2d = os.environ['RUN2D'] if topdir is None: env = "SPECTRO_REDUX" try: ir = int(run2d) except ValueError: env = 'BOSS_SPECTRO_REDUX' topdir = os.environ[env] paths = list() for p in platevec: if path is not None: paths.append(path) else: paths.append(os.path.join(topdir, run2d, '{0:04d}'.format(p))) return paths def preprocess_spectra(flux, ivar, loglam=None, zfit=None, aesthetics='mean', newloglam=None, wavemin=None, wavemax=None, verbose=False): """Handle the processing of input spectra through the :func:`~pydl.pydlspec2d.spec2d.combine1fiber` stage. Parameters ---------- flux : array-like The input spectral flux. ivar : array-like The inverse variance of the spectral flux. loglam : array-like, optional The input wavelength solution. zfit : array-like, optional The redshift of each input spectrum. aesthetics : :class:`str`, optional This parameter will be passed to :func:`~pydl.pydlspec2d.spec2d.combine1fiber`. newloglam : array-like, optional The output wavelength solution. 
wavemin : :class:`float`, optional Minimum wavelength if `newloglam` is not specified. wavemax : :class:`float`, optional Maximum wavelength if `newloglam` is not specified. verbose : :class:`bool`, optional If ``True``, print extra information. Returns ------- :func:`tuple` of :class:`numpy.ndarray` The resampled flux, inverse variance and wavelength solution, respectively. """ from .spec2d import combine1fiber if verbose: log.setLevel('DEBUG') if len(flux.shape) == 1: nobj = 1 npix = flux.shape[0] else: nobj, npix = flux.shape # # The redshift of each object in pixels would be logshift/objdloglam. # if zfit is None: logshift = np.zeros((nobj,), dtype=flux.dtype) else: logshift = np.log10(1.0 + zfit) # # Determine the new wavelength mapping. # if loglam is None: if newloglam is None: raise ValueError("newloglam must be set if loglam is not!") return (flux, ivar, newloglam) else: if newloglam is None: igood = loglam != 0 dloglam = loglam[1] - loglam[0] logmin = loglam[igood].min() - logshift.max() logmax = loglam[igood].max() - logshift.min() if wavemin is not None: logmin = max(logmin, np.log10(wavemin)) if wavemax is not None: logmax = min(logmax, np.log10(wavemax)) fullloglam = wavevector(logmin, logmax, binsz=dloglam) else: fullloglam = newloglam dloglam = fullloglam[1] - fullloglam[0] nnew = fullloglam.size fullflux = np.zeros((nobj, nnew), dtype='d') fullivar = np.zeros((nobj, nnew), dtype='d') # # Shift each spectrum to z = 0 and sample at the output wavelengths # if loglam.ndim == 1: indx = loglam > 0 rowloglam = loglam[indx] for iobj in range(nobj): log.info("OBJECT %5d", iobj) if loglam.ndim > 1: if loglam.shape[0] != nobj: raise ValueError('Wrong number of dimensions for loglam.') indx = loglam[iobj, :] > 0 rowloglam = loglam[iobj, indx] flux1, ivar1 = combine1fiber(rowloglam-logshift[iobj], flux[iobj, indx], fullloglam, objivar=ivar[iobj, indx], binsz=dloglam, aesthetics=aesthetics, verbose=verbose) fullflux[iobj, :] = flux1 fullivar[iobj, :] = ivar1 return (fullflux, fullivar, fullloglam) def template_metadata(inputfile, verbose=False): """Read template metadata from file. Parameters ---------- inputfile : :class:`str` Name of a Parameter file containing the input data and metadata. verbose : :class:`bool`, optional If ``True``, print lots of extra information. Returns ------- :func:`tuple` A tuple containing the list of input spectra and a dictionary containing other metadata. 
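    Notes
    -----
    (Added illustration, not part of the original docstring.)  The Yanny
    parameter file must define every keyword checked below (``object``,
    ``method``, ``aesthetics``, ``run2d``, ``run1d``, ``wavemin``,
    ``wavemax``, ``snmax``, ``niter``, ``nkeep``, ``minuse``, plus
    ``nonnegative`` and ``epsilon`` when ``method`` is ``hmf``), together
    with an ``EIGENOBJ`` table listing the input spectra.  A hypothetical
    call, using the default file name from the compute_templates script and
    illustrative return values::

        >>> slist, metadata = template_metadata('compute_templates.par')  # doctest: +SKIP
        >>> metadata['method'], metadata['nkeep']  # doctest: +SKIP
        ('pca', 4)
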
""" from ..pydlutils.yanny import yanny if verbose: log.setLevel('DEBUG') if not os.path.exists(inputfile): raise Pydlspec2dException("Could not find {0}!".format(inputfile)) log.debug("Reading input data from %s.", inputfile) par = yanny(inputfile) required_metadata = {'object': str, 'method': str, 'aesthetics': str, 'run2d': str, 'run1d': str, 'wavemin': float, 'wavemax': float, 'snmax': float, 'niter': int, 'nkeep': int, 'minuse': int} metadata = dict() for key in required_metadata: try: metadata[key] = required_metadata[key](par[key]) log.debug('%s = %s', key, par[key]) except KeyError: raise KeyError('The {0} keyword was not found in {1}!'.format(key, inputfile)) except ValueError: raise ValueError('The {0} keyword has invalid value, {0}!'.format(key, par[key])) slist = par['EIGENOBJ'] for r in ('run2d', 'run1d'): try: metadata['orig_'+r] = os.environ[r.upper()] except KeyError: metadata['orig_'+r] = None os.environ[r.upper()] = metadata[r] if metadata['method'].lower() == 'hmf': required_hmf_metadata = {'nonnegative': lambda x: bool(int(x)), 'epsilon': float} for key in required_hmf_metadata: try: metadata[key] = required_hmf_metadata[key](par[key]) except KeyError: raise KeyError('The {0} keyword was not found in {1}!'.format(key, inputfile)) except ValueError: raise ValueError('The {0} keyword has invalid value, {0}!'.format(key, par[key])) return (slist, metadata) def template_input(inputfile, dumpfile, flux=False, verbose=False): """Collect spectra and pass them to PCA or HMF solvers to compute spectral templates. This function replaces the various ``PCA_GAL()``, ``PCA_STAR()``, etc., functions from idlspec2d. Parameters ---------- inputfile : :class:`str` Name of a Parameter file containing the input data and metadata. dumpfile : :class:`str` Name of a Pickle file used to store intermediate data. flux : :class:`bool`, optional If ``True``, plot the individual input spectra. verbose : :class:`bool`, optional If ``True``, print lots of extra information. """ import pickle from astropy.constants import c as cspeed from .. import uniq from .. import __version__ as pydl_version from ..goddard.astro import get_juldate from ..pydlutils.image import djs_maskinterp from ..pydlutils.math import djs_median # # Logging # if verbose: log.setLevel('DEBUG') # # Read metadata. # slist, metadata = template_metadata(inputfile) # # Name the output files. # jd = get_juldate() outfile = "spEigen{0}-{1:d}".format(metadata['object'].title(), int(jd - 2400000.5)) # # Read the input spectra # if os.path.exists(dumpfile): log.info("Loading data from %s.", dumpfile) with open(dumpfile) as f: inputflux = pickle.load(f) newflux = inputflux['newflux'] newivar = inputflux['newivar'] newloglam = inputflux['newloglam'] else: if metadata['object'].lower() == 'star': spplate = readspec(slist.plate, mjd=slist.mjd, fiber=slist.fiberid, align=True) else: spplate = readspec(slist.plate, mjd=slist.mjd, fiber=slist.fiberid) # # Insist that all of the requested spectra exist. # missing = spplate['plugmap']['FIBERID'] == 0 if missing.any(): imissing = missing.nonzero()[0] for k in imissing: log.error("Missing plate=%d mjd=%d fiberid=%d", slist.plate[k], slist.mjd[k], slist.fiberid[k]) raise ValueError("{0:d} missing object(s).".format(missing.sum())) # # Do not fit where the spectrum may be dominated by sky-sub residuals. 
# objinvvar = skymask(spplate['invvar'], spplate['andmask'], spplate['ormask']) ifix = spplate['flux']**2 * objinvvar > metadata['snmax']**2 if ifix.any(): objinvvar[ifix.nonzero()] = (metadata['snmax']/spplate['flux'][ifix.nonzero()])**2 # # Set the new wavelength mapping here. If the binsz keyword is not set, # then bin size is determined from the first spectrum returned by readspec. # This is fine in the case where all spectra have the same bin size # (though their starting wavelengths may differ). However, this may not # be a safe assumption in the future. # try: objdloglam = float(par['binsz']) except: objdloglam = spplate['loglam'][0, 1] - spplate['loglam'][0, 0] if metadata['object'].lower() == 'star': newloglam = spplate['loglam'][0, :] else: newloglam = wavevector(np.log10(metadata['wavemin']), np.log10(metadata['wavemax']), binsz=objdloglam) try: zfit = slist.zfit except AttributeError: zfit = slist.cz/cspeed.to('km / s').value # # Shift to common wavelength grid. # newflux, newivar, newloglam = preprocess_spectra(spplate['flux'], objinvvar, loglam=spplate['loglam'], zfit=zfit, newloglam=newloglam, aesthetics=metadata['aesthetics'], verbose=verbose) # # Dump input fluxes to a file for debugging purposes. # if not os.path.exists(dumpfile): with open(dumpfile, 'w') as f: inputflux = {'newflux': newflux, 'newivar': newivar, 'newloglam': newloglam} pickle.dump(inputflux, f) # # Solve. # if metadata['object'].lower() == 'qso': pcaflux = template_qso(metadata, newflux, newivar, verbose) elif metadata['object'].lower() == 'star': pcaflux = template_star(metadata, newloglam, newflux, newivar, slist, outfile, verbose) else: if metadata['method'].lower() == 'pca': pcaflux = pca_solve(newflux, newivar, niter=metadata['niter'], nkeep=metadata['nkeep'], verbose=verbose) elif metadata['method'].lower() == 'hmf': hmf = HMF(newflux, newivar, K=metadata['nkeep'], n_iter=metadata['niter'], nonnegative=metadata['nonnegative'], epsilon=metadata['epsilon'], verbose=verbose) pcaflux = hmf.solve() else: raise ValueError("Unknown method: {0}!".format(metadata['method'])) pcaflux['newflux'] = newflux pcaflux['newivar'] = newivar pcaflux['newloglam'] = newloglam # # Fill in bad data with a running median of the good data. # The presence of boundary='nearest' means that this code snippet # was never meant to be called! In other words it should always # be the case that qgood.all() is True. 
# if 'usemask' in pcaflux: qgood = pcaflux['usemask'] >= metadata['minuse'] if not qgood.all(): warn("Would have triggered djs_median replacement!", Pydlspec2dUserWarning) if False: medflux = np.zeros(pcaflux['flux'].shape, dtype=pcaflux['flux'].dtype) for i in range(metadata['nkeep']): medflux[i, qgood] = djs_median(pcaflux['flux'][i, qgood], width=51, boundary='nearest') medflux[i, :] = djs_maskinterp(medflux[i, :], ~qgood, const=True) pcaflux['flux'][:, ~qgood] = medflux[:, ~qgood] # # Make plots # colorvec = ['k', 'r', 'g', 'b', 'm', 'c'] smallfont = FontProperties(size='xx-small') nspectra = pcaflux['newflux'].shape[0] # # Plot input spectra # if flux: nfluxes = 30 separation = 5.0 nplots = nspectra/nfluxes if nspectra % nfluxes > 0: nplots += 1 for k in range(nplots): istart = k*nfluxes iend = min(istart+nfluxes, nspectra) - 1 fig = plt.figure(dpi=100) ax = fig.add_subplot(111) for l in range(istart, iend+1): p = ax.plot(10.0**pcaflux['newloglam'], pcaflux['newflux'][l, :] + separation*(l % nfluxes), colorvec[l % len(colorvec)]+'-', linewidth=1) ax.set_xlabel(r'Wavelength [$\AA$]') ax.set_ylabel(r'Flux [$\mathsf{10^{-17} erg\, cm^{-2} s^{-1} \AA^{-1}}$] + Constant') ax.set_title('Input Spectra {0:04d}-{1:04d}'.format(istart+1, iend+1)) ax.set_ylim(pcaflux['newflux'][istart, :].min(), pcaflux['newflux'][iend-1, :].max()+separation*(nfluxes-1)) fig.savefig('{0}.flux.{1:04d}-{2:04d}.png'.format(outfile, istart+1, iend+1)) plt.close(fig) # # Missing data diagnostic. # fig = plt.figure(dpi=100) ax = fig.add_subplot(111) p = ax.plot(10.0**pcaflux['newloglam'], (pcaflux['newivar'] == 0).sum(0)/float(nspectra), 'k-') ax.set_xlabel(r'Wavelength [$\AA$]') ax.set_ylabel('Fraction of spectra with missing data') ax.set_title('Missing Data') ax.grid(True) fig.savefig(outfile+'.missing.png') plt.close(fig) # # usemask diagnostic # if 'usemask' in pcaflux: fig = plt.figure(dpi=100) ax = fig.add_subplot(111) p = ax.semilogy(10.0**pcaflux['newloglam'][pcaflux['usemask'] > 0], pcaflux['usemask'][pcaflux['usemask'] > 0], 'k-', 10.0**pcaflux['newloglam'], np.zeros(pcaflux['newloglam'].shape, dtype=pcaflux['newloglam'].dtype) + metadata['minuse'], 'k--') ax.set_xlabel(r'Wavelength [$\AA$]') ax.set_ylabel('Usemask') ax.set_title('UseMask') ax.grid(True) fig.savefig(outfile+'.usemask.png') plt.close(fig) # # This type of figure isn't really meaningful for stars. 
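    # (Added note, not original pydl commentary.)  pcaflux['acoeff'] has one
    # row per input spectrum and one column per eigenspectrum, so the ratios
    # plotted below (a_1/a_0, a_2/a_0, a_3/a_0) show, for each object, how
    # strongly the higher-order eigenspectra contribute relative to the
    # first; each point is labelled with its PLATE-FIBERID.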
# if metadata['object'].lower() != 'star': aratio10 = pcaflux['acoeff'][:, 1]/pcaflux['acoeff'][:, 0] aratio20 = pcaflux['acoeff'][:, 2]/pcaflux['acoeff'][:, 0] aratio30 = pcaflux['acoeff'][:, 3]/pcaflux['acoeff'][:, 0] fig = plt.figure(dpi=100) ax = fig.add_subplot(111) p = ax.plot(aratio10, aratio20, marker='None', linestyle='None') for k in range(len(aratio10)): t = ax.text(aratio10[k], aratio20[k], '{0:04d}-{1:04d}'.format(slist.plate[k], slist.fiberid[k]), horizontalalignment='center', verticalalignment='center', color=colorvec[k % len(colorvec)], fontproperties=smallfont) # ax.set_xlim([aratio10.min(), aratio10.max]) # ax.set_xlim([aratio20.min(), aratio20.max]) ax.set_xlabel('Eigenvalue Ratio, $a_1/a_0$') ax.set_ylabel('Eigenvalue Ratio, $a_2/a_0$') ax.set_title('Eigenvalue Ratios') fig.savefig(outfile+'.a2_v_a1.png') plt.close(fig) fig = plt.figure(dpi=100) ax = fig.add_subplot(111) p = ax.plot(aratio20, aratio30, marker='None', linestyle='None') for k in range(len(aratio10)): t = ax.text(aratio20[k], aratio30[k], '{0:04d}-{1:04d}'.format(slist.plate[k], slist.fiberid[k]), horizontalalignment='center', verticalalignment='center', color=colorvec[k % len(colorvec)], fontproperties=smallfont) # ax.set_xlim([aratio10.min(), aratio10.max]) # ax.set_xlim([aratio20.min(), aratio20.max]) ax.set_xlabel('Eigenvalue Ratio, $a_2/a_0$') ax.set_ylabel('Eigenvalue Ratio, $a_3/a_0$') ax.set_title('Eigenvalue Ratios') fig.savefig(outfile+'.a3_v_a2.png') plt.close(fig) # # Save output to FITS file. # if os.path.exists(outfile+'.fits'): os.remove(outfile+'.fits') hdu0 = fits.PrimaryHDU(pcaflux['flux']) objtypes = {'gal': 'GALAXY', 'qso': 'QSO', 'star': 'STAR'} if not pydl_version: pydl_version = 'git' hdu0.header['OBJECT'] = (objtypes[metadata['object']], 'Type of template') hdu0.header['COEFF0'] = (pcaflux['newloglam'][0], 'Wavelength zeropoint') hdu0.header['COEFF1'] = (pcaflux['newloglam'][1]-pcaflux['newloglam'][0], 'Delta wavelength') hdu0.header['IDLUTILS'] = ('pydl-{0}'.format(pydl_version), 'Version of idlutils') hdu0.header['SPEC2D'] = ('pydl-{0}'.format(pydl_version), 'Version of idlspec2d') hdu0.header['RUN2D'] = (os.environ['RUN2D'], 'Version of 2d reduction') hdu0.header['RUN1D'] = (os.environ['RUN1D'], 'Version of 1d reduction') hdu0.header['FILENAME'] = (inputfile, 'Input file') hdu0.header['METHOD'] = (metadata['method'].upper(), 'Method used') if metadata['method'].lower() == 'hmf': hdu0.header['NONNEG'] = (metadata['nonnegative'], 'Was nonnegative HMF used?') hdu0.header['EPSILON'] = (metadata['epsilon'], 'Regularization parameter used.') # for i in range(len(namearr)): # hdu0.header["NAME{0:d}".format(i)] = namearr[i]+' ' c = [fits.Column(name='plate', format='J', array=slist.plate), fits.Column(name='mjd', format='J', array=slist.mjd), fits.Column(name='fiberid', format='J', array=slist.fiberid)] if metadata['object'].lower() == 'star': c.append(fits.Column(name='cz', format='D', unit='km/s', array=slist.cz)) for i, name in enumerate(pcaflux['namearr']): hdu0.header['NAME{0:d}'.format(i)] = (name, 'Name of class {0:d}.'.format(i)) else: c.append(fits.Column(name='zfit', format='D', array=slist.zfit)) hdu1 = fits.BinTableHDU.from_columns(fits.ColDefs(c)) hdulist = fits.HDUList([hdu0, hdu1]) hdulist.writeto(outfile+'.fits') if metadata['object'].lower() != 'star': plot_eig(outfile+'.fits') # # Clean up # for r in ('run2d', 'run1d'): if metadata['orig_'+r] is None: del os.environ[r.upper()] else: os.environ[r.upper()] = metadata['orig_'+r] return def template_qso(metadata, newflux, 
newivar, verbose=False): """Run PCA or HMF on QSO spectra. Historically, QSO templates were comptuted one at a time instead of all at once. Parameters ---------- metadata : :class:`dict` Dictionary containing metadata about the spectra. newflux : :class:`~numpy.ndarray` Flux shifted onto common wavelength. newivar : :class:`~numpy.ndarray` Inverse variances of the fluxes. verbose : :class:`bool`, optional If ``True``, print lots of extra information. Returns ------- :class:`dict` A dictonary containing flux, eigenvalues, etc. """ from ..pydlutils.math import computechi2 if metadata['object'].lower() != 'qso': raise Pydlspec2dException("You appear to be passing the wrong kind of object to template_qso()!") if len(newflux.shape) == 1: nobj = 1 npix = newflux.shape[0] else: nobj, npix = newflux.shape objflux = newflux.copy() for ikeep in range(metadata['nkeep']): log.info("Solving for eigencomponent #%d of %d", ikeep+1, nkeep) if metadata['method'].lower() == 'pca': pcaflux1 = pca_solve(objflux, newivar, niter=metadata['niter'], nkeep=1, verbose=verbose) elif metadata['method'].lower() == 'hmf': hmf = HMF(objflux, newivar, K=metadata['nkeep'], n_iter=metadata['niter'], nonnegative=metadata['nonnegative'], epsilon=metadata['epsilon'], verbose=verbose) pcaflux1 = hmf.solve() else: raise ValueError("Unknown method: {0}!".format(metadata['method'])) if ikeep == 0: # # Create new pcaflux dict # pcaflux = dict() for k in pcaflux1: pcaflux[k] = pcaflux1[k].copy() else: # # Add to existing dict # # for k in pcaflux1: # pcaflux[k] = np.vstack((pcaflux[k],pcaflux1[k])) pcaflux['flux'] = np.vstack((pcaflux['flux'], pcaflux1['flux'])) pcaflux['eigenval'] = np.concatenate((pcaflux['eigenval'], pcaflux1['eigenval'])) # # Re-solve for the coefficients using all PCA components so far # acoeff = np.zeros((nobj, ikeep+1), dtype=pcaflux1['acoeff'].dtype) for iobj in range(nobj): out = computechi2(newflux[iobj, :], np.sqrt(pcaflux1['newivar'][iobj, :]), pcaflux['flux'].T) acoeff[iobj, :] = out['acoeff'] # # Prevent re-binning of spectra on subsequent calls to pca_solve() # # objloglam = None if ikeep == 0: objflux = newflux - np.outer(acoeff, pcaflux['flux']) else: objflux = newflux - np.dot(acoeff, pcaflux['flux']) # objflux = newflux - np.outer(acoeff,pcaflux['flux']) # objinvvar = pcaflux1['newivar'] pcaflux['acoeff'] = acoeff return pcaflux def template_star(metadata, newloglam, newflux, newivar, slist, outfile, verbose=False): """Run PCA or HMF on stellar spectra of various classes. Parameters ---------- metadata : :class:`dict` Dictionary containing metadata about the spectra. newloglam : :class:`~numpy.ndarray` The wavelength array, used only for plots. newflux : :class:`~numpy.ndarray` Flux shifted onto common wavelength. newivar : :class:`~numpy.ndarray` Inverse variances of the fluxes. slist : :class:`~numpy.recarray` The list of objects, containing stellar class information. outfile : :class:`str` The base name of output file, used for plots. verbose : :class:`bool`, optional If ``True``, print lots of extra information. Returns ------- :class:`dict` A dictonary containing flux, eigenvalues, etc. """ from .. 
import uniq from ..pydlutils.image import djs_maskinterp if metadata['object'].lower() != 'star': raise Pydlspec2dException("You appear to be passing the wrong kind of object to template_star()!") # # Find the list of unique star types # isort = np.argsort(slist['class']) classlist = slist['class'][isort[uniq(slist['class'][isort])]] # # Loop over each star type # npix, nstars = newflux.shape pcaflux = dict() pcaflux['namearr'] = list() for c in classlist: # # Find the subclasses for this stellar type # log.info("Finding eigenspectra for Stellar class %s.", c) indx = (slist['class'] == c).nonzero()[0] nindx = indx.size thesesubclass = slist['subclass'][indx] isort = np.argsort(thesesubclass) subclasslist = thesesubclass[isort[uniq(thesesubclass[isort])]] nsubclass = subclasslist.size # # Solve for 2 eigencomponents if we have specified subclasses for # this stellar type # if nsubclass == 1: nkeep = 1 else: nkeep = 2 if metadata['method'].lower() == 'pca': pcaflux1 = pca_solve(newflux[indx, :], newivar[indx, :], niter=metadata['niter'], nkeep=nkeep, verbose=verbose) elif metadata['method'].lower() == 'hmf': hmf = HMF(newflux[indx, :], newivar[indx, :], K=metadata['nkeep'], n_iter=metadata['niter'], nonnegative=metadata['nonnegative'], epsilon=metadata['epsilon'], verbose=verbose) pcaflux1 = hmf.solve() else: raise ValueError("Unknown method: {0}!".format(metadata['method'])) # # Some star templates are generated from only one spectrum, # and these will not have a usemask set. # if 'usemask' not in pcaflux1: pcaflux1['usemask'] = np.zeros((npix,), dtype='i4') + nindx # # Interpolate over bad flux values in the middle of a spectrum, # and set fluxes to zero at the blue+red ends of the spectrum # # minuse = 1 # ? minuse = np.floor((nindx+1) / 3.0) qbad = pcaflux1['usemask'] < minuse # # Interpolate over all bad pixels # for j in range(nkeep): pcaflux1['flux'][j, :] = djs_maskinterp(pcaflux1['flux'][j, :], qbad, const=True) # # Set bad pixels at the very start or end of the spectrum to zero # instead. 
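        # (Added note, not original pydl commentary.)  Example of the slicing
        # below: if qbad is True only at pixels 0, 1 and npix-1, then
        # igood[0] == 2 and igood[::-1][0] == npix-2, so the blue-end slice
        # 0:igood[0]-1 zeroes pixel 0 and the red-end slice
        # igood[::-1][0]+1:npix zeroes pixel npix-1, overriding the values
        # that djs_maskinterp() filled in above.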
# npix = qbad.size igood = (~qbad).nonzero()[0] if qbad[0]: pcaflux1['flux'][:, 0:igood[0]-1] = 0 if qbad[npix-1]: pcaflux1['flux'][:, igood[::-1][0]+1:npix] = 0 # # Re-normalize the first eigenspectrum to a mean of 1 # norm = pcaflux1['flux'][0, :].mean() pcaflux1['flux'] /= norm if 'acoeff' in pcaflux1: pcaflux1['acoeff'] *= norm # # Now loop through each stellar subclass and reconstruct # an eigenspectrum for that subclass # thesesubclassnum = np.zeros(thesesubclass.size, dtype='i4') colorvec = ['k', 'r', 'g', 'b', 'm', 'c'] smallfont = FontProperties(size='xx-small') fig = plt.figure(dpi=100) ax = fig.add_subplot(111) for isub in range(nsubclass): ii = (thesesubclass == subclasslist[isub]).nonzero()[0] thesesubclassnum[ii] = isub if nkeep == 1: thisflux = pcaflux1['flux'][0, :] else: aratio = pcaflux1['acoeff'][ii, 1]/pcaflux1['acoeff'][ii, 0] # # np.median(foo) is equivalent to MEDIAN(foo,/EVEN) # thisratio = np.median(aratio) thisflux = (pcaflux1['flux'][0, :] + thisratio.astype('f') * pcaflux1['flux'][1, :]) # # Plot spectra # plotflux = thisflux/thisflux.max() ax.plot(10.0**newloglam, plotflux, "{0}-".format(colorvec[isub % len(colorvec)]), linewidth=1) if isub == 0: ax.set_xlabel(r'Wavelength [$\AA$]') ax.set_ylabel('Flux [arbitrary units]') ax.set_title('STAR {0}: Eigenspectra Reconstructions'.format(c)) t = ax.text(10.0**newloglam[-1], plotflux[-1], subclasslist[isub], horizontalalignment='right', verticalalignment='center', color=colorvec[isub % len(colorvec)], fontproperties=smallfont) fig.savefig(outfile+'.{0}.png'.format(c)) plt.close(fig) if 'flux' in pcaflux: pcaflux['flux'] = np.vstack((pcaflux['flux'], thisflux)) # pcaflux['acoeff'] = np.vstack((pcaflux['acoeff'], pcaflux1['acoeff'])) # pcaflux['usemask'] = np.vstack((pcaflux['usemask'], pcaflux1['usemask'])) else: pcaflux['flux'] = thisflux # pcaflux['acoeff'] = pcaflux1['acoeff'] # pcaflux['usemask'] = pcaflux1['usemask'] pcaflux['namearr'].append(subclasslist[isub]) return pcaflux def template_input_main(): # pragma: no cover """Entry point for the compute_templates script. Returns ------- :class:`int` An integer suitable for passing to :func:`sys.exit`. """ # # Imports for main() # import sys from argparse import ArgumentParser # Get home directory in platform-independent way home_dir = os.path.expanduser('~') # # Get Options # parser = ArgumentParser(description="Compute spectral templates.", prog=os.path.basename(sys.argv[0])) parser.add_argument('-d', '--dump', action='store', dest='dump', metavar='FILE', default=os.path.join(home_dir, 'scratch', 'templates', 'compute_templates.dump'), help='Dump data to a pickle file (default: %(default)s).') parser.add_argument('-F', '--flux', action='store_true', dest='flux', help='Plot input spectra.') parser.add_argument('-f', '--file', action='store', dest='inputfile', metavar='FILE', default=os.path.join(home_dir, 'scratch', 'templates', 'compute_templates.par'), help='Read input spectra and redshifts from FILE (default: %(default)s).') parser.add_argument('-v', '--verbose', action='store_true', dest='verbose', help='Print lots of extra information.') options = parser.parse_args() template_input(options.inputfile, options.dump, options.flux, options.verbose) return 0 def wavevector(minfullwave, maxfullwave, zeropoint=3.5, binsz=1.0e-4, wavemin=None): """Return an array of wavelengths. Parameters ---------- minfullwave : :class:`float` Minimum wavelength. maxfullwave : :class:`float` Maximum wavelength. 
zeropoint : :class:`float`, optional Offset of the input wavelength values. binsz : :class:`float`, optional Separation between wavelength values. wavemin : :class:`float`, optional If this is set the values of `minfullwave` and `zeropoint` are ignored. Returns ------- :class:`numpy.ndarray` Depending on the values of `minfullwave`, `binsz`, etc., the resulting array could be interpreted as an array of wavelengths or an array of log(wavelength). """ if wavemin is not None: spotmin = 0 spotmax = int((maxfullwave - wavemin)/binsz) wavemax = spotmax * binsz + wavemin else: spotmin = int((minfullwave - zeropoint)/binsz) + 1 spotmax = int((maxfullwave - zeropoint)/binsz) wavemin = spotmin * binsz + zeropoint wavemax = spotmax * binsz + zeropoint nfinalpix = spotmax - spotmin + 1 finalwave = np.arange(nfinalpix, dtype='d') * binsz + wavemin return finalwave pydl-0.7.0/pydl/pydlspec2d/tests/0000755000076500000240000000000013434104632017265 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlspec2d/tests/__init__.py0000644000076500000240000000021512671560637021411 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This is the pydl/pydlspec2d/tests directory. """ pydl-0.7.0/pydl/pydlspec2d/tests/t/0000755000076500000240000000000013434104632017530 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlspec2d/tests/t/spPlate-4055-55359-0020.fits0000644000076500000240000136730012544033417023506 0ustar weaverstaff00000000000000SIMPLE = T / conforms to FITS standard BITPIX = -32 / array data type NAXIS = 2 / number of array dimensions NAXIS1 = 4637 NAXIS2 = 20 TELESCOP= 'SDSS 2.5-M' / Sloan Digital Sky Survey NGUIDE = 33.0000 / Number of guider frames during exposure SEEING20= 1.41527 / 20% seeing during exposure (arcsec) SEEING50= 1.52343 / 50% seeing during exposure (arcsec) SEEING80= 1.63672 / 80% seeing during exposure (arcsec) RMSOFF20= 0.0111816500000 / 20% RMS offset of guide fibers (arcsec) RMSOFF50= 0.0164329166667 / 50% RMS offset of guide fibers (arcsec) RMSOFF80= 0.0225299500000 / 80% RMS offset of guide fibers (arcsec) DAQVER = '1.2.14 ' CAMDAQ = '1.4.7:38' SUBFRAME= '' / the subframe readout command ERRCNT = 'NONE ' SYNCERR = 'NONE ' SLINES = 'NONE ' PIXERR = 'NONE ' PLINES = 'NONE ' PFERR = 'NONE ' DIDFLUSH= T / CCD was flushed before integration FLAVOR = 'science ' / exposure type, SDSS spectro style BOSSVER = '2 ' / ICC version MJD = 55359 / APO fMJD day at start of exposure MJDLIST = '55359 ' / TAI-BEG = 4783034232.00 / MJD(TAI) seconds at start of integration DATE-OBS= '2010-06-12T04:37:12' / TAI date at start of integration OBJSYS = 'ICRS ' / The TCC objSys RA = 236.433204 / RA of telescope boresight (deg) DEC = 1.836429 / Dec of telescope boresight (deg) EQUINOX = 2000.00 / RADECSYS= 'FK5 ' / RADEG = 236.4365 / RA of telescope pointing(deg) DECDEG = 1.8332 / Dec of telescope pointing (deg) ROTTYPE = 'Obj ' / Rotator request type ROTPOS = 0.0 / Rotator request position (deg) BOREOFFX= 0.0 / TCC Boresight offset, deg BOREOFFY= 0.0 / TCC Boresight offset, deg ARCOFFX = -0.003294 / TCC ObjArcOff, deg ARCOFFY = 0.003229 / TCC ObjArcOff, deg OBJOFFX = 0.0 / TCC ObjOff, deg OBJOFFY = 0.0 / TCC ObjOff, deg CALOFFX = 0.0 / TCC CalibOff, deg CALOFFY = 0.0 / TCC CalibOff, deg CALOFFR = 0.0 / TCC CalibOff, deg GUIDOFFX= 0.0 / TCC GuideOff, deg GUIDOFFY= 0.0 / TCC GuideOff, deg GUIDOFFR= 0.016592 / TCC GuideOff, deg AZ = 4.68733 / Azimuth axis pos. (approx, deg) ALT = 58.2660 / Altitude axis pos. 
(approx, deg) AIRMASS = 1.18604 / FOCUS = 198.16 / User-specified focus offset (um) M2PISTON= -1338.43 / TCC SecOrient M2XTILT = -4.0 / TCC SecOrient M2YTILT = -9.02 / TCC SecOrient M2XTRAN = 10.35 / TCC SecOrient M2YTRAN = 81.04000000000001 / TCC SecOrient M1PISTON= -3257.73 / TCC PrimOrient M1XTILT = -3.29 / TCC PrimOrient M1YTILT = 4.82 / TCC PrimOrient M1XTRAN = 278.26 / TCC PrimOrient M1YTRAN = 177.78 / TCC PrimOrient SCALE = 1.00033 / User-specified scale factor NAME = '4055-55359-01' / The name of the currently loaded plate PLATEID = 4055 / The currently loaded plate TILEID = 10375 /Cartridge used in this plugging CARTID = 14 /Cartridge used in this plugging MAPID = 1 / The mapping version of the loaded plate POINTING= 'A ' / The currently specified pointing GUIDER1 = 'proc-gimg-0222.fits' / The first guider image SLITID1 = 14 / Normalized slithead ID. sp1&2 should match. SLITID2 = 14 / Normalized slithead ID. sp1&2 should match. GUIDERN = 'proc-gimg-0278.fits' / The last guider image COLLA = 41802 / The position of the A collimator motor COLLB = 42513 / The position of the B collimator motor COLLC = 43176 / The position of the C collimator motor HARTMANN= 'Out ' / Hartmanns: Left,Right,Out MC1HUMHT= 12.2 / sp1 mech Hartmann humidity, % MC1HUMCO= 10.1 / sp1 mech Central optics humidity, % MC1TEMDN= 14.4 / sp1 mech Median temp, C MC1THT = 14.5 / sp1 mech Hartmann Top Temp, C MC1TRCB = 14.0 / sp1 mech Red Cam Bottom Temp, C MC1TRCT = 14.5 / sp1 mech Red Cam Top Temp, C MC1TBCB = 14.3 / sp1 mech Blue Cam Bottom Temp, C MC1TBCT = 14.4 / sp1 mech Blue Cam Top Temp, C NEXP = 24 / Number of exposures in this file BESTEXP = 116480 / EXPID01 = 'b1-00116475-00116481-00116482' / ID string for exposure 1 EXPID02 = 'b1-00116476-00116481-00116482' / ID string for exposure 2 EXPID03 = 'b1-00116477-00116481-00116482' / ID string for exposure 3 EXPID04 = 'b1-00116478-00116481-00116482' / ID string for exposure 4 EXPID05 = 'b1-00116479-00116481-00116482' / ID string for exposure 5 EXPID06 = 'b1-00116480-00116481-00116482' / ID string for exposure 6 EXPID07 = 'b2-00116475-00116481-00116482' / ID string for exposure 7 EXPID08 = 'b2-00116476-00116481-00116482' / ID string for exposure 8 EXPID09 = 'b2-00116477-00116481-00116482' / ID string for exposure 9 EXPID10 = 'b2-00116478-00116481-00116482' / ID string for exposure 10 EXPID11 = 'b2-00116479-00116481-00116482' / ID string for exposure 11 EXPID12 = 'b2-00116480-00116481-00116482' / ID string for exposure 12 EXPID13 = 'r1-00116475-00116481-00116482' / ID string for exposure 13 EXPID14 = 'r1-00116476-00116481-00116482' / ID string for exposure 14 EXPID15 = 'r1-00116477-00116481-00116482' / ID string for exposure 15 EXPID16 = 'r1-00116478-00116481-00116482' / ID string for exposure 16 EXPID17 = 'r1-00116479-00116481-00116482' / ID string for exposure 17 EXPID18 = 'r1-00116480-00116481-00116482' / ID string for exposure 18 EXPID19 = 'r2-00116475-00116481-00116482' / ID string for exposure 19 EXPID20 = 'r2-00116476-00116481-00116482' / ID string for exposure 20 EXPID21 = 'r2-00116477-00116481-00116482' / ID string for exposure 21 EXPID22 = 'r2-00116478-00116481-00116482' / ID string for exposure 22 EXPID23 = 'r2-00116479-00116481-00116482' / ID string for exposure 23 EXPID24 = 'r2-00116480-00116481-00116482' / ID string for exposure 24 NEXP_B1 = 6 / b1 camera number of exposures NEXP_B2 = 6 / b2 camera number of exposures NEXP_R1 = 6 / r1 camera number of exposures NEXP_R2 = 6 / r2 camera number of exposures EXPT_B1 = 5405.01 / b1 camera exposure time 
(seconds) EXPT_B2 = 5405.04 / b2 camera exposure time (seconds) EXPT_R1 = 5405.01 / r1 camera exposure time (seconds) EXPT_R2 = 5405.04 / r2 camera exposure time (seconds) EXPTIME = 5405.01 / Minimum of exposure times for all cameras SPCOADD = 'Sat Apr 12 18:49:38 2014' / SPCOADD finished SHOPETIM= 0.72 / open shutter transit time, s SHCLOTIM= 0.5600000000000001 / close shutter transit time, s AUTHOR = 'Scott Burles & David Schlegel' / VERSIDL = '7.0 ' / Version of IDL VERSUTIL= 'v5_5_14 ' / Version of idlutils VERSREAD= 'v5_7_0 ' / Version of idlspec2d for pre-processing raw datVERS2D = 'v5_7_0 ' / Version of idlspec2d for 2D reduction VERSCOMB= 'v5_7_0 ' / Version of idlspec2d for combining multiple speVERSLOG = 'trunk exported' / Version of SPECLOG product VERSFLAT= 'v1_21 ' / Version of SPECFLAT product TWOPHASE= F / RDNOISE0= 1.99069 /CCD read noise amp 0 [electrons] BADPIXEL= 'badpixels-55300-b1.fits.gz' / RUN2D = 'v5_7_0 ' / Spectro-2D reduction name TAI-END = 4783040074.84 / REDDEN01= 0.495500 / Median extinction in u-band REDDEN02= 0.364600 / Median extinction in g-band REDDEN03= 0.264500 / Median extinction in r-band REDDEN04= 0.200500 / Median extinction in i-band REDDEN05= 0.142200 / Median extinction in z-band XSIGMA = 1.07552 / XSIGMIN = 1.02262 / XSIGMAX = 1.18520 / WSIGMA = 1.08131 / WSIGMIN = 1.04033 / WSIGMAX = 1.15414 / PLUGFILE= 'plPlugMapM-4055-55359-01.par' / LAMPLIST= 'lamphgcdne.dat' / SKYLIST = 'skylines.dat' / HELIO_RV= 12.4068856707 / Heliocentric correction (added to velocities) VACUUM = T / Wavelengths are in vacuum SFLATTEN= T / Superflat has been applied PSFSKY = 3 / Order of PSF skysubtraction SKYCHI2 = 1.28961868553 / Mean chi^2 of sky-subtraction SCHI2MIN= 1.20423986980 / SCHI2MAX= 1.55158902875 / PREJECT = 0.200000 / Profile area rejection threshold SPEC1_G = 25.5228 / (S/N)^2 for spec 1 at mag 21.20 SN2EXT1G= 25.4134 / Extinction corrected (S/N)^2 SPEC1_R = 69.5485 / (S/N)^2 for spec 1 at mag 20.20 SN2EXT1R= 69.4692 / Extinction corrected (S/N)^2 SPEC1_I = 52.3166 / (S/N)^2 for spec 1 at mag 20.20 SN2EXT1I= 52.2564 / Extinction corrected (S/N)^2 SPEC2_G = 24.8694 / (S/N)^2 for spec 2 at mag 21.20 SN2EXT2G= 24.7601 / Extinction corrected (S/N)^2 SPEC2_R = 68.1589 / (S/N)^2 for spec 2 at mag 20.20 SN2EXT2R= 68.0796 / Extinction corrected (S/N)^2 SPEC2_I = 46.5307 / (S/N)^2 for spec 2 at mag 20.20 SN2EXT2I= 46.4706 / Extinction corrected (S/N)^2 NSTD = 20 / Number of (good) std stars GOFFSTD = -0.00487995 / Spectrophoto offset for std stars in G-band GRMSSTD = 0.0647212 / Spectrophoto RMS for std stars in G-band ROFFSTD = 0.0269613 / Spectrophoto offset for std stars in R-band RRMSSTD = 0.100117 / Spectrophoto RMS for std stars in R-band IOFFSTD = 0.0276089 / Spectrophoto offset for std stars in I-band IRMSSTD = 0.140843 / Spectrophoto RMS for std stars in I-band GROFFSTD= 0.0247478 / Spectrophoto offset for std stars in (GR) GRRMSSTD= 0.0879660 / Spectrophoto RMS for std stars in (GR) RIOFFSTD= -0.00171995 / Spectrophoto offset for std stars in (RI) RIRMSSTD= 0.0523789 / Spectrophoto RMS for std stars in (RI) LOWREJ = 5 / Extraction: low rejection HIGHREJ = 8 / Extraction: high rejection SCATPOLY= 0 / Extraction: Order of scattered light polynomialPROFTYPE= 1 / Extraction profile: 1=Gaussian NFITPOLY= 1 / Extraction: Number of parameters in each profilXCHI2 = 0.889458493246 / Extraction: Mean chi^2 XCHI2MIN= 0.698653000000 / XCHI2MAX= 1.11949002743 / NWORDER = 2 / Linear-log10 coefficients WFITTYPE= 'LOG-LINEAR' / Linear-log10 dispersion COEFF0 = 
3.55290000000 / Central wavelength (log10) of first pixel COEFF1 = 0.000100000000000 / Log10 dispersion per pixel UNAME = 'n0011 ' / FBADPIX = 0.0469793 / Fraction of bad pixels FBADPIX1= 0.0294160 / Fraction of bad pixels on spectro-1 FBADPIX2= 0.0645426 / Fraction of bad pixels on spectro-2 WAT0_001= 'system=linear' / WAT1_001= 'wtype=linear label=Wavelength units=Angstroms' / CRVAL1 = 3.55290000000 / Central wavelength (log10) of first pixel CD1_1 = 0.000100000000000 / Log10 dispersion per pixel CRPIX1 = 1 / Starting pixel (1-indexed) CTYPE1 = 'LINEAR ' / DC-FLAG = 1 / Log-linear flag BUNIT = '1E-17 erg/cm^2/s/Ang' / END
[binary spectral image data of spPlate-4055-55359-0020.fits omitted]
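The header cards above fully determine the wavelength grid of the 20 spectra in this test file. The following is a minimal sketch, not part of pydl, of reading the file with astropy.io.fits and rebuilding the log10(wavelength) vector the same way readspec() does from COEFF0 and COEFF1; the relative path is an assumption about where the unpacked test file lives.

import numpy as np
from astropy.io import fits

# Illustrative path; assumes the distribution has been unpacked in place.
spfile = 'pydl/pydlspec2d/tests/t/spPlate-4055-55359-0020.fits'
with fits.open(spfile) as spplate:
    flux = spplate[0].data.copy()       # HDU 0 holds the flux array, shape (20, 4637)
    npix = spplate[0].header['NAXIS1']  # 4637 pixels per spectrum
    c0 = spplate[0].header['COEFF0']    # 3.5529, log10 of the first wavelength
    c1 = spplate[0].header['COEFF1']    # 1.0e-4 dex per pixel
loglam = c0 + c1*np.arange(npix, dtype='d')  # same construction readspec() uses
wave = 10.0**loglam                          # vacuum wavelengths in Angstroms (VACUUM = T)
print(flux.shape, wave[0], wave[-1])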
@ɰ@-@!@ȿ&@@h@d\@Bf@@@@@ʟ@@i}@x2@a@zl@@m@@@@+@Vo@%@x@y@k@€@<+@@@@-@\v@@<@@QX@D-@@S@ @Ju@wZ@™@@m@@`@ɝ@@y@@@@@0U@@@^O@@@)@+d@9@@1H@D@@{@@Ļ.@d@…@G@ơ@@@@8E@]@(4@@o@C@l@M@Q@@y.@E@ʰ@@@@@HP@YJ@y@Қ@<@ѬF@@@ԧ@]@ʼn+@Od@@L@Y@k@+@@@I@ʟ@B>@ @4@@LR@'@@@8<@ݮE@4h@@#Z@V@{@@ʌa@S@@hg@vE@@i@h@ܼ@@g@)y@t@@NZ@@@ƒ<@h@@ܥn@*@ʜ)@@@ߦK@?ؿf @ٿAw@@`2@@ܤ@@W@7@@®@@@@@4@!@@N@OA@*@o@ߟ@Ӆ=@n@`=5@GnA,@@0@֢@@Y3@ @ AKA@O@'@`A)A-d@׏@X@ũ@]@W@=@@J@I@oM@R@3@r@@@5@@o&A&ģ@@J@R@N@Ŏ@7@K@ۼ@-@`@I@@@w@@|@4@8o@_A:@q@֤@2@Ѧ@@ñ@@@E@ý@/@9@ct@X@v@Á@@by@M7@@ܵA 7M@@@@@ⷬ@@Xj@@+@@}@ῖ@: @z!@@@܋@@A@'@@A ~A AщA=A HAAD@ 6@ҥ@q@ @@@$@2@@:@1@ @@8@h@ע@0@ @8q@@'@@ƭ@F{@ @@g@!@_@@ŷ3@u@@@@ˌ@ȷ@^@@:@D@B@ȓS@W@p@h@υ@շ@^@@ٿ$@s@8@ԸZ@Y@ن@տ&@:@.S@ͮ@ߙ@ޝ@T@֩,@Bu@l@娞@~@b@ؒQ@/H@r@ @w@@E@@S{A @@̾A&AM@ʯ\?u@?8AAYA@+@.@3@4@mA!@@J@@s@3@і@/@@Ѥ@ܺQ@K@@ɢ @@7pA/A@s@Q@͸@@z@@ZV@@@^@@@۪i@3@@ٮ$@TH@y@j@@'@r@g@6A@ɠ@>q@.AI@\@֥@l@SAl@Y]@(@3@@y @ׯ@AO?=AA$s@p@w@c@z!@ql@ֵ@@@R@s@DA @@Հ3A @NW@V@A XH@@ѫ@@u@@։A@֊D@@ۈ@0@w@?L@'@ `@cA Z@@?t@@j-@ƹ@ن@{@߰@Ž@ߏ@z@ʹ@3@?@@9@M@vi@@@κ@ e@"@r%@@@@瘎@;@܅@@vd@@Ӱ2@?M@Jb@@hS@ꎠ@2@"@Љ+@co@ @]@@C@0(@m@s@md@:@ D@B@@p@@@CI@@U@*O@}@c@@@$\@ @@M@̴@@Q@@s@է@/|@]@،@O@@k@ @U@}@@@@_@ۆ@@&@|s@@,@@ǂ@U@@@w@N@M@Hq@s@씍@{@ˁ@ٳ@Ȅ@@@ @@ܥ@@ק@Ĺ@ٵ@@@T?@@@̙@6@"@@|@@4@^L@@4@@A@̘{@_@ @D@@ϑ@W@ط@0c@ Y@rZA?@A+\X"lH@A?r@@.p@%%@@A;@p@i#@ì@U@($@ @'@@ _@i@5@.@Үk@@dp>mf@A h@z@@N@@d@t@@@E@ӹ@Ҹ@۔@aU@ri@8@Fb@ǤA@@wA*@@\@r@P@@@R@E@@קAM@ӧ)@݇A@-@j(@[j@棁@|Ad$A^@@/A '@֒N@Ht@>u@Z@A@@<@@$@Q@s@&@@F*A@]@@YA qf@]?GA @@۴@@U@ @I @@ A'@䆥@–@@A@0AD A uv@X@@@@ޘ@v@AA@%@uAcA;?@4A@Qt@Z@ @$@@@߯.@ߕ@AU@Gs@@<@ݥ/@%@}@AA@@KAۃ@@[G@yAI@@=F@ Aqi@A.FA5 vlq@0'A@s@?@A=@AA4@U?gAA @ףY@'@8AVA;5A o@@@Z@^U@xs@+AA@%B@R"]?k5A)aADk.A5xV@޸AI%@ݭ@:@A Y@֘{@1@*\@'@ @mo@eAGA>@%@tE@@AJ!@bHBXA̵@A&KO%?@SA@b @A@^Ao@o@{@2\@P@]g@d@@<@Ms@^AAJA@A@(A#i5Am1a!.eAM_@A<@c@ՇA6@ı?@=A A@P@"@WAdQ-@?'AJ'@@ݔ|A@!@:@zA@@ 4@A@JSA @z&SA. Aq$VA?A6@@@AxDAB]74m@:Mr@z@A @5>"A85?K$l^@@ʕA A ?}=ax@El@j@sA!UA@*@Ưh@[Y@1@@@O(@@Nj@@A@@7A eA/A-Br]¨'?A@W OF?P+$@AE@4@=@@@@@[@.@@@@v@G@@@@@]@.@@@@m@ >@ @ @@z@I@@@@@P?C>.pͥ,u qFv=?V!?@8EX?cUz?E<@y;A?.@ƙ@ ú@!>2fF؍Ja@N?@ ?=Y??EK^?@~̬qq@a?B ѿ +?ɏ?u?˙@ ?%97,SwKK@y$@ i>_>@,M@b?]=$?@o>fƿ?Z@U7O?w:@?%>{G^?QX=?@@Q>K?`V u>2>*F?῵ ?}a@m>VYѿ6@40dڿp?$=?}m?J4@A?҇V K?yɿ$l̖ؿ >Qǿ8Xy?A?qE>?-|?5I?ʿ!MGHAu Ӿv??= k ?T4f??yٻY=˟S1>? ?H-?'?V[>=c>g}`??.A?9>@&?[Džzƿk>E@C~7$}?u?\~?ݪ@ Ϟ_>N1???ʿ/ ??-%F?u(@, YSI <--+?m>?=?GٿHf@}H>j2>9 m=?c>Q???Tu?zlbs?7?g@s7@j`?> ?4{?> ?r>:??T?k{@?G>1$ɾ\Z0?ЃR?>^0G?@:?m= ?Gޫ?lr$F];@(jI>w?`?GH?}?\=Qdi ?Ƒw?j b>ZK@鸕(>i?B?k?N??)!>ƪ>c?(?>`]L>:?E@>qw?J>-b˿Hm?qdeT?@U? ;>?> x=P?-J>P`?e?ZZ>C?1?RN҃*E?G? ?mnrN@?%?.?̨?@$4v>Q?MQ;oZ>d>O>6<>Q?K?- lY>?G!?d8+?W[=;>_>q/?#:?޵eXο\޿ud=@?@o?W ?xܾw?V⿄I틷u?j?M{g?7??0\>??4 ?]?ſJ>A??>aQĿQ0>@9}??S>޾MgE?8R~?U=0L?~kC?[!EIc!>>\,=Ej?=24qp>Eq?>>Xk>։<0?@?J{>D3=Em>ʽTH>'?Ɯ>$>S?>Z>?g&>rϾB>=+c> >@>f6ۿ7@w?B򲾨 V;DO3P'? ?;0?xV?^$>io>pV >?>0??Dܱg;8?l?? ?L>$=?#ƴ?>ܾON?`.Ρ?tW?ƿ*ѿNNP=xK=[Q@>_8J?R?L?>1d=>>"?{>x)!??)m1Bh?>?k@?,=I?^!?i翂^>A?QY"> t?ۤ?}>O=>T?ԥ;v>k>c >>wm>+Nժ>Y??q>>J?5,?F?F:{>'&:>(?H>?W?,=꒓{d?>=IhP,vmG$>?n3?V>P7?C7>:?)>\v>R{|?T8Z?)>B?LG>&ML?_Q>Dl?g?K=z=>>H?J?ޙHpt0>!>ᒾ >W?n?kq?5x>Wy:>x?l7?;"OO?nv?f] =gy>I*?*??M?ADMѾo?JלB>M{?u?/Z>Ԓ? >h˭/=b=5A3??Ƽ-xZ> h?[?UB>u=R+>\7=t :?K,?SL>?oMQ>&J?G?kE=H??c>K˿2ot?50~?j~;? Q6>@h>;> r>?gU_??">/H?G?+"Yy>$[վy>5xH?x4>?C9?>5=г?si?1-\=g?򻾡 ڬR> +>н?y>?"k?&?|?b=<փV>>i^>BT=<0>I??>>?f10?AY?,̱?>)j*Zx?Dn?>?C=<&_?7?b?t ?Qī?CN?OO)Sr>p~?>Q3z [.(7l?2.>" }=>Ҿ{>\<6??E2>2>֩?#>?:'?҂/>=x>=!=g>N]c=CLK+>?0?=??;A>k>?!!>{N(>,?@NPwm?[{=Ww?L:O? W=.,ԽܷG<}>?>Uo.*>>>)pi^? ?*?Xmr?~Q?;b?<=r?J=>j??A;?J=K?F?Tu>??^޾oſu>r &>fDD?K?V1>?E?&>>?U>?0S? J?'G?>R>Ck2>a?]?ks????ӟ?RȾ+ѽtP?4f<s{?Je?>ɱ뾘 >^?:>V,.>i>᡾4 ?^J>gܽ R>g>nc>n>?/? 
?R>N?+$?#]??>$h?{<τ=">9 ~3??'A>1}?-><8?K>Ҩ? !?d>PB?h!?.I>?W?/޾Q?@U>ֽUٽjy>U?I?>֧?bh?1>!>ݻ>~?A??Ք??d?o"=$>f?)=Rm/>Z7>^>%?M8?W?>s?V:]>R?5v> X=rh?xJ>ܸ ?'̍?57?>i?<>?%C??I>@?x?#?3-??p @⟾? ?`?4?`>@'=\> >6o?2K?`n z/?c? <$>,9?\@?C>>>d?>=ىiFL>p?}?LVJFN??>[A?,H??q?s?V ? ;>Lk?Qx??ϫ?@d)?Y?.?ٜ?]j;x៾.>ϟ?=?/?$=0>=$n2ʽ?.T??Sf>?i?w><>Hy?\f?qT?My>?'>5rN>X+? %>k->7->?\?mc>,?y>|?Ls?z?3>>>?5?"ͭ?Bw?gV?9>=>@T`?Q ?{օ=؊?o>?I??B?o~?~as?^>=z?0?,O?A\? X>>ńm?!?\YL>۩;?&??:??? ?+:?V?%h>rR[> )=A>`?5??r?h?>>?S:X?n=>^>{>G>N2?y?$Խ˱>>Ѹ?Q'#?0? d>#y??Y?57?nr?|x?>ߪ?>|>0[??7{D>2o>?™?i??>s>?/?eV?tz?I>:?k ? >S=6?U>q?8?6 ?p,?lN>ݳ~?3hu?r:?D>>w?)>+?>)S>?:h|>祥?cU?>>>3`>[>>K?>p>\?0d0>3/?džJ,?`\?Wy>?i-?H=>|?81T?ak?#?7w?|>M??>Z?B(?x6s?ޞ?>?.?Nl?'?r>t?|??CG?cX?m@?M2I>?>"?+?a(> >>/?& )?)?:> E>77>{N?|?a><> ?x?" ?\2?; I?Ǧ??&=>&?:?Phf=1y>4)? !?D?'??'nxs>?u?J>-t>_m>I?T}>?9>F?/s!>Rc> ?=c?2q?7/?iъ<ڠ?:l>F@>?p8?Yk??#j??pWo?!k?0Q?>I>L>"d>?O2>~ ?c ?YA>f#'=x>@0?(???p?2S?*?67>>E?ߋ=F>?E=<?eQ[?X>S?> >?,dw='+>!)>Y>MyD>2?$>?1+ m&>O?>xl|>O">?!Bb?Yi>{ ? a?I?hp>P ;>[?(?,-?o? 7>O/>F.?28??W(>b?_z?+=J>-&?D)<>|?%ۧ?]$?>?}?h"?]>->`???S? ???w3?(?>?e?2>.ز?$ >f??Jr];l?}?|F=6;?$?;5(?N??s?|?{??f??-k?9?*?Ha??_$?=͋>?§?_?Lɾ@? >>\?,ͽ?>YQ>Z3?nɤ?!?<K?xڱ?j>q???~>b4>붰?\>>_D? ?0???4$>@>jX>>k?b?ݤ?2 >h? X?E>eW!?>ܪ>D>;$?E?e?/>ֈ?e|>=>?}?>>>F?O5?>r/?"S?F=0?` ?_mu>?5Lf?8?9I?_?g?a>.???F?S>η?_?/XR? ?vz8??%$>3>?=?\W?F[?=?r]?1>%??A>[\?bB>> a->0?S???1S?p?`?C?B׃?>W=0??{+?= ?_?0l?/ ?0>X?mu>ǃj?2?->L܇><6?I>F>w>->:?q`?Yd>t?[+?I>9>Y?H?ƻ?>I? B?ͮ?3?G>>nw?r?r?UB?00>8_>?Fz?;Y>^_? W2?N?q~?]rf?9?7?S?~?Q??l?/?OS??\?' n=>=?{_ ?I|?)?U?t?A|>?[>hF?><"?(>_T>?#@?$>>ڱ?R?d(?}(?B?P/">sYp?D?_?73?9C?W??d??ߵ?f? ?>>Yx?C1=$>P>Ġ>?F5v>+?&? |?J(?s?n'?Ҿ>e?.?n?`ڋ?$??m@?(j?0'm?)>S?t??7u?lx????y?c?l ?c?.4?VŸ> Q>? >8?Q?݄8??c?eo?\ _>o?R?>?uM?v?.?qE???=$?v&?R>^4?+[K?.?8??~??#??L8?7?]$V?F??}?r?2>[> ?;? k??^?Y.?.?<>A?G ??ܐ??@>/>,?sG?s>e?P?K?@>?R?? X?NX?m<>z/>bu8?Z >??־Q?- >s.?sv?z>>͕?Ut?{5?y??B??|?!?$5>c?m?X,?f@?/?љ?Ͼ ? ?nP?:??/?}Q?Q?0G?A?z?L?T2?Q?e??Q*,>)??6:?X??i?b?780?^?aF?b?!*g?^I??w??t?%/>=??ϗ?}??9?}?a0?x?+??6?7?(*>DJ???cM?x ?M?[?R!#?T?\?oT?Zn??n? ?0?#,?Qg?S?S2?&?L|?l@??`?{F?Ӣ?`?0.??{?ro?U? ?' \?|??pQ?e??C ? ?4?v?Z??} ?Wv?h?T?O8-????8RD?Gm>ĕ;Q?bc?Ճ>w?w?*?J?8|?lj?q>֏?573?VO?#???M?1?ʇR@AFA(Z@:)?牪?x?B?g{??dp?M?.???Ϲ???x.?%????O?k??|?Ka?? ???n???S?^??+??m8U??? ?:?|? 1?z?,?Β#?b?D?YL? ^??oO%?s|?c{?J?P8?{i?P?m? ?y?D?&?FE?3?P=??o?=?(? K_? ? ?}3?BD??V?.?2?u ?q?@p?d??+?`??G ? &?M?F?1?LS?, ?>?Q?Й??~{??)?c?d?-[??W@T?^E?2?o?O@r? ?=Q?|??ޛ?\O??D?N"??-???n8?9*?)r?a-??Zl??3o?X?N?a??v?a?'?y?L?x?q??AM?uHy?ק2?eM]??D?gDM???ַ?,?Z7?̄????,?S?4??~?W8@j??8'?Rr?3?(8?f?@??h~?Ag?^1???I}?~n? ?ef?}lr? =?b?2g?!?C?t>*qL@(@C¿?51@lh?۳??,?l?Z?(@?5?w?cI??E?ɼ?v?.?@?̳?Ŗ@-?7?.??s@ ?rq?I??ʥ=?#?? ????ޗ?Eg??ʏ,?h?w??u??ݜ@ f?פY?%`???Š]?Z>U?.????bK??? ?L}@?|?r@ /L?h??i?k????Z?O?N?d-?N\?Zs?qP?%?v?N?AL??\?9O? ??? ?\??Z?J;?P[?K?>6???"?ԘM?!?o?ag?W?ш?R??????P??C@ ??]?x]?7@ @ ??I?_H?P?G0@ dk@ L?i;@#?G?߆*?? @,4?BS?|??|??T.? ??&A??Ŷz?&???O??ٜ?q?sŹ?? ??sP?p??U?V}?X6?d?w??ˆ?͏?_?g???7gS?KW?m@@?9?_?o??WV?Xt?}??Bܯ?;8?D?`?@?d@vN?T =)?<@?*?Z?2Q??T?I?[>?/?R?s?}{???U??/(??O?y{?+?w?*? ??? ?V???b4?!? R=?E@pQ?&??W?s?K6?_#?A?H?&i?Θ@j?/?+$?5?XN?a??x}j???$?R?*?g>@V?y???g?H]?y??n??? ?U??7?11??1?uu?u?3@@-?O?_X?ڌ??>L?=@@ ?Ep??cq?n?"?j&??U?ʈ??R?s??k?O?yP?H? ?P? ?ՙ??e??~;??p?%?@ /?U?T?}D?U'???j?F??f?}z?-??v?ڤ?f>?0ۑ>)x?@ V?W8?????{ ??۴?F?s?c?lb?X"?@ N?f??@@??_?d?ƞ?S??ӧ??n?C???0R???[z??Ϋ@A`@?e'?$/?????>?V?,??o|?c??%?@?P?g??f? ?)@E?H#?NS?Х,?c?Z?@?ni?~? ??ҥ?2?[???ة?i?Ƶt?^?/?xn?e?(?X;?S?d?ō?ߡD?%o?VE?I?M?{??zF?t?Ц?["??ጭ??xe?F~??k??@ *??;? ?d:d?r>??Q?B?fV?й? 
H???@,$??{)?:R??^?c?^?%?y?P?`(??6?i?@??3'??j7?ء?y??R???F??P?I6?~W?q?‚3?Ƭ?8??0?'?;@^8@'e?'?6-?ɱ??Z??)?E?+r??M??<@??ry?V?^ ??'??E?0J?7?ɲ(?%??.B?"?G?|>?,??oT?u?@F?s?~?(?{f?c?;?a)?t?J? j?\!??#@?3?j??q?@ !??$??]@ ?Tg??e9?K?ؼ?K??li?j?l2???˫?Q?`?n? ??S?@ =?l?~?t?v/@{y@9w?+^?ԟ@??j?e?Vm?Iq?L?֕%?r?ch?>?\?@)k?1D?G;?5 @G?#?T?P ?h?LZ@??^?NB?? 2?[?R?'@ed??ʶ?j??؁@[@F!$:>.?{@=@(:??S?t???f??@n?@/V@ #? @???@RJN@=iv5?C@4@$?`@?i??i3?z?Mr@ ?=@r[_K)???0}??̀?@@,$?I2$?@Y|@@@F>>?~?I?е@GS?ĉ@?Y?/o@ R?@"|@?IH???@u?1k?ב@??F?c?\?e?H@s?2?|-?D?Q?Գ!@ _?K?A?=????u?%???m??ᥓ??X.?i??Q?M7????.?? ?QZ@?>o?6F?x?`%?pj?,{?"0?ʁg@??.?=!?}2?}???ѐ@??z?H?X??d5??h?8^@7D@JG?UQ?`t@3 @:٤?|Q "?wS@ G???o?@???ߨ?\>!?h@n@?%?f?!? @ ?~@ $(?ß?b@@)q? ?\?b?@H?+?Т?DŽl?ا@#?;??'??P@ ??q?ĵ"?Ĺv?,@t?J? ?m???2?)?T?7?u?%w?aVJ?ic?С?̪?@R?=?s@n??{f@(@@ ??$@ k[?I@@@ ?? @q?^?c?@.o@Iz>L<`@h?@2t@ׄ?-@#9@ p?)?|@j@".?٢? ?@?s?"l@˓?6?;?F@za@Cѿ?M@,J?~??u? 4@(5Z?-?i?G?`?ڃ@IN@wS?`@/?΁??VT?y[@ R????t=?{e@ @@ @A*@G;@,y@f@K? |@5;? e@@? ?Y1@7@ ?83?%w@L?l?@-?@>?ߥ=@#??j?@U?ߍ?4??Kd@m ?&'X@:T@5/?ƲP?[?4G@I?@k5???;??g?Ş?@op@ {@1@ ?:s?S??Q?Ś?޺]?K@}@f> 3՛@?\?IC?H@D@'?!?@ 7@*x?i@ Hf@y=z]@J?>Ɔ?l?މ@6?? ?G??&@@ c?_s@Aʊ@cU,?@??@ ?/?]?v?8?˻?nD@?ݎ>z?@ @@m?Ԁ?=?u?=?k:@@? ?!O?֢@ Ɔ?t_?F??H@O?;:@ `@?v@ @n?U!??? ?P?w?go@X/@6?o?FQ@w-@+1??)T@@??V?LJ?ȃ?0?j=@ @?~?T?Ƀ?9? ??Ys@k@k@O?G?xo@? ?ښ?lB@)?o?*@ UI@@d??Z? ?,?/?#?G??0@=@@@4k??g?@-?݆?b???d,?oD? ?<@7P@E?6_?ׁ@b@O?]Z?x?Oi?D?u@3@ ]?u??{??O?f@?@" @E@[>(n^@@uc?H?l@4@^@c??C@?L@2?%n?`Y?ʆr@:%@ n?7@?I5?E?ԝ?Y?܃@F8? ?kG?@U@Tſ@N:v@߬@@?q?=@0>mJL?*gd@@7@g@#?@E? z@"M@M\@@ z?I$@Ib@[;d~l@@?s??Z??@ $@!> ?l@'@?H?Ò@R?U-v~@t?څ@ N\@Q?J?.x@ o??ł??;8@ #@^?@@@?Q?_?B? (@@?ǣ??֐??i*??ܖ ??M? @G??f@ ?i/?S?5?#?M?@{?@yy???!x?^@@w?@@ ?ʖ??ӟ??jj?΅o???b@ @>?ax?@ ~O?n6?P?۪?o@ TR?I_@A@+@,@ 4?@-?G???ի@>K@Q2@ ?+i@s?@AG?@?P=?΃@:;>> A@?{*@??'?؜??/?o?0?@K@9(?榒??@???n@ ~?됺?G?s6?n?aP?Z?Z @ @?P????]?$w?@@D?P'? {@ ]i@i@!@B@X@?b*@@X_.?H@F?9?h@.b;@$;b?T8?'@@k@b?q?O~@@+y@@X@,;P?@'@@@p@ 1@gd@6\@@ @w?s@Z@U@ >s@)g@Q?n??@@){>|@%@j?L@d?Q@^@*@m@FYL@w???*?T?&[@c?|M x@Q2@ ?@?A+$>K*@W?Ȭ?"?AJ@@??I@*@ 'p?݅?@~i@[ld?6?tD@.n@@PPl@@@s@"?Pa?89?E?(x?ͥ@(@i@@N&@l=? z@.\?G?%@nP@m#?}@'?P??ʐ??n?Z?[???Ã?Zp@J$@#,@1+@.?c.@@D??:?ED@ ޸@=@?\!?|@O@Q@(~@@g@)??:?V?.@????4O??c?Ɣ?V@-ï@?j#?%Y?S@Y@??$=?[t??{@@C@ ,?Z6?I@k?@?ϴ?h?l?l"?Ǽ"?͆? H?H?>?xu?y@ md?j?/@ ?E_?Ԏ@$^?:D@"?U/@h@oa?ےQ?ׅ@&)@.?a??I(?X?'4?v?呛?@vl@Wh@??6@'@]E??@ @@@|?@ +@@`?v?ո?#?E@*%?0??[>?? ?,m?R@]{;@K~?B???dB@ r@Jnb@V"B@6K@+:@-@4KC?@7*@B?@py@?D@,m?tX@8@ϿrdAp@Aˈ@IO??}%@( #@~W@@@`@Z@G@ #?Ȋ*@{@g?1M@~Y@^@g@7[s?9?͢@@n?6@ H>@(@@ d?k@R0@|??t@V*t@Mi?ma@@;@.O@ ?>*@?d?D?@ m߿)@A @h??op?c@.$@03?;+@lu@u?W[@ĉ@ @@@1@I@>@@@??r@ ͠@!bj ?@@$W??bM?/?դ?ھ@G6@Y0$@a? @@@z?[??ꁎ@@?{P??͊$@@&??@V@?~@%@jM?;@_y@s}=@@#>@VA@8Z?@1Ԙ?P@C@d)?@%D@(?(3? ?@ ?@p@an@'@>}@&??8@m@2 @4?,@!R@?@15@v@*@c?z@+b?$@$ @#A?p@!@8>Q>@ @@=@/F.@(@S@? @@#f@a?@@!̹@* @]u@N@;!@ރr:M?@n@TW/@@eE\o@\@wtu@9?s@@@?R @A@)v@@:@W@Az@#@F@i ]@\s@8@@b@*@Ev@*@$@6WԾ@F@R@d@ |@_(4?=? @;]T?@1\@`D@*0>Ry>,@8t@u>5C@E@2f"?@eo?Ž>@Db@qM@y@%E@@R*@T@ѿ@W@NI@:$?4w?@[@hљ@0c??n? <@}@7s?2v?@@#o@Tv@K@h@2@t3~@V?7 e?@(b@v@lF@@@s@$@@?(@L@x*@ @Z@pF>6A @oQ@@Ep@@ѰK~@V@@}@%Ӳ@7@r@J@m@$ܾ>g@f@U@:@X9@S-u\<_?m @ A|@ʃA^¿0@d@jAfA#)1AUr8?@@@m@@P@@4@@@@@i@@K@@+@@ @{@@Z@@9@@@@@c@@A@Ǯ@@ʊA A @@@Ĵ@(4Z @Ss\#u?O@9$.@- @Ē@G=?H?] @>^+@"`@r@#@#jv@-KA7@,@ e@@^w?@<@qډ>n?sZ@> οv?E6@I@o*5ŭ"@D:@/ Jr@7C@ƴ@7? 
@y@x?>A8>< @M@ʼK@?@>@@'3OGY۽@ @B@=9@|{l?T??C@@_M@@23h@p@v&;*q㔗>?C@x@GT@@]׿f@b@7@NS?_?_5@l@@@t5@G@A@YE+@g7?]_?b(%@>ޟ¿f@A?>bʇ?B`@+@h/@@: O?Yr?g%&@c ?@?貥Aa@L+???B?[@@eo@m@sG?@Or@#k?D?t@, @w6@*k$@]p@?W@z@i>J>z@z@@o}s?@A?@Gc@dQ@Y@)@#@Y@-?|?O?h@+@K?{?*??}@ @zU@z??@3}@4@P@8V@@L@/ @|> Ғ@AR@]]@VN@9@&@~sG@J?_i?*'@N@ J@v@@VG@??>*S?d?{@ @ @C? }??@%H@i|@@B?A?@&,#?@b@L^??ț?1@@Kv{?p2?v@U@i͎?88?%`@`;?O?@@Y)?@X_B@@^S@ ?}@@ϗ@8?>7@@C@?{ܺ??-=?lS@!g@@-@@f@r@5@ls@d@6$@Q@B8?b@@;O@b@6w?/@_@n9?9@ ~.@K?P@M?@@@=@-@%>v@VV @?Z?h@6@l@S|@%X@ox@D@,E@@8@@]W@]?bl@e@ .??n?"W@ ]@;@5F@@=v@Uk?@|]@Z@p@@mR@eB@\ b@2&@@{@,^@tt@~@1k?E@f@q[@d[@*W@w@"*@@ù@PIH@RS?o??-@b@zѳ@$ _@I? @)}@PRN@P?>@@&@"D@{W@$^Y@\H@R@vU@G=Q@U#@4@Y@Em@x@^P@5@ۨ@Q@@g5@?Z@'\@8xi@1@2@E@15@2K@U3@$@N@I@X@C{@@M@)@8a@@+4@" (@ @m\@S#@7 @d7@@9@AX&?[@Pu@:X?@?@,@L?n@R=@7?<6[@A@v@O=@cU@+@@W@7[@b@~@wD@FP@/38@!@e@~V?1@-f@!<@j]@@o1@N@.-M@Aj#@B@i@@VE@@V@o@ա@%@gx@'@u@E(|@p@@^ɖ@U@b#@{@+@qh@9@xu@@@@m@ @@gp@L@k+@~Bw@!@@?@^@{J@$@<@.c@1@@@8@cֿ@D@R@@/@`@A@ @~@]U@s9q@0@G@P@g @h@ys@*7@IP @?\@׮@9@e @J&@Qm@Se@D@>e@ZE@f@@@@`<@hb@}2@t@s@Ii@x1@x@V8@@@\@z@s@@}1@X@@RxR@@@@@P@g@wb@@T@VZ@^:@e:@]p@U@y@@(x@DG@t@oK@xS@<@zA@q@@@@3@@@@C)@6-@@Q@:~@0@crL@D@O@ T@@oٳ@&j2@8f@|@'@ic@mm@b&@Sq@Y@@@F@0!~@M]@_@ @V{@@@ @G@i@ @B@hS@is@l@0+@ZW@@sA@O @ˢ@P0@@Tz@@@r]@M@yA@[@J@΂@h@G@s@E@a@!@;'@?O@@u"@@@s@c@N@@G@@>x@EQ@^GS@p$(@i@Z:@*@{@p@l@@o@c@|@b@@@I@m%@];@@A@b@)@@@^@@&@I@~@ek@"@tQ@%@Œ@s@@b@L @1@j?@R7@g@2g@I@mPs@w)@wz@@Ā@@@@Ϭ@ּ@vz@x@y@9@35@-S@h}h@O @r=@@@[%@kDD@{@{g@j@|@T@A@c@@@ug@i9@B@l@@}@6G@@ _@X@N@$>@y*4@xJ@n@X~@z@uw@B@@a@{@V@5@`!@~/@_=@c@P@Z@t@@8@T@@@@E@m@\O@j:@,@0l@ @{@"@^@K@}q@w6`@C@[7@@@x=@ @n@}@o@*@tc@@\ |@I_@@n@l@[b@-@_y@{Y@wI@sV@@@\,@T@`@r@@$@@@@}L@LY@@Y@f@b@s@@@v@ @2 @@s]@@d@@O@@*@@v @K@R@m @ҕ@ @@F@-@ra@|@@v@fJJ@2@@@@o@a@o@#@y@m@@\@72@~@@tw@S.@6S@Z@@@( D@D^@w@@f]@L@8@@>}@@@@xp@@wR@@b@@\@kT@\@@R@@@ܛ@t`@1@@ @@uy@@x@_j@@v@@@j@h@{@@@t@4I@[@@@<@Oe@@2@h@@9"@+@ @@@@X@@D@@s@!u@͎@fL@@@@L@@W1@@y@P@=@@@@9@O@̰@@ʰF@@Ұ6@@Ϻz@@oq@՗@29@A@@؇@͒w@̳@ y@|@@r@~@[@%@?@+@@V,@@J@s}@S*@b@V@@L@ǰ@n@P@Ÿl@mf@"@8#@΍@c@/@ǣ @ܻ@ɰh@՚i@=@9@@i@|k@p@֨@ϒb@^@D@ P@N@@^g@h@@|@2@@p@B@@&@V@#@N@؏Y@@ڧ9@э@~@M@o@F @A8@C@1@ç@P*@@w@@ͭ@r@@8@ɉp@@o@c@@ʨ@T@x@@@>@@(@@i{@d@E@@rs@۫@@@U@@ͣ(@@E@/@@@@U@{@@3@N@@AT@)@@l@@@@D@@@@d@4@ @֨A@@@@T@@]@3o@N@g@@>g@[@eH@`)@,@zv@p_@H@ޥ@q@@@@@ş@Q@K:@K@<@(@S@@a@=@C-@㔸@f@ϲ@ٗ@$@;@ 4@0@ @kL@K@hC@h@ǔ@+@u @ @!@f@@ι@D@@@o@6@dd@s[@b@Jb@\@[@u-@9G@N@lRZ@g@W@Q@Q@'@@@@(@@@ @@J@ӌ@J@О@[@4@k@̵@@@;@l@Ň@ @Ы@7@{@L@ֽA@˒@b@@ӏ@ɋ@@^@8_@G@@+G@#@@:@:@@@@z@@@@R@h@@z@@Ez@N@@ @ @S@)@"@qdAA@%@`/@W@ǜ@@ cAMA@@A$sAA)l@xA A #@oKA1A@@q@%@g@@@@ c@6@b@@ˌ@_-@@no@'@aҍ@=̓@I@Zd@*u@]+@)z@Κ@0{@]@b@*a@w@@@@̐@:@@51A=AUAA@)@O@@@@h@@̽@@[-@!#@k@1@JU@^ @omj@`@D@_:@\F@@y@Ʈ@܏/@A!@ A A,A*UA#A=AA&7A!4VAuMAAIdALA'v]A,JA!_AA!cAA#AxA+A2~A)A4IA7A)gA#^A04nA"^XA&A#A#*A.AA4A; A:> A5A:CA9A(A*ZA;(A;:A2TA'%A2AA=YA9eA)M_A*A6A>uA A$QA/A'A8A)A)A3vAhAXA(dA&A&A(W7A(A2A0aA9A6HA+pA-A0BA?fA8A7̙A9kA&ΓA'A/#A+#A'nA#A 1A#iA .A AAYiA1A.QA*mA1A4oA4A<AA+A;A2c1A.-A)RAA%A1A, A$NA$#YA"A9NA&y'A @aAOzA'FA'I:A+A*auA'0A&A$!/A&I-A6 A*wA-8A.cA*&yA2VA9m_A4A3QiA+כA$A%A*A+&A*A%A.VA:A/8A2/ABAFA9A4uAEAB;A1^iA5A;?A:LpACy$AN$A>#AIqAA@A-?AA^}A5MA1TA6EA-vA"YA-l]A0'A%.A/9A25A*#7A/?BA.A3hA>A8A/A=jA<A>$A:A=vAHALHBAB&A:A=WA>oAOA5A%̾A4IAH(`A4FA;@A;]nA6A: A@A2kJA7tA0A-A-A#3CA:wA1>A1/A,ApA!+A'AӆA0aAAmA AQAAfAAaA A@AKA* A:A5gA:0AKAMN4ARP?AQt?AOqAUxAB2~A9RA@oA?A5t A6&AFeASȽA?AB!hAXlA`{FA[AUؚAf$MA[%AdApTWAj0Aa=A_ΙAg]AjAd AdAqFAcAaSAfAgOAhmAo2_Ah\ATɨAcArBPAiZAaAd[AbMAi4*Ak^AgAffAk9Ak>AqaAn<"Af AkAhApAd>`AV@AU~AY|AcA_;ASAQbAT8ARA^\AaFA^F2A[OEAb 
AmA[[Ak9AlAkzAtAtArAzyAvEAwXA:A|PeAAyAyFApLA^AIAcA&AzAyAụA`7'AmA{An=Ar2xAyxAyHA}Aw"dAYAwA7AP"AA=AAAxAAAAMAzzRA~Z$AbAAA{!A rArAs An|AmAtAtAmdAcA_vAhAdATA[ Au~AzAh,AuArYZAnLrAuAwuAyi[AuA{lA)A|AwAjA|YjAApVIA~A_A AbA) (3@Bs[A}A AA(A_As-ASnAALArAdgA!sAKAA^ArA9A{AA40AvuA =A@A}AiwA?CA9A#AAC!AجAAA0A]A/AmA AAA AwA{AxHAAwAs /AxGAoAtOA^AApAAXAAAjIAAxA~xCA}QAfAsAqA|THAx_Ar4A~zA ArdAtjAA~'A#A5AwF/Ay.AcfA{A.AArA֮ACA{A A`AtbAv^AAz)!AA'5A{77A9AsScAuuAuA{LLA{/HAwA~YA~A0A[AKAA~cNAA@AD{AAݲAptAA{YA}AAA7AAA2AA.@A4AA!AAzAAWAlAAgAAfA?AAAAALAjArAjA[AAAzA>AALAGUABAOA uAAAAZAyA{AGArA:AA3A8AA4AʚAA A+AAHA6|AoA/AGASAAAA{0A#A%A[AA[BAAA9AcqAC"AXAްA!AmAOAAh]AA!RAKAKgAAAUA0AuAARALAACAAlAZA+cAvuAF-ZA]PUAA%AQgA5TAfsA[@AyAjyAa->AwfAn| AgApPAJA@Au/AA}E_AAA}AA7PA\ALAAWCAAAgAGAAKA,AArAATA^AŨA͓A AAAԫA&&A5bADAxAAOABAAJAA1A-A{AsAj\AbAg1AztAؽAA+A#AjAA+A!VA͠AMATA,A[AAJAkA9AOXAAA`A}AAAAuATA|5A0A)A:A|4AzA?hAρABAnA~tASAuIAxOAA7sAA/AAAAAcAAAtA~AAAA2pABA*A?ASAcaA8AF2A\AˎAMAAYArAz A،AA=A;ASA7AA-A6A00AAA A>A.OA A6AA+AAAC\A8AIAAdeA/AAAeA(AAAA?1AH9A>5AAyA{JA3AXAA;AHAصAAAAiAFAȓAu+A1AA@AA.ASA8A#AJIAAANAoAGAtAoAOA!A+A[AAACAAAAVAp6AqAJAfAuAsAAKAA[A%cAAGASAEAeA4AAA6FAA8A3^AAkAWAiAGAWAXAAAA[AwA'A^AAuApyAAA:AëA A5AaA AԡA߽AMA*A9AA´A'AA,AA"KAAA2AtAAuA.LAj&A"ADAEASAaASAAA5AA^AAAA/AAŅA7ArAAA#AA)iAՇAA}AA AA}AgA%A^AA)A{AYAA`lAA7Aj AR*AxA@AARA#AEAAv)A3A\AASA?AAQA0AA3AA:AA4dAhAA8AmAcAWA}AAAAAVAp:AWAoAjA?AZAAAA A"A AAaZADA_AoAAnAALAAA'HAyA+A-AHAO_A^A# AAAAAAKAA+dAAJArzAeAlAALAџA%A=A%AڜAHA^AeAAcAoA_bAO A"A#zAAnAdAqAA)FAټAnA A'AA8pAnAAcAAfA̫ANAA.AAA9AhAA=uACAA,A,AAM+AAA.A}AAAXAwAWJAAAsAAAAbAA$AABAzA]AA@_A!A9A8AAYA;ABATA=AA>A?A˚AVsA0AAKAAGA%AA8AմANAAAAGAA#A_8A4A&AAAAAAGAΏAAAQA}AqAqVAuAsVArOAAzA^A1 AAAA9AAJAAAAA"AC>AYAnAgAUA}A{ZA~A A̜AAmAjArtA`ZADAE&A#AHA7AyA]AAAA8AȮA!A]ACAAA!AwgAͮA$AAA)AlA̬A BArAAaAADATAyFAHAA.ALAA2AA~AAAAAAAAA%7AyJAiAeZAATA%AβA+AAq AIAAALuA ^A^ASAeAA!AbArAkAADApA~AA`AA;AEEAnWArAhcAAkAc8A;AhAlZA!AA(_AA&AMHAUAqDAA A AWA zAeA:AcA=AAA[qAwrAAOZA:A~AA%NAA-A0ADA.tAxA,AۙAAtAAFA6A!VA@AEA$A|A7AߓAAACAXAA)AAAASAA@AOAKA3ZAMA AA ArAciA!wA:AAYmA-ARKARRAAYA2AHA)AA||A[[A1nApAAT&AAASAq;A(A6ACAAAQA+AfA)A>AAU AAADAAAAʥAzeAveA2AxAKA-A~AoAC3ABAA$AlAgA]BAAAwALNA,wA7AAmAAA A;AAAAMAsA#A-A+AsA AA AA!A{A*AA=A^LAnmAAAܪA7A]AsA2A#AA A ANyA/AMA4ADAVA$A 2A@AAAAA$$APADYAAAJAAVA7YAA`PAeA{A@A#AAA_A0AjA&AlA4AƗAA:A AO}ABA\BAADA܍AA-AOtAAAvAA$AAAAA+EAGAȨAA8ZAAAxA{A AA]A6A- A8AAArAAAIAeAvPASAHA8ACAAjAADkAAAAcAnAPAd/AAAɷA'A5AA#AA AVAuCA\AA4AGAeAAJA_A:AqAsAo#A AABABAA}AvA]zA A8AA+AtAA]AAA=GAAAAAKA[A=AKAA3A:A9ASA|#AA{(A#UA AA~AEA*GAA AbA\A{A(AA]VAAASAA %A+PAAr+AfAGA6AcAA@AhA!B1UB+}AIDAAA'3A1A7AAZAA_eAHAlAAfA3AA%AAA_ZASAAFAYA|A\A~A՚4AVAqAkAAXAA}A%ACAAMAAAAFAAEA "AAA*AjAHAAJAbACA[APA=AA~AxAfSAtHA-ACTAvAmAFAAUAAAAAkAAAYAeAAnAA(AV"A}ARALAeAAA`A~A_xAjARA5AèAiA#AчAѓAAA?A^A&AjAAKAA|AvAaAAnA$APEAAA_AA=MAAAQAaA]AAAA,!A+A߳A,AXAPA&ApAuA6 AEA AZAuA ALyA$AAvAntA+AnAApA@AAAYA:AAAAzAADAAAA6A*AeA{AoeA{@A0AjArAoA/AAAA*pA2AoAAAuAA/AV AAAA5AhAV AĔAA AjAAYAt{AA AAhZAAƾAAgAAAnfAANAGAeA{AmAAЮAA޻An7AiA%AAMzAAAaAHA!nAnAWAqAǂAAHAXmB @AdA@AK,AAhA[VAAp5A19A0AA2bAAIA8A;AAEyA2tA|AFANAKAA.\AA#AAA=A-A\[AAE^A$AaADAAA:9A,Aa,@,=A5ArbAc:A|AAGAAkAAՕAJoA?AWA~AM,AWAA4A?AAAAA_AJ3AAAA)BAPA#AuAAAA AAAAA-A|A; AlLAgAA5EA,_=ArA;A`lA0AMA%AA!A_^A?aAA_AJ|AdgAAt A9NA6AyA6IA+A"Aa%AAOAfAB@A^A7AKAa@A~Ap5tAAA4AS{AGAAk;AApAAMAc8EAAAAAAA1AAA*ATDAH AAA<A8hA?RAJAHAIA.PB&@ gPaբAÐAVAl\>Ay5vAAo? m?`wI@ev?,>̾d>(a>Jqi&??o'>h ߐ-?K@|s@:e~@ n>?> @Y? C:@ ?9 ?Oa@$d.r'-b??Kb@A@1,-@ #?ʫb?׍?ϋ @ڥ@ ?Xa:?(?;fy?yR)V?@$}(ȗ@v?0`= _>0&zo޾vo?!?qm@3?=K@0ȿŏ]??5@{ިa@?P?0?X̵?@y5?NM^?QU>>VkAf/R>hĞ=? >ۙ 8s?8Y%?m.?@?w?D ¿7E;kt?PG?fLFȿ|v0>G?iͮC-??^?%k? 
V"Gg.Ur[M{>G?I@$!?YCx5?>@ !?g[.>awFPQkq-=$P@zk;Z>Ž|Dÿl>=>Կ7>E :?7 ??.& uw12 =JǷXK=BM? i'&rIԿ^ REYO|{?'?H>i0?οV{#?>*9c*?i>#& 쿪ӱ]?/gK!-?|a?qu=*|?͈QUF??G?i?*? v?Zr? X?;k= >`?R4J?fP?``zO9?s?? l>]?9xOn?X>;?e&?K̿6?D^?^?"Eѧͯ>,>2 ?|? #nmӼ“?@?uVz??)>c@X>@4mٿ>l?-d꽾`=W>{} P?6b??e_':pg[>߸uOg?5?&<3jTI>O;b??{R?Ϯ?5F?o>ӎ>?Cl>}j?~?$oo61Ͻ\.>Ŧ??Q ?Ṣ/>vֿXx>o3?Vſ:[?>Dβۿ3۾L?>h>m`W?~MK@U#>N#?G鎽I=4.R>CO>^?p"0ϾӢ ?=*xظ= ?>}n»h?s> Lоʾ#CN=†?;B= ̿2.V?:?)_>6zD?>o4B A:dh ;Fx CM>*A¾_iվ?!|>>r.?iJIxj>[Q_<ؾ&N7{@>e?)|?'8'}>_<. A>Id?7k(>t1>0u }ob>i>dǾz`½Z?I=MD>(;SU?s>ә?F?FY>BнR< x ǿ_Y Jk>n>"ν{<Ⳓ>V>;?w=,gPT=̾>L?>86<_g]zGq7,"淿+>s?VID >?WjT>>j?*"w>s+ml?z9eo?/h?-Z?~=>e?`E>2ƾt|T?9Im>>??N?1ܫ>H>`>]þd(;,??dG{?o>*> D>L7??~>f& н=? >l$>txSoI`?.?% Ir j->7)>t׾%>]>d>c >7?'3L?0]<]>૯?&>M"> )=o??=i52&>>#Է@?G=?P$B,>qV?#?z?G? DI?>@:">WiN?o3>N> a>x=J>->Ъh?Ⱦw4O_q?0/>?2>>#?C`>.ۿ $>i<-/O=5Nc>??O[>X=H,=W??M >Л?0?n?[mWϽ2?[?GҾ38??s?w?^=q?)Ǿ6 \ղ>8>?ڿ(\>AQ>e`ũ?4?*>]@>l?t?^ ;x='x>?Omv?nz=_zLP>>eƣokf>O>.\$?/> c¾"6> >Wz2>q?0=? ?=NV=q.?8>,?Ѿ5Q>7?mKQ>>=?ɅD6?jR<>O=$?m3>p?"8b`>>)?Un?8>>wt?R\&b#?9>i=ǫd>-?-*;OVNKxM8?>{F,$!b?RsoRAdV>y&?=C==Ū[>t>s>>.>g>=Լ?Z=L*8>m<-32V=uj>?>?#>>#*>s>E?BY>NHɾp9>X?HZ ?gU>$>/qX>DD>>UZ?23'טl>m;<6V |_ǾE%j>Y>1?"_fV?klm^?;q>??>@b>DW>g=KR>u?f>7? ~T>ܞ<>>|V>_>=IEZ> M>H=M>>RP>LVRIj=8?>> *=>7>mǾ޴=J 9oIyELN{f*ɻC¾Ιc|>f?a?L+^En>$>Y8>і)>eMV=ʑ'>7¾Pj,>w?+?=`? ӿ_B?>}CPF>-=8W>9n?B#g<@=Zy>e??K>8w>&?{=-ľ:'n>/6>>i??8">ɿ@`3??>o>hQ?/?6';\?>V=R=ѾɾV>`=n!ؾC>e?W_->`>,1>M\> C<=@i>x>h>}>a>NP=S_,>u&5!1>iC>>Y> >' > ?W 8iѾ @>&t>,=s>;D7]>$>B?9|>P.G=Q>?XC=>b)>'>1> }m>\̈́??z??>@>~?0>ټ|>W>~'?>nݵ= Zd?(I3??_=򼇇>>kpߐͻp>>EQ? ;?W=B|of>%>|??-ltۇ>T,>]?#ӿ?IT>Z3>0Lx?(=>zXr=G>O?=Q;i?v^m?=";>l57?nK?F>$8#>z= >ԇ?9 ?{V#?e> >z>>h8<1ྦྷꋿLr>p$,>Hn&̿!e>)?#>D^s)뽶AD>V>c=׾?&e?$莾$(==L5\g?% =g?I bb>f?9B=GG0>:]ƛ->䥩?>Axqbq ==ޯ;\>;z8>>wm? :? G˔>x?֟>=>PQ=l>2?I>tcH>ٷ?3Nѽpq>k>'1>葾%>1F?(u.>W>Rk=? ?{>'>6>? ?<"=Jt 4Z>;=!=s>_`=>@@5;? >?aRb> ?(>;=^=Ʉ?=O= t=Mn>v~?d?V]&>9C>^=\<0<o+# >.?J6>Taa>>.?6L> hw?0S>E@A?N?I\>%=}==澮Ҍ?6??R?Bz>>?c"?8>KW?^X>j>?l=d?;F.?ց?V>P?G^(/>P>̀>?=W>< 4gK>J>ڕTI>>>7U?N ">b?'\=81U?Bi?6G??P>=?X8b=־b",> t`>Bs>)m=">@*>>ǹ>A??A<}>Y>~sΜSf?-<>&k?+o?i,>Z=s?#7>T@:>ǾI>?,{>~v?>?D>~ͽ>z6>?3?->x?>.tҿ? Z&i¼/|R7>-O?>>}\ }̞>du@?UY?=>@?&*>2&>z<>Ѽ>>ax>=?>e>?>>?>?E?I׼=ũq.>l??%j?;?>hC?F7&???y"h&?Bg>YzX<^U=I>Z?/=<3>d>к[%{ˬS6Q 8=|-<4$>|o>lM>J>ZC>A6G?g?]F>^![>Oe?4}?o>rz>z>a>x2>si?U?Ѽy>>">%D< >G?_z><>*?`>?K?<}>(>/?_>>7!\S>sR>+?Z>)Y >>I T=={?!zp=ۧ>@>;>^~?|d>_>xn? >:sn>0?:? ?&>NLg>:H?*?H>+Z>ɽN?8j9>;>? >f">.4T>>>i_:>?3.?qTM]r.?TS?6>@e3>[?g >S>> >I? >I<>d>1yRa >>>GݾVg?[?4=]> >&>w )>?^ ?[}_(Q?>ߞ,#W<>XmT>?@a>G>j>Ϋ? Y>zm>$>l>O|>{{?=~?V7LB߽!aQ?e<>i?0? ?.?+׭>=<mOA>>2=u{? |=" ?*>X=7G>>Ys>>hw>7>_$>#>򉾺? =60=q??4?k?e 0?.?D>?Ө>? >U>AFp?v>'>><=8m=~=Wb'>I?4F?Sh>BC?-?.>|>}>>K> >(>>5~@>?@L>eӾʏ 7>>딽=R]=M=ʰ>}?)\Q?1?c>>>h>n'h?Gf?bF>D>ɫ>3i>?@R>;?Rb>>;~?RU>(>u>p>2>M>e>W󹖘[>`>>U5>%?*?ut\>+? ~?N=? >דJ?Cr&?>? ɱ?\>܎<̈ ?b?KB> T?w;>š>q>">d>{?B?`!?@]?z>ts>!#?_ھ9>cE?'>ͮ>n->>$>>~Ui==֨O>H>f97=7>>s=?%C>Q>==Q~VG=b>?6v>Ͷ>`H?q:<=8?mپ^?T%f>*~>Q>\?_c?=s#5=>N>K?J>]Ƕ?!.u? >2I*=>!f>?:x=![> G>>A?rC0>`[>?h?>D7/>G=%Wy=Y%?5?D)>+< >/>"=?!L,?9x>j >?dQ>?;=ʾUIS>G.y>8]?5m?L*8~>F?G?*>PXYx>#>YW>s>> >.y? }>z?C?N??@>5?4?ZI?>G?K>W%?D(?>k>g>R?1> > ?"?=?]7=?>=2?0?a? m8? |?>/a?T>O4?Jt>m=X>R_>9p=|>D?!?.=?>>?y#^?`Fx>i=8N>+t? %K>Úi>y ?>>Kjm>T?s>??>Ja>g>>p7>m??as?eT?'h?F?&?3?%? >r?>A??>Ǚ?_4?[>J?W?hR?K^l>;@>,mr8s@Ѣ@.QAA1G@j >S4o?Z?*E> 1? =q=hf? O>ΐ? ?zF>>o? ? 
A>U>#i>O?et>Z=>e>5?&zu?'Cl?V?Z??2?>B>>3OTv=c>W?>H5>K>j=2>.c?q>^>">G}۾L>뼒?>7>>>"?G?Ty?+>=O>b?97?df>3>n(=>c\>&?.?0?i?F>e4ù$>N;D>F?KX?)>?%/?9Cۄ>7,?K?./<>>2Z>E?>G; ?/>T?ap?t>>Kb>M?-}?V?$?l9?JG>#?|?=>y^?![>_=?$>4>3??FE>睥> ?A??5;'?K>ڤ> ?z=?EG{> >?ar4?h?">3[D>`?1?&?0wB?/.¾?F??T>? ?O>>c+>y>\?_1 ??? >X=ω>Yz>9>Q?>7?T?G>Lz?3?0p=S+>=~>f>,> =ďB>b="]o>^??O>:?c?yٚ?D,?4?-i?k>BK>c?!ij=@?> Y-=>}s?)o?ʉ>?M? =?e=_?2?/?1L?xnB@34@K@"z}:7@]?(:?@>??d#*=M>? g?~>|-?]s"?$K?IK ?a>?%^?Ce?pۍ?jqw ? ,?7 ?9>eY?O>P?*?j?I`lCE??c?=ID?`-?P9?؞? ܾS?՜?,?\*J>+ >>E?+^> r=L ?!>= ?0|YO7#?F?pVT>?̎?n??Yɪ? ?{^?;f=>ޡ? ?Eq?[T>b? fA?_??f>@(?i -?H_?@YX?MS)p?]">\ϽϾBg)?[P?!>c>?O C>겼>>h?l?8?AQ?/>>?K=?=?ŽD&q?>;b? ?#?]?f2>=9> ?|W?d?>Z1>e ?E's?]q}?:m?Zl:l< ?:>d?a? ??vy4?V??8BC??T?\k>L>?#P>>|?,=4Kk>>S3U=+?m6e?<">łE>?fT=E7?1%?-v?4?|0>>;?rl??p>*>E>>0m=^O>5>>?H)?N>c ?8^?jy?Eqc>K>1?]K|;?+?>竊?=J:??s?IW=+>'>`)>x?$>5>ON?T;? S>J>l,>?#?Kc?r?K?| ?:?l?->X>K?}?}}?|x?E?u>;?|?D?Ց>ϝ?P?>o? K??6 ??u? ?$X?,?dy>>?%f?>ײO> ??=?·??ӍX:>}­?2x?PD>|D?n{?8W@k?ڟ#8<տZw???]?J$??SX>>ɽm=+?_k?>>B/= D?&?NJ?3x1Nj<_?x4?gF>E>?d0???8?nA?I??u?9?}ND?9=??t[> |>B̾Ib?Wgf?h;?br??p?8?=rR?"?})=1?>i?Ox>g>?S?vh?3?T@>.> ??Y>~>!y>ߦ0>s7?`,?'?GZ?k?l??6???r_?b?Ds??Gy?@X)?1?X?q>??>?,?vdg?9? R=K??>1?&z?>/><>?*??q?]>7??Έ?U%h?D{T?qT?[Gt?~?n??7.4???^/e?}2?W?J{?m??$?r?[p?#t?{? ?l??;>?2w?=Q?>4>?:B?=???x>Բ\?H&? >>?8??$f?~Q?)GK?a?MM?(?>a?{n?Z?x)?ZC?J?]?{=Ji?+?En>?>4>d4???)?GP??-?+ ?iT?_?? )>.>"?Z?IN?<(?o?~?? '?ȇ??|?8?KN?|+>>?% ?t7?S?I?VD?\ ?]?}}>d>c|?N?2???\s?>?~?h?w?c?"}?L K?S?V?b¤?O??FE}?k?sB?+?i?^?=&p?V?q9?Crj?q`??d>I?Qx?2Q?;_?N??8?aO2?ƥ??W>}m?>?am?@%?8?b ?cX?>?ZV>-(?3f>=;Ka>6 ?0`? >]?q??:@K?6_?p ?@?0????_?!?#?9Ar]?N~?L?J?1?Y?|?Bo?jz>AJ|? k?_??es?}A?A ?$M ?e?|??iD>?Ϳ?Fc?z]?y?TҮ?!?8b?cH?e?:*??[?@?{>?G?{?7>а:?4?p?pC=?a?>!?@hX?VN?]C?~?>is?,?b?x?.?\??{ ?U?!?d ?;?Bc?!"?>?W?fX&?}?|?5Z?v?Vv>,?{8?z ??|V?L?2>M?$,?5?ǃ?C?V?)o?pl?p?/?Bz>N>m6?@?e??D?9? ?Q ??I?>V?ju??WF>;Z?#? 7??>A?E?gx?)Z?Y?+?w?_hd?}???Hp?p?Y?%@?,d??k>?f?^?]??0x?%?0>A:?bT??\l??yM???-x??Ƒ?#]??[?tA??J?=e?.?e5?yG?{?H?)4?X??!?G?^/?W93?m?f|&??<>c]?h?N?I?ʛ?T?3?D??jr>?a7?gD? E?`?ܿ?9[?ׇ?u?>1l?{??j?; @#E??u?r>X?K?J?lM?s??7?M?h? !>Y?P?z?k?F/?[?c??>?w?d?J?\ ??r?I?(?SY?K?u?{?|*?wI?????@?X??b?Ąt?'s?a?pk?̋?d?#4?5?@ ?*?*m?=j?X?E?|f?V-?L?]??W?5? ??g/?p?R?p? ?׹?G?ס?I?@W?l?t@ ?/?T=?;?|Y? G?w'??E?m?O?&?V1? @?p?U"@#9?Q?Ġ??%???{q????Ь?|xj>m|.@ @ZP@a ??}?_?^??? ??'? [?G?; ?z9?> ?4'q?~'? ?Ym?aR??8?1?3?g???? ?-?`T?G?^? ?o?Z ?Z@??K?e?N??ж?ʴ?f+?Έ?o?o=?4?.??L???_?m??Ed ?$D?w?U?L`????I?Ps?ہ?v??o???+??c??3PC?G?i?{?C??t?ȋ?lM?8?4?vL[?@ |??ʲ? ?:g>?d!???&(?%m??}|?^????+?MK?2?b??~??x?Y?V|??:7?C???΄?`??T?-\?.?Q?=`??ڇ?j?f?c?D?? T=?q@??G?U?X?l@??F?к@?c??o?5:???}?gC?w?:??[1>on>e?^?|ǧ??a?l?hq??=?hO?4?z????^?{U?nm?P?X??v?,?@?wk??K??? ?p?5]?cIA??E=Fsp@?9?@E@)d? ?+]?M^?68??L?yi???]?,??gm?me?R?@4W)@D@?E ?@f?§j?i?X-???Г/? ?l?)?Z?@ ?t?M< ???>?Mӵ@,@A4:?4d?p??X??⸐@-?_o???ϥz?^?݁0@5j>Ei? ?tg_??Z??YT?rem?+>;0`?j??`?J??\?Y?k?]?ʢP?~??G?x?V??{_@V@&.??u@ ??r?\?B9?`?lI?ꛄ???|>=?,?q?b?T9L?_=?2> ?#@??q?>?G)@?[@3+?y2?M?G?&?.?9?w??6[@p+@;C?ze?i?z[???a@s@6@??Q??g \?r ??`?q:??ʸ?=3?(U?e?@2@p?T?G?%??8?!?ød?ZY?tI??\D@~?N{?]>Ѱ?t(@2?A?ϛ?TY?p?z?y0?Y[?Y3??&~?I%?P?>?*U? ?7??lP?Iĺ??%?`? ?x?8?RPC?@?2k?vJ?R?+!?tj?4&?P?O0?u??ū:?\>`?[I?ܿT?g?W8?@C?C?d?w?Xv?P? 0???4??"?/?o??s?9?? ?Ȝ??2?E??~?(&?8?\?Ͳ??U? ?db??i?{&@ T@]\@= ?K@".@?~>E?\@@;R??<-p?3?qm?Y@GT{? .?.+? ?߃?9??????+?q?u?sx??ߗ @[]YA @j?c?s@N?;9>l?6??CN?|?s%?D?Ӧ?B?p??ʜs?S%оj@?՝@\?j?@?@?=?Yu#/Ah@t?p?,?ͱ?=? >@@X?s? '??y?XQ@:@tL?l?[?j??>?|~?#.? @,@)M?Nf?e?R?x@@GS@o?2??U??o?͏?n?j??g??!br?R`@6 @(w/?? ?@???H?^s?|<\???0?k*? ?^?b\5????!?xE?=? 
???;?p?@)}@L?n??z??Is???c?@/?އ?vc????%?U?H??-?s\?2 ? ?*??Oy?Jk?0?*?E?f??;?Z&??An?{????f?*?X?i?C???I?c@+?)@|T@?S+?'5@ @@R???*?[;?0y@T??u?hZ?qb?n?(?F?s???|X?Fχ?>?@R@s??Ϧ?HM@ڎ@x?{)?U?>H?Hk?T?c6?M?P??Q|?B @@e(>???:???b?@%9@wJ˒?@C?Z)?el?F?@oܰ??S?Z?|T?ڷb?Q?@*%@L ? H?l????@"@@??JN@?ǀ??@?z? a@bl@Pm?X?V??M>?(????,?ȗ???J|?Ɨ???@@[u>z?wQ??J]?hq?d??n?y(?y '??H5?p??2?1@o?2?l?P?Ş?w ?b?^?-??!?V?D?W4??Z??X @ό@T?-?a@?$7@-@#wl@0 '??/@w?e?M@@Fl@˕?")@v'A#@F@'ɹ@ @,P3@ Qe?{?٨@W?!?CE? @N?@?Yv?8J?d?W@n@5c@A@AzoJ@]@#@@ >@??N?|?A@ F^@;@ Ts@88?҅@V?4?s?f@5+@`)?µ@@l>E?4@ 'l@-7Ԭ:@@R@R?s*?"d@(“@@~K'@@\S @\@RD?T?̸@ @B:S~@$? @)T@!$H??d@[@R@q@c y?t?W?S?=x@"@3?8M???-@?5p?~S?@@M @#Ta?T?ae@ ݐ?-@`j9cP?k@+n?{,?B@89q@@"2@/M@>?k|?ҚU???@a?ۦ>??.??r}???E5?@.?v ?@?7?6@?YEz?N@(?ֱM?Z@@.@z@ ?H?(?k? l?&?fx?b@t8?hb=@?ԫ0?q]@K-?S?;@@s?؈?֔?{@1@ 2/?-?]@d+?r?C@@IC/@Y@t? >gi?E?[@@0Dx?#@0>6T?q@?@-%v@-?F?<?;@@ %?D?@L@6?;_?@>?ef?9]?G@(&?k)@3@~ #g;6@ ;@.#@K@I0@V?BQ@j(@9/d#@~W{@ d@4 ?f;?3@vA,1Z1jYT@@ -??C@#@+? s@A@HB¦@"WO} @:@']@tKt@VЍ@bH@#?@& c@wv@F@ S??@/?I??ӹ@@po@#??@D@XQ?@@>$]@>n?@@,[@b@P;i^@{6?.@?@z/?畃8@O2@ z?uk@(@M@{A$Q՛;"@4^@@#@~(@.@k2?e0@s@_@t?h@Aj@W=AVAd^Se@@_ ?cH@^@Ka@p?.2 @AE@]S@1@14olAC#C.9 @X@ @3Ap@9@Fi>ӂ@@/+2@#ج@{[7@! 9?'D@׾@@u?b?e5?g?j{?m?o?rb?u?w?zH?|??*??k? ??K???'??c???=??v???L?@U EڱnNl2?=Ax?z?0cJ>a>Z@?Sq*7H?M%?j MpD=# &@ +#iZ@A??e=?r`??6oο\?|?ő@i4>H>ޞ?;Џ@O?b@?0?Y ?@P#7?F@ L@7@@-^:V5E#P@*?؆@EF@fJ@{۾ը?@,|??%ȃ??!`?{Z@L"@~ľ?G{?~c]a=dp?V?>=|X?ϮLAJ3uK]Y?c6?@'?GCc@,S??‚׿ >w9}?m/?\?5Wj>S?;ݿmPbv/eȀѳf=__=?ZU?*@? w+l?aƆ?Hq̻%kl?ghl<(Y?FP>?Xm@p>C?]@?*h??5I+뿄?4?{>? ??w%é!??)[??^>: <\2G>>޹>#Z"l2R $}uS->(@0?A>ݿ ;;?`)?<~7׾'?pa׾g?^qлɚa?_D"N@9Y?2]="`H5y&:I&Ap=k>' ?1>1 >?i?_8r>RMY?@5 Q+}.Fb 8{),!]0?˶9<>"{>5e>?L?MС?"?VSOOw>}J?6?uvЫ/B>j?]^!>Uܿ:}?N?\I)-^??ŷӽ]\??ۿ)]S!k?#[ ?W5>?K9?z=nlAmCP}>%a?]??en; /O?k|?eQ?ʨ?Ӎs,5??e?x?9? S$`+?>t?rh?`?d >0S?2X.?n5>8K?Ĉ?f?A/>2gF>iD9=! $э?!?0i7t>U ?F?g,?BҨ,?4?[g>>?K?#ұH?O>.7^?6>t}?"p?" >???7*?n??b=>#=gn$JeYN:>k=>t?4\?w?FX((b?=>(P?T5="c@a+>>K?W׎?f?b<X⥾<)M?*?N=戾fb>6!?\²=|>?w?%9̘3?]0\?7?`14B9:o=f?)?OU@?S?nV>Q<>tM?@~? 4M>8b?Gξ[&2_? >?$B>nx>Z*?8>\?5?˓ 4?rֿ? O>"uέ)>w>c?u6?ā> ? 6->y=G?o?i>ڈW?A>-~>L8?B?TN?=w=? ?d.?/Ǧ?9k?]Iw>\3.='=`WLh=Z>)X3?n>郰AݽL>2?ƏOC>Ⱥ>h?.R>T ?bB?ю>l4{?Dоo_\>Ӣ?$>I h㾕>LK>h>>iON]O5Ų=R!O?-!s5? ?X?E=^ȦZ?Rx?=6ÿ!]?r'?J?/4>,OʄϾ)&>?BO?}־{*>nP?@?&~>qI?w?=ă>??s]? "c>S>g=#=tH?E@O>|ZLK>>G=QE=x=g?2u>x?0S>lg>]?jȐ?pA?rf>#|@?=? 9?uZ?d>̜'(>g?c?(??_s9R2>7><>tD?_V?F=y>M>LJb̑? '>[?> >1?n=\_e>p?Q<Đ>Kj~?z)>T#???k?a뭽 >Eƻ?o/?Z'V|o_p?6 o.?L<>?g??GG9J>U?UaTZ?X?Dzt>ċ??Z9?>r? ?J>+Jc<r?̳>#>gþ/5??پ b?/!??U>a?͛=4>F??C?4?t>>q| >K>f;o?]r?>0=Ã>>˞?J?^S?G?5> aM>wE?j?>> ?ܿ\a>?x6?[??{d>љ?/E2>ŗ&`.>ҢJ>(??P^>T?->= *?O>@&U)?^6>y?Pi?>>%C>4=2a}{%=^+? ?+kf?>-?9/?:?r'?º>2 ?4p>e>io>*?4 ?|?F?b>h> ?!ec?&S?#M?H$>1y>I?(>?9 @>j?${.?5M\?gз? ?$e?T?ё;> >>ǥ?H*>#=Q?M)=? k?7?V>n?>l?T ?cO~=NCh=?-?c?"2=P>H?0>K? {?Q|=2C?->?.?*?G#?R>P>??=0?_m>M[?1!?|>ys+>???(P? ?UB?T?R}?Ox?>n??#?6?>8??SY?%?>?=?Qs?f? U?>?.S?}>>u"???ȗ>=_>Y??G>y??*?@>w=?U?'U?>LXQy>=?4>?=F>T'??Xo?%BW?>#=d>? ?v?Q? { ?~v?Zp>ޣ?> >?w??S#{>?E\?Ww?9c?*0??f>a?p?W=#r?X>SN>k?o/>#}?\>ڎ?i0> ?f{?o1>(?64? =??o?yZ?i>o3?j?>w=N?h%>I?{??,?[>So?W ?.?RI>}??w>+>T~>S>M":?P???Z?: F?K? |?_[?z?>?4>>/?#?ITO? o? ?>?[>8d?D>X]?>>}?yh? Y?`?Ԫ>0`??m? ?o ?J>>K?E??7?X7?(M> ?I>˃>/?1Р?n>?k?&꽒,(?\?R>l">N>J?#E? h?Ǿ ?k?_?8>~>?-_Z??؝? _?? ?5? R>{T?i ?K~?U?!0Z?`?#?Fz>>?. ҁ??2w?V?a?}s?Bf?A?e>V?n;?x[>?>??Br?dJ?0|g? t?P?m?>Ͳ?D?8? >w+?#?x?50>?UZ??P`??Aќ?l&?b?y?V?d^?5?( ??#7>?!j?\?{M>6??>?/>}C?:>?T?'C?U?X?2z?d???dRA??=V?6?D?C6>BR>;? ??B??_a?4?4?? %?pQ?2>4y?l>l~?~?S>ℵ? 
?DCE?+?!X?-?RK>_8?Jx??h>>1?z?`ȷ?%& >u?|?z?F>??/?B>'>Wr?&JJ<>U?}<",> v?X>?H>??z>%2?>L>o>>h?%o? 9y?-՗?Y_#?fX?>?#?YO>>L>?> 5?W?kF?3>N=?U?B?L?~?{?+Œ?-&?!?? 3?C?t?TB?=9>?Wf?Ov>>(1>y?(?>W??L?%H?TwI?'??;?V>'s?ϟ=w{?jA?|iy?W?9>R??K?A-?x??F?MX,?j?S?f?_?SI?e???@7?!?Q?4,? a?O/d??D?R?}?$o*?D'?8>}??4d?g?F?He?>ش??T?Ӕ?GH?+??mW??g??0?e??FKq??h"?)??^?2,:???J?f?ۯ?vU?A{?X? *?=?E?Pt??G??/?G?40?{?ӣ?el??`?}?x?9i??j ?a??S?Q?-??^?C&.?S>׵>R?n^?A9>qM>s?f?-?h6?1???~{??K?`?*M?5?)?4>qO??u>u???n?u?x?Ob?bb?ّ?5??̃?d?~??ɒ?>҈?ce?*n??:?Is? ?t?Bo?.?4?~?D?d,?;?g? ?o?ĝ?ǐ1?[?A??}3?s?m?1???t?|??},?@?$?dÉ?/?A?G.?p7Q?z?{>R5?}c??@o?j?L}6?2?C4?*Y?Me?j(>`l???07>틤?&ț?F?b?c>?tq[?Aq?:jt?3j?+_? ?N?egB?9=*>_>_? K?>߂>g=>?j?Q??/?C<>J?6Q?Ƥ?b??!?3?l$?-8"?-`Y?R?>٧0?~?AJ?o??Pڛ?T'?Oً?? >>T?R?= ?E?o~j?|Z?h?c?9p?2'>?9??b$>??p?q?XM?Qr>>A?[??Jx?iK?vH?>A ?K ??&?':??莁?J?x~~?']?oK?%E>5?;?4> >}>?>a? }=>? U??A?.??x?L_???~? ?jc?{? ?ع??e??(e??v?g?N ?/?"8>E%>b?.5?.?@a??SN?^X?3|?E/?p15?͞??????y??t?t@@3@f??4C?O?ɇ'@????@E??Ȯ ?L@??O8?C??.@@]6?v@ W?N?-@M?儠??=????Ǹ?n'?1F??@@@ @?t?i?z;@?%?r?4?@\@@9@Ifx@!-?@#@#@0@4*@'ko@@#o@@&.@3@.a@@ @(@@@ߖ?m@!@<"@8y+@@s@9@-@!h@,@VI@ @*0@@@+@> @"@@r@"@''@ @@@"y@g@+@K{@%c@7@@"r@@@+@'*m@P@>59@),@y@7v@2@[@bm@ I@'@Ic@E@O@V@; @"@i-@=j@Yx%@P}J@ @#$@[g@@A%@-T@@5s@[@O@"4~@D@X*@Ng@.L@W@n@y"'@@Gu|??D@q@Yk?%?{S@@@e@@P'@n@,@,p@Z@ X@{@@,@:׽@<@bP@U@29@>@BB@@I@Bv@ >@) @4H@(@!'@@A@xx@7;+@N@Pi@?;@@B@,@4 @.*?@(5@x;;@4t@.^8@} @V @2@I@S*@b-@b@(0@%Y@V@JI@M@Y @@:-?@(@>?@"@"@;i@JJ@k@!@O@EI@<@#{@T@Q]@,@5@=@D@F!@2@$3@l@;LU@DC@W@Bf@?)\@5@V!@Ci@Ewr@oΞ@fQ@O@L@M]@,@@X@9y@ `@a@Pg @@V@\+L@S@c@[&?@B+]@Yc@Z@[6I@g}@O@3;@NS@"@4^@4}@F@So@@Fh@?@Y_@1?c@Fԗ@L~@%@,>@j?"@PD3@:`@p@D\/@Y1@Rn@'@H@M@#8r@X@ xO@$90@BY@@;5@M@5@M @:@%H@6@3_@!|'@5]@@Tx@p@i@SI@ V@JL@E@UԠ@N_@;7@b(@~@<d@k@JqR@4cP@kP@@G @a@Eu?i@c@4@V @[@_@Lt;?@s|@ a@&@S0@_@G@8K@CÌ@n@V@@?O@3!@@Z @)N@Ll@R@J-@b@8@3P@b@>@9@COG@n@]^U@+U@c@jӁ@=@J'@bf@P;d@H(@{C@QL[@+@A!@Z@<@&@E"@A–@&![@9a@w{@4?@8e0@SIJ@>}v@4@<@E@Dv@`r]@q{@j@m@El@/ @9P@Ne@Wĩ@Im@-N@Rz@@W@Vi@.@W@T@SS@mJ@s@P@KT@-@Ek@f~@h8@_@mBJ@M9@R@:@4@h@@p@yi@zy@qڥ@bJ@F/@@A@j@rW2@kO@e0$@(@\@j@i<@=r@SxU@1y1@'<@^@j@=-@TX@b 8@X/z@ER@]m@(@X@q0@n@[@c@j@E!@n@I@Ww@P@"0@I@*4@EA@@b@d@Un@9l@@S>@qM@Ru@3H@WFt@ǒ@63@Sv@E@YG@G$@fw@4@=[@T@F@a. 
@T@Ok#@(m@15@`Y@7@Z@O@ZV@] "@TWS@W@[T!@8D@]Z@py@E@?t@[ê@=@XQ@l@N@aޣ@Qa@lk@HH@g,:@5R@K'd@Hf@.@T= @pr@`>@v[@La%@<$@T+@QR8@?-@l @0^@8@Fm@F@M@V@M@I)x@c@@VK@`7@U@2@gx@>@e@:'@XE@KN@o|@qh@J@@96@Q{@a@_1@c:@D\@Of@H&@G@N5Q@@\@,T5@c:@,X@`x@} @-K@O@M@b@Pc#@3J@Kc@yt@N6 @]@IT@;@Y @Aff@-@L @: 1@6@kLs@d@Gz@v8@^&@@B@W[@/?@q@Ih@1e@)=-@RY@7@@0P@b+"@b)@?k@Oz@j@z4@PfM@9.3@]@`}@MP@Nv@ZK@Ol@4@?@C!o@Lm_@K"@>w@>@X@GDk@C-@TH @bl@bV@]I@V@m/@t@_@MA@Ulf@R<@jf@R@?ڄ@R,@;3@gF@Mz@I|t@Oo@EL@c @?@B@B@3l@7O@Qt@@J@u,@cL@?`@S(@\@&UL@G[@ND@D_@K @2@E@I^@0q@.y@B1@:5@Q@5k@ZS@yL@OQ@[{~@HM@;Zj@=@0P@@#U@@Y@#$@b@+~6@!/@K@Vuu@D P@2!@+CT@")@ @#=@5G2@LJ@O\@(@F@P@@F]@F @R@Tli@R@1c@>@XGr@Ie@A@*+@S@q@\@DKI@f'@Z5@Y@c@]@H Z@H*@Y@wev@_@DO@N_@?@hH@BY@`@zrJ@4@;j@j@ˆ@@kQ>@Y@yk@>@XU-@G~@n6@@b3@g2@CM@X @pf@V@|~@w4@W,@uv@bV@A@>@3e@23@fQ@P@@&@AD@R2@OK@5,@1G@rd@qO@Tk@lW@x@Vj@~<@|X/@[@vi@Xz@Wb@<@|O@s#o@w,@bd@>%@^@`@:-@Z"@@/C@@sV@d1@@Z@uuH@@S@V@T|@s;@ @z/r@@b!@b @qH@P5@8 r@U@dz@rV@pp6@g9@"@Z@i;@y@@@@S|@{(P@m@e@h I@a@{s@}͉@9@d@YJ@@@t@w@n8z@yw@@̺@B@v@@@t@_p@u@Y@r@zn@{>@)@PH@lE@~@@4@@vh@q8@u@h(v@_@d(o@J@p @@@oV@VB@d|@_Q@e@u@w3@gJ@iC@Jo@m,@w@b@d~@hOS@Wn@@Q@}1@_@wR@eU@v @@{6 @ds|@^@V@[R @^g@WX@G{@~@@&@M@_@Ll9@jNT@t(@r@f@L@gӊ@ @a@Yh@@{@(p<@Y/@q@@r}@lT@@z@xH@i?@b@![@zϝ@@O@sj @vړ@;F@X@ZkQ@P|@.@^?6@J@@U!@@ˣ@Y@yt@R@Q@f@`Z@U.@mi@q6@e?@Y@T>@f@qw@kB`@Y7@+@u@Lx@uj*@@@r6t@u4=@(@l@v@hC@r,@u@@T@d@*@@[@ @@к@c@s@s)o@L@+@h8@o@We@@@3@@s@@u@l@{8@@mE@\=@9@~g @{j@@|/&@~S@@x@ul@N@gr@z@b-\@o7@J@}H@l(@|D@8@2@7:@ڢ@aM@WY@kIz@cc>@X@v@r@| @f@w@U@\@3@uD@zV@t]t@b8@g@\@&w@t!G@t2@@@|r@nv@O@+@7@@n@v~@w@ua@{@@`17@d@@x@ @@zpL@=@ig@o^@b@~@~_@c@@"@q@ @FB@0@@p$@Tl@x ;@}@q@C@z@n@R@r@|J@ro@zn@iĆ@km@y@~|@@uV@\.@B@y@]*@@@uC@}6@vJ@A@y@A @!@@5m@:@zF@gb@$@p@mU@w'C@qH@@@w@J@%@e@@}@O;@eO@j@@@t@@C @W@d@~F@V%@?@p;@<@v@7@t@q#@b@@j@@ @i@@F@wp@d)@s@v@zǙ@ 8@{@<@hP@u @X@f˼@n_@_e@~}@`@z9@nm@Nh@S H@b @GD*@J3@B@TT@RS)@C@TI@J@LZ@j@|@@t?i@}@)G@@D@&T@*@z=T@¼@@e$@h @u@db@@5@p9@wf@@&@t@R@ʎ@8@B@z@R_@l@J@5@ @w@K@Q@pi@^ @m@S@_ja@Z]@o@ޟ@:@d@'@e?@@!@g@@@@9[?d@n4@@Y@vR@|V@}?I@&N@@]@@{@.@@S@`@_@4'@~M@,@j@~@,@/!k@N@(@p@@UX@o @ @٥@tl@Iu@ǡ@Y@@@@@Ë@y@WR@@#O@@j@@=@Q@}@g@Bu@q@vh@3O@8]S?3@O@“@YN@z@7@@R@&@'@l@i@j!n@@z:@&@I@TA@@QV@'@gb@f@@ o@n+@-@Ƭ@1@n@Jdu@I9@ @@a@q(Q@@a@{$@o8@mS@@@@J@{T@ @@s8@y@U5@|Dm@~9g@\@f$@H@r @5@~<@@"H@@@}@vr@e@c@@x@@~u@mC@{?@U@ZI@_@ph@pt@q@v" @F@>@|ڑ@5@{;@uu@@Cw@@p@r@'@wg@`l@w@{@)@tݔ@.@'@xt@op@e+@d@~T@`@s@x@8@}@S'@ff@q1@v@{@y/j@nS@sD@@bֶ@R[@g@o@@p@ro@@@T@=@~(@oK@2@J@x@|+@ZTr@@yw@W@`N@xd@1@`@w)@|d@vW!@@/@kX@{$@@xȁ@@ys@w@@MZ@t@+@@ZJ@|@*@@C@a@.@ @@k @ Y@g;@@.@4@@@@@y &@@} @Y@;[@P.@|+@4@x@@@u@{@%@\n@}@%@F@v½@H@@@@m@O@@@,@1@B3@@`\@e@%@'@vC@tv@@9o@1@@}@b@@G@n@ $@͝@<@m@?A8l@$\?H@uߺ@s@ @^@@@@Y@p@>D@@{*@W@Y-@@@Ch@@NW@~Q@1@@X@|U\@(@?f?bAG@@@(@?@@E@`i@j@d@@@4V@i^?u@0@/@T@@2@@r@l@H@͆@g?@T9@ @H3@0@@{'@@@Q@Z@lA@~d$@ra@۵@=@@dB@@s@@O?@cD@b@0@@g+@y@@@@X@g@_@s@@@\ @}@MR@@2@@J@@@ʎ@@]@t[@@@z@}@,@@a@}@} @z@N@k@{-@~@m @@lz@w@@~u@r:@U}\@_Y@Y5@\@F@@X@@z@ @@4^@-\@@l@m@m@@@^@w@w5@C@'@@'@G@8@U@Z@kQ@0@\@zJ@lI@@ B#@;@xI@Q@i*@a@c@q@|I@,@@{@@kC@K@@E@@@/@z@r@=@7@@ @@C@w@wH@@S@G@@I@~@|E@@T@H@@kW@?@@x @@@H@w@@t.@i@@@l@@@2@o@@@.@?@ @@j;@%@+@@um@@@-@@@v@3@,@o?R@@@@@( @4AX?8@@ @h,@wE@@@@ @=@cs|@5@٫@%y@B@@;?@ @q@ҳq@w'@2@)@3@~@XM@&'@@i@!@@9 @_-@{@`@8@\C@~x?y@.@"@e@v}@@[U@Y@i@@NM@ZŮ@:3@%{@Z@yFY@"J@@>@@N@I@Z@L@m@@*@ @#h@D@6@@#@@>@@q@b@@@h @J@;@J@@@@25@vc@K@W@@@n@,@q@~W-@u@-@5@@|@$@0@@m@}@@@@@ @ A@#@i@Y@t(@@|w@R@4@P@;@l@@k@@w@0U@<@uO@u @f6@@ܓ@b@@)9@RA+TZ.;A@M@eMp@ATU@ @yd9@1q@b@96T@/@t@@l@]v@z'\@G@U@0@3YM@tV@@z9@o@"@Dq@A@(j@A?@@ǒ@Th@ @+@X@@>@@;9@^@{]@)V@@@QΓC@@ @@@@~՗@@z #?T,AI@vA@:@9A@@ 
@'^"4DA6@n@~u.@6@d@@OM"^@)AL@@S@r@h@ad@K@ܻ@yr@@@@,|@k@m@e@cR>2w?ǝ@@8@@*@UJ@;@@ϋ@u%@ D@@?@6@@+@U@<@H@@7k5B@'@@;@#@@[@!@#@8@@0@F@I@o@ @4@/f@ݍ@+i@7@"@A@@@؏G?A@4@~@v@ }@@@@z@@@+u@@\W@C4@v@2V@b@s@di@@@ b@X@h:@@@{@@.I@w@dm@E ?/@oC@nM@%@t`@w@3@ @=k@@@G@Q@˦@K@*H@@@%S@@Y%@x%?A@@@a@@@@@|@@5\p@?k@$@@*.@wo@@@,@ ?@t@۴@u@?@8@ @?>.w@[@ @8@@{@ϡ@@>@N@u'?iT,@yA AcO#Wz@ &RA@bK@k>@@;@h>@@@#W@b@I@@5@\@}@3@(@@8'@Yqi*/@AK@83@mbg@@@l@=@@r@KHk@<@@m@w@^@u@H@@7@@l@Xg@֏@d@i@Gl2?@'@k@4@p@О@y1V@8@m0?nυ@w/@f*@X @@\Z@@R)x>,yA9@@f@(@%@:@\)@={(@)@R@3@@åE@2@Z@ۈ?˫@!?_@@@ &@@E@/@&;G1+@l@lk@\@fa@@pd@9:HV*A UYA@L@׶W@H@0@#@@š@ARA@eL@@@U@-m@1@r@kwm@@H@-@@@@@uxAkeˆ(j}AWԷ@JA'@RhAoM@| @|q@|@|'@|@|@|4@|@|@|!@@|#@|%@|(J@|*@|,@|/P@|1@|3@|6U@|8@|;@|=Y@|?@|B@|DZ@|F@|I@|KX@|M@|P@|RU@|T@|VAGLi?g꿟a޿%v?mOP ǿd Q^? ?.g7Ŀ"οy)7@o?,Vc^=x&@gv@$?́?c ?P?+!H-? *濐ؽ3zdv@ ? J5q?k] ҿi6W̊?޷{?J;@`r>`?W;?BO?@Z,'+ё@ >ݶ?&܇>p?@@H@c@`@b ?G߿=F??ىN^`?Q@{@Wa? }>s?óAR)?>U)?H@G| ??Wr1n?x@7L_@???`@X_:@^?b?ݿm>?y>H@`(s2F?+?ؐ?̍<) 8Ϟ?0ſ5ZfV,u$ qMn>t)@%@7M Юy? rv_϶Ebo­#D??bdUE ?p?>I>Z?C8?@:?:]?}?=24QK?`u 8lJ&<n<ҿu;D?[?O?1)?n? ?)?޿'G1?[?Ҿܒ}?= B?f?mPd@T/@'ե?-?0:@,@2_@!?iA@?sb@C_?!-?Mc?Rw6J>r?)@e(!?>p=k=( f ?`[?Nſ>N.}ih1=wa? y />s9TZF>K?$n.ٿ,>??@%`G?z\Q@:e@>B3^v?>^ڿ(K>Q?;(2>UN??Dž4ʿPϾ,ycط?2u= A!o>ir?> ox>т>짴?"鿀D??+$=G(Hzн <]?!?>ۼ7/>.o-=ޔ H_&K=>v?bU@? 06@9?>?z >My?Ysw?B6?Uo?2t }?t?ț?TN>u =e ?hS?@v?)T?z`K@??aq>VT>p9;?>r;?u=?!=Z>># 8azݾ/?:>ܗ:˾ߪRDĿ?Fh?g %?yԓ=kpAݿA>NPпr?Fߠ? ?=aV> >??Y?.?L *>*0?7;45?+kp??n},?k?*%[J>'j?ǿCܿ徭>8?8ܾڿ-vN>*7;[d>bkg"?8ݾTm>s0> _@ᅣ3&=? >M9CP>T-?? ?6?s?W5?w4=Ĉ>'Ce)?G-ih>? ?9읾CLT >酪? ЄP=S">ȴŽ >>貭N>B? ??/l'ɍ<>>8<<>.0?3 >os=:A,͙ y&>V? h>:>W<75?>ܚy֚>0?~I?-J>A\|,>hY?PWؖ0X? ??J_?,K0<u=Kx?e?~>M]:&Խ]>bkv>?A!;.??b ?1?vs[->= =zS?@?)l}Z^`ܽi6$򛿜[?,Q 6>i. S=#d> ">B7?J?y?.!? žAp>ێD?5밿ç>>'?س/ k& |fj,'-?8=>?Mh>֛)jǼ=Rҽٱ?[CT?n=u>}?D(*[9?|?7 >?!a>>Py>G@>l>>+' )?yS>=>_H=S/{=>^?%U?"Pο辇+?j_"d׿1>;>Q?M̼>KɿCQm>l?k2d/־eD>?>8>@?D?b-m? =j>dl?39&P<̾Ai=ų?A%=?S7S Y>ӝ>>*n >?5y?A>>&H=f?-@>G!/?5 آP׽*G? j=CUF>ac?Tx]=?pp? ?Կpe5|?T`?Xg1|P;~?V>]O>9)?uI?x>$WС?'?8iA?&!1?>ك?O=?=Ü>\>T>bνcd'fF=v#>T>9=~>O>0FT %/o{> >}˾=ʯ$p>Ѿ j?,>A=U.]>C+=>]4j@?\? _=:<ǧ>==?(*?>=>ݼ>k2?CG>}k>>k?%>'6kE>Q?'(?? wR?Dv>!=ĺ&Fo>$%>>0(;"3?Ǿ[M*>?z$ þ>x >>5(!?:f>d= ?m>Y 3=@{=R 쾢<?~px>rB>=u8N #o>|>J8J?&N?c?9,u1v>=>G(>?} sBሴ=2>>ؾuR4צ<Ȳ9<>I=g>>G ?ŽS>=mN?B5>'?&BS>E? >dM=8?o>>G?????>?r>T>$}+8=@ȩW:?LO>>Z, 1?W,>=1-?n~?;=+u<=> >>z>+Xo; =>E=o}~>p{=>H>>vν>>Gl= >Z<}־:Z彞^>)I>꾙;FQ>pw;8Ń+>.?$>Wּ(? HK>W>쌚?Wʵq?m>A? w?N:?? 3?= m?$J[v=>;j>g.>1> 5+>{A!(ڽ?L?v+O?9^|?`K ?w=> ?SN#H=a a>=n?=W>VjE?*n>1'a$>n>ڐ><>'? g&= >_ŏ?4_=كZ>=?lN>6ee>@>';>ƀI?!>%$굽vM>(= 3j3=? ?t?5>GI>]> u?HL>%EX>'?)oN>>"R?2>>0?h>?[˜=W>E׾I? >?"^|CD)ǎ>?%W? >EϽ{>>`*>^;s=U?iy@: S?d?ys^6<9>>>wCO?̕\<{>>~2>v?lj.>yF;ȍ6=9 {>˹>4m% Y $g>g?Ru?!"?Yp>4?5">V_>c?>Ua?,=>i @ ?0z]7=& >%>a?Yr=<|=lbϽ;;> D5&=[>AI?S? _>U] i? Z;f?&=o: >n=7p>- =#:>]?i޾ E`_= $>GB?`=Z)`(??6>A⋞ >=R>~?>>ؐ=BԾͿ3>Ҕ!>ջL?ry=<%!?Qc?Uc?r>w_ 1>^Q?$?=u?>Ma>"}Z>:H>h>Rw?VY?>< u=#> *>x?,B>70#?r>=n`?48>"]=o>m)\>?>gVU0=j/=G.>( 9>?[= y}:=N?8_>QʣĆY0??C>}ү>y?OE?\+9?`>>m=P?j >Ɵ=n >Ȼ>:Y*=e>>˻袾7[?1?¼e˾3=?+&>.?qms=[YHoڼ>>&? R>?Z=>_}=?)>ϖU>C?>վTL=FX&3,Yy?/8?`?a>>X=+>H? :>>ڇ>j>=[H=+>Z? [>?,>e>;<>x>?b=9U=սR =Sr>>-?]>ָ>c?.G>M?CQ=Q>> ?>D>5S>(?>~>&c?.>J~4> =+ ?>>L<>U?(]?PV&?/ё>#= >I2>/k8I?Ed?Bp*'v>6=fN>b>&>>z>ʾn?ZSY?R[>>˅=Hz??? ?Y?Uf>:>\m1?X.?.>(]&'=An=RTi>G2Ah>X?6>٩?M?(5>Tg?">6(^U>"k>F?Zp?"?&?8yN? O+>>>eq?O>kD >2!>>|¹g< ???7N,;~=??x?-?D$?*ý[2?I(>Ի>^=][?-l?BI> 9?>нI>_)>? ?8s=S4|=ܬ>>?>' ?Pi?Vq>s?Vq!>W?.>/%$S.>) >۾>3x^?Ns?gk?(?$">` >">>1 Q>">?>f>}D?> >!>}H?&?>Nk?Ai?6">v>?-,>@>2>?a>Y8>*?S4h=E>?bh? νc?$p>}v?,? 
=tt> 2?'>?VA>>Z32?\-w>k%=] >>M??v???]>L>ߠF?=?,*>|?i%?R ??` >Ξ?Je?Bf>%?I=,?:k>F? 4>?c??{?'?K> <{>{>>~i?W?a =ç?r?j쾁>Jk>k>?>ON?;e=t8>>Ǔ>">>?"#= >?^L>d=)>M/?C>%>d>b?Lh>p??~=C?1k?,?/'?>d`>>jߵ?,#? X?t??q??!>8-&?hR6>^=?Q?gv>>?s?d>? ?^>?^@\>ʗ^cJ> F?+>&>z>=P=b>%k>$? >??O?yOKt 9>>b??E7>K>4>>p>>r>>\">sa?)?*?n?:{ >L>? ?L?J>??#J?/(?@s?b+Z?0>c>?|?v?X>=ɽ>=?>?0I>?Z>]>dD>4>g?W? 8>)>]z?~ ?P>R>ϩ??Hs?!>/?"?-?Rw>Q},|>.?2>d??iE? ??c?Gw?"N?ud?xZ?;U?U>3u>@??C?i?!G?)l=>kod?Z/?^$>?>>wg?Y? ??^?Q?8?b?#T~>&>?yZ??(>P ?k *?I>>  ?"ML>tX=>{?5D?z? =>(M?WX>|پ>?k)<@?+ >>X?.n?X;>l?B4?qk>u>>"u??g?h`?kȨj~">-<>hG?E? {~ר*>V?s?q"?[k?[f?{??#$>)? n>??; g(>O]? a>>M?-?G?ge>>?oH?i?5>|A?( ??%<7>Jl?>>>U.?L>>Ɗ>3>ˁ>'g? vö?&>|A>>|??>Ϊ>:?h-??0Њ?S*?*g?2>`b??`?-RF?G~? ~?E?~??ff?,9?J??g&~?^>9}>0?_>5q>[&#?-?@e>]?v>L=5>b_9?!=$? z>Ԗ>C1<Ư>?3x8<?.l?R©P?mU>LXb?6}?tx>^> >n>~/>^>?]>.??I^u>>?@?W>Q> >ׂ??sz`?~lU?(?8?fM>b>[r>>j?'? ]? `U>^>#?67?D>?SW?\?a >?Sr>>?nD?Q?ā@l@#)l'FBA?So?o>W?.qa? ?Lg>q*>?*??xļE!?,?u{?|[?? ??t?>>L/>??>.\? ?7>>#X?\?o>>Ym??3U?>?By> >?(6?<*B?I?R A??(>=Ǜ&?UU???>?LV? @?U8?d">G>}ɛ??!>rT?Ws?s?v?i=D?9>&?Tc>g ?iv>ƈ?SI??ȱ?Yڽ?7 ?A?^?>ӄ>b:>>܍???b?<?Y?KO>O ?֮=6?PS?m?C8?ho>?7@? q?/?$l>&?0?pU?#P?+R?t?+?j ?U>>?%?)??u?[?@.?&N?=0?g?n>0>Z>'?8N?Me???v?Sk?+R*?W$?b>,?Kb?$ ??&3?'m6?g? ?e?g?0p?́?nQ??v?U׬?h?~?i?>??:}>uo?4?@ݤ?՚?tt?bp?+`?G?ƿ>>F??H??l>~?g?SM??c?T = i?[>7?`l??K>[??Q ?4 ?S?z?lA>??_?#>?Sw?Ɇ?1d? ??<>C5?"|?'5??XN>%W>B? ?2??c>h?HH>??b? 7r?RZ>BX?L>>L>'>o~8>ȥ?f?>??@wF@JvcĀn >=j?L?vCy?5˨?VW?>? $?BC?Rq? ?=?CX?#j?4>>t>O>?Y?8]??{??(??>)>ڤ> _?q?>*?L?=u?!Q?]?~v?> ?K->?Io?E3d?,?M|?~?"?=?[y?9?s=?~><=&?H?W#?Zq?|a?S?jF >>{7????V>/?2?D[>>3?&8??X?s?)?d(?W;(>#??s7????(Fs?T?rcQ??-?>A?e?ժ>b?U:,>'$5Y?>&>1>ӽ!JW >|4%=vҼ.> V?$F??Z?*?jJ?y>z='?d??_4?>??B{>w?o?:5?DN?o???R?J>)*>P?`>? *?0>0dW>g?{d??L?Fu?3?ik=L3>?k?8?3??Ɔ?[[?Q@>K??|>???5,?}?41?am?X ?у?ʏJ??l?>>qn?/? ????Q"??X?????s)?z?gy?]??֥?V?x"??l?R?#?9???Z?D>G?c?r>$?p?8?G?h?)|a?&?V?K?[@ =??@-b??f#????27?[?x?Q@ u@ '@[A?>->?z?^??=?:???ʝ?>ύ ?_?_????T>Y,??iJ?L(?of?F~?1@_?$?׏?K?)?b?EW?tY?1/?&^?-??̉?P?ܜ?Ύ@8@Ta??o?sq?r? ?S?pT??"H?3K? J>#F??y ? qX?>yW?Y?߬t?x?$??b?q?t??dv>h?g?X4>C5????c@N??!?džQ??5?Ъ?$7?C?s5?Z?1?0 ?0?$?"J?d?-?~?m?̣?y?(R?EM?+?zJ?R?U?U??*?4? ???ʭ ?ס? 6@?G?ɡ?F?zC??>?K?LA?sv?q?"?в:?B?Q_??4?e??BQ?w1???a8?ѩ?d3??>?f ?٭??o??S?ε?\">?I?M?r?#?E?ZG???|?%?L??Il?P?FJU???3i?M????z?8?c-?EE(?# ?Z?`?M? K?|]?? ? ?9?\?oO???P"?A?/O?q?Yp??g??Ь?]???.?L?7T????@ ?7?*?f?d??a?1?b&??9?F?]????&?(??"?Ҧ???y??#y??F?&????{ ? ?M?)v?r#?Ɓk?`f??PW?B?P?Ò?)N?4?g?kw?mv?z5??j ?w??Л?-?!?$?iQ??R?y??jB?cڝ??.?e?:Μ?t+?#I?(@ @K?B?@?1?R@F2@#}@o?_?@ @=tA@@2=?@)#@N?T@$?4@"?=^?'?(Z?!@ @ ?3@H@su@@q?@ @5U?p@ @:?G?gc?@i1?t??ޥ??(f@28??*??XV@.?!?R@b@P?/?4?o5?F@s?2@.@S@3O?7??v?؜!@@R@YY>\d?3@n?4???ý?GF?Ĩl@@@ٴ@ @ t?Ԥ@VP@w???Y8?7@?K@S?ք?T?"@@@ zE@͍@m?h?Ƹ??^@+?ޔ?e@ ׏@R?@u??i?2?N?S@@z}?a=?T? ?e?m?@??a@+? ?䬌@M?щk?P@ @Q@@@{@b?w ?:?@ N@@ C?f?N@J?۳?ca?M???̃S?ͫ?T?@ @@*?v?^?#?nA?ܸ6@@?*?=@%X?@ [@z??W?0@6?S?>?@?$@*?@ &/?@"@9*@@1@Z@@??@\?k@.P/?@s@!@C@)@@9]?q?a4?/?@h@?u@U@e??B@|A@S@@?<@+8@":y@;@g@F%?@I@?p?b\@0H@eT@ ??f@ 9@x@@Vf@{??@@ZQ@\@ /?rT@'@B|@m@$F?Ӹ?>?@ h@NAq pE?7@f?l>?@Xz@ @ @5T?p@@ ?v@l@ ?@j?@5@)7}@o@@ɐ?[@vP@Sk@q@@2?g?H?? 1@1@*?UU@1@S2?F?Z?x?f@N4?\?Ƿ@2@ ?P@@.i@?8@Ca"??>@+@ n?S?q@d@ LP@z?]@@g?a@ @@@'@'Y?S}@C@6?@ "z??|?.?w7?d{@#@@ @"??|@?֙(@mG@ *f?@@/@;aE@ K@ @@ 7??B@W@:?"?)@ @ ι?@8?ۜ?@&?qJ???C?V@}d@ K)?@ *?@!@ p?_?@ݝ?\?@ @@B@7@@%-?N?@(:$@6Qw@9@,@?l?C@}@L@k@+.@ca?@;]>;=@ ?c?Jb@2 3@L@@? ?p@? @!2@7@"@ S}?Z@@ph@Pp@@@~?@"@M@&ϳ@8?ɓ@(g@?c}@bk@ ??׏@S?@ n@%y@p??.@u=@@ %@@B@'ۑ@,?~u@g@H?l?ZH@cC@E>@4?]Bؔ/@9F@&?W@@|? 
@n@6@F^@@'@v@@ U@'@?@@b@@ f?vu@f?@c'@{8@Z[@]Z@w?@@h?A?@91@Hn@ ;?1@$"N@@@0Da@6@6@iP?@Qz@ J@R-@ ? r@0@0.?w?g?L@@̷@@@,@C>A@?@P@4K@Y?AR?pa@>;?һM?2V?ԮS@ \@!Sj@ >w@@(@)?@(@H?@<@I*@@ @I@%?_p@7@3T@ 7@@';?އ@m@*@.@e@[᫼]?=@[@;@J@vK>b@3@7?@+[@p?v?A?k@_@J?R?@&?6?g @2@O:D???ӵ?@@'=8??̚@H?q?@@?d@@7@ʢ@4M@@@ d@@ @E@!Z@T@>@$?4@l@@#/@Fs@$Z??4@7?$? W@@@6@-s@'܏?7=@ r@#?@E@@@ @ ^?0?0@?H@'@j?"@@(5@0@Y@@k@Q@I,@Ԍ@(\=@@ L@v@ @"U?@ u@>?ʼl@@9j@;Q@5@0@X?˲@'@@ߙ?@"@@%@%3&@@+@P@w@@W@A'A 4mA@@5W>?AQ@Wǔ?}@m} @[>@@S@~@ @g:@.x1@8?2@K?E@#'@X@Pa>@PI@p@*hAWAxA|A @c@J @!@c4@G^Q?J@ G@ @* @#@I@!y@E @;@:e?=g@B۝@4@f@u@7&@6h@ h@)} @3$@Sj7ǿAuF@e@4i@S@"@ͤ@権>@@z0@?@W@(P@@5@#M@H@g?֊@V@p?C@4M@#@G@z@`@b@Fk@5i???A@#@w?v@c@;?Ʈ@= @3T8?@4<@|@U@V?<@@R!@_?@7t@cl@;N@q@K^@@@=J@gd?Y@@U`@)?AG@4M@wF?*?n?32@v@&@ l?{@l@3Z@@Ւ@i@@/F#@M@ h@Uڳ@ ?9@:$?c!?k?%G@@3\@=@6@/Cy@@ §@!@$A?ekq@&@.?U@5[@k}@A7@@@7?tא@A@M@x@@Rm@ @@o?% Ԥ@-@K@+@F O@1u@3{@jk?Q@Y@?T@ds@#o@E@ ;@GF?5@>I?N^@.3A@r}@A&@|??8@)@U>^@ @+*@Q7@X@&@kѿ<@@@:{@\@]@-J@)Hɩ@dg4AI\?v@ᬻ@9kl@<@T@M?Z@m@%@e?מ@()v@?A8yrAI*@}:@ n0@3t>l@%@`A#m@?@f< @PX@a@1@/H@~oF@U@Im@ZI&@V@DQ@r???@8;@k@@+@&mAsO#A)eBA-@@-3@*!?(;?>?*A3d"A3fA3h A3jA3kA3mA3oA3qA3sA3uA3wA3yA3{A3}A3vA3iA3\A3NA3?A31A3"A3A3A3A3A3A3A3A3A3A3A3t>ُPr @fA~?G@|@ZA? +z`>i?2^(?ƛS̼D?C@*9Ba@a?? p@N@@7 j@tQ@m{@t>ޣ?F^?Hv? \@~;?S:2h]?@%;ӿ]I@=H#D@ @4@  ?^@̫?>?7?ÅU>޿D'?|$C^?@˿.i@@X@G2 @ D?ct?? @&uA@4?@o?R@ @@EĆ쟿@j>c@(s@!@?+@$b̿u@GZ@ +H{@6h@ ?;]X?e@LcT?%T<Q@%/?`f: 33pM!H<>@XM@%&N@@&?dx@ FW.BPQ*>@O:@vv@?=v?baD@&4@ :齧94?>?Q[?s~=Xi?#@J@@/E?Y ?W?Ǔ;?۠?- ?ܩ>1>bsd8r?@;>@ ?.?>-¾߾e??uS>KTڿ.[U"??TP??z6?gR??t??7?&d?1?@V?m2ܮv?k1@`@Ӑp??a@ @-?*>̅>l>\?R$???M?0OO?)? !? ?K)?M?8@=y??Ek?\t?⮐@'?:?+O?*?P?jL? ??tp?&?t?7?U@U@r?a?.q>ID>?э? ^?~?z?@ i?2?[Ӕ?g?'?^ @!S@]@'p??v?; @6?N@<9?{@(@>?&5?5@-H1?w?\?b&>q@o@W&?Qa?ϡ?X?B?6@y4@f"????V?>>? 7?$b ?86>_d?˅?vP?t\r?Q1??C?.9>8?R?̐?@J6?6>T?_@?3׆??8=d?4o??7>,F$̾?$}e?[o?L@?Gv>??b??v?h? ?Y [?Fz?t?i?|?@>5?]@+>o??8?Vθ;v?F?V?? ?gJ?????nz? ?>??U?{?ă@0??'@h?c????y@:?\g3?t?>KX?,@-y?9??{??L?&?)???? $> ?֎e>ɶ?F?y?@m?J@8?7 ?Hg??@?=q@=@4t@7@^m?`@~@ X??O@YM?ĉ?@HW%@#.n@)`|@?|@y=5@I\;@"‚@@3x@S-@Y)@\5@U@9 @l]@)@@W^@J+?]@X,@K(@6@"7?iT?G>q?n?=?? z?7?T??5@Q@^_?m?Ъ???@C?ۂ ?@ ?@@?80?[ ?N@?J@GK??8x4?qz????#?@Ȼ@ @+]????˩@J.?p?ա"?#?"?a@?I??#?l?v?C?݀??E??/?xܮ???;? ?#??Ly?AM?G(>'?.T.?ݯD>?,l?ܓt??"?g>? ?^n? ??2@??D@?q<2@ѣ?̈́?08?3? ;@@ ?K?y?,?>?$?^??s/??t=aE?f?o0o?`??wr?9?]?O?ēc?Ww?ȍ?Z?{ ?l=?ZI???&>8n?RC???^x?/??F@??j J?>'?Y???c?]J?h??R?r?[?sǓ??{>)C?dB@W?-?SI?\?@???֍????ZZ??J?9???t?{?^??@P@c?4?#;?{??G??9?pz?P?jX?uR?0r?fkL?zw?Y}?C?e?? ?>[;?hܸ?/?9?sa?t?}M?{?heN?lgs?7?;?!_?_?o? ?/?ww?O?>A@1u?=?Lރ?[?L?\~?d?|?-?;?qr?,~?%\?f?#?t(?zO?E[???eb??Xy?-??^?YZ?Z?mgN???!c?K??3??Q?Ya?d?g%?3k?Z ?ρV??G?.X?S?y?g0#?k$?\R?ZN?9;??U]?)>?\z>?k>ϔ?5H?E0?J(?D??\>1>Ʈ?ޘ??@uw?z`6?h\?!|?pl??MΞ??^>n?V:?V?`2U?#1?&?7>@%?9?O?2I?G??ƣV??~_?*?S?]R?D?%>?{???:=?K@,S?3?*?c?~=>'u??P>%B=!?ee?qb|?G?6?,~?A{<>??C:l>F\#>B>-T?P/??c???B?d H?S????SD?A ?<z>s?D?z??6?ͱr?=}?8??Y?`?{~?{Fd?Q m??X????I?{(?i?$W>{?3-?J?h?\>?fo?'?â?N_?F?RX?z?\? ???UT?A?ڕ?k?/ ?Εc?Kf?ym? ??AO?\?w?o:?T?J?pu?{?"p?a??G2?"????|2C??PX?Db?ĥ??y?}????C?yT?A~>B??Wo?|?mu?GU ?tx>l?I2?Q>O?+?6??V? $?C>%?]vb?2t?`?y?z?;? @?:&??/;w?L?~????V\\?w1???pr??a'????[P??x? ?h?|?$? k? ?^?=? >wJ?W?M?f??T??J???)??&Q?W@?U?c?%????R??_=??2}?>? @??1u??G?ap?'?G? 
[Unrecoverable binary file contents from the source archive (non-text data bundled with the package) omitted here.]
@@{@Å@ˤ@DŽ@@@@;@@;@@^@ős@:y@@H@@@r @@ӈ@@{@6@î@Z@{@,@5@ @k@H@h}@o@k@pM@@1@R@:q@@ @0@@#@@9@n@&@@V5@@ԉ7@#@5@m@b@S@@^T@H@ @°@@Iz@ɳ@@@ @q@8@" @@w@ @=@h@ͤ@֘@@@D@@@e@J@ɱ@{#@@F@@@H@qG@3'@v@@|@̀@Nz@o@s@ @$6@p!@@_@@̫@ C@]@@Η0@`H@ɼ@@@ӓ@G@Ʌ$@ x@@@.n@T@k@IM@ޅ@3@@֕@&*@,@80@6 @J@T@˶7@@D@<@@S@M@~t@=@L0@@_@P@g@/@džp@48@΂@?@k@d@@ۘ@ԻK@@@g@d@؁*AA*{ASIAQ AӨ@ep@7@ƍ@a[@|@@m@;@S@S:@=A L@p}@@֋@̘@{j@`@̌@t@@[@(@Ġ@2u@S@$B@@ @Ԣ!@@ʙ@Ë@R@„@~^@.@ǜx@ˆ@8@@@Z@Z@%@@@ڟ@6@JW@p-@@A@i=@r@pd@(@f@ȸ@ Z@@+@|@@2@ƪ@7@u@G@P@ @n@@o@n@[@@b@@@<@@ûs@ִ@@ٟ@՘@ԇ@A@< @ӈ@@.@:@@%@2@!!A 3@rR@k@"@/@ǭ@,Z@i@@@q@q@ܼ@@T@3@A@@q@Ы<@ʵ6@JW@"@(@Ͳ@+@į@B@@+`@S@@@U@x@A@) @@ϯ@Ζ%@Gn@@ѱ@m/@%'@ݴh@@Ф@x@@~@Ǻ@@ڜc@^@@$@@ϕ@fL@@@@ 8@Ϫ@l@@@@P@`9@D@iE@CO@@@+{@iZ@!r@I@@a@4@@!B@Х@0d@@@Y@@!@i@ڒ@ӳ@I@‘@<@@D@@@r@k @L@@C@k6@@@ b@ wA@@@$@Ёb@X_@@5 @F@Ø@I@˶@ @?@j+@}@$@i@/@ʤ@@Y@)3@J@hA@@f@j@'@0@0@@wJ@ @Ȧ@LN@͸@.@@&@BE@.@ۭ@ @@3;@^@"@Y@z@@O @= @>,@υ@.@ @sn@և@B@2@i@ @İ'@Z|@9@@F @G#@ʵ@?@ޝ%@i@@9@@΂z@@O@~6@ݷ@@e@@t@@ѝg@5@o@Я@E(@@1@@@Td@Z#@:}@ˑ@̠@A @&@͚@ޣ@5@|@@@ȗ}@ݼk@@F@/@_@"@]@Ny@ @@@ܲ@@vc@ @%@q@ӑ@H@k@ͪp@͹Q@,@G@ɧ@;@@|@?@@)@@r@2@N9@\@X@a@@S@h]@@A@ԝ@#@¾@Ɩ@"@@e@;@R@{@\@h@X@ܗ@Q@>@'@ʁ@r@S@e@f@@o@Թ@]@ֻ@@@l@@@#@K@@ݵo@W#@oU@@b@?@D@@z@ 6@@{@e7@@3@-@ˣ@c@ @vl@f@@#@Ê@L@@ȗ@P@!@䦢@+I@o@ȉ@j@´@@+o@z@"@ε@g@@@@̕@q@ʷ@@@@@˗@@́@̇O@q@̾.@FA@Q@$@$@<@@9@B@Ѩ@@@Ȓ@f@!@ @(V@Q@9Q@@+@E@(@ľK@?@<@z@΃@~S@6+@֗@#@ AL@,@@@ <@Κj@r@I@@ǔ%@>_@ˇv@Х@8`@x@(@@@@~@U@Ω@*@;@T@@&D@@m@{@Ǩ@@@@]@ۓ\@d@\@׀@|@h@3@8@N@ա,@.l@(@,@4@@a@i|@y@Đ@@0sAA)y@@;@@L@B@@*@A @@@&:@@ޤ@fF@ @@@;@}@I@ ^@J@#@Y@@[@V@@?@:.@C@樟@3@@Dh@%@׆@ @@雓@i@=x@͙@f@ً@fv@Ď@@Զ@X@̲*@D@@}@@D@>@u@ȝ?@,@@@@@Mj@V@,@L"@"@ӽ%@ݭ@=@٘@@Ɖ@M:@֡(@ޗ@@@ڛ@e@ t@7@@@u@@@@Զ@W@ @@c@@砪Aq@ت@۽@%@:o@8Aq@d@;`@ A,x@W@@^@1@AQ@m@ԝ@@W@׺!@5@@f@@ڳ@^@Gl@7@$c@^@~ @7@۾U@͟@ˡ@@L%@Ӌ @@G@}@ U@@W @v@V@J@"@˸@i@@-!@ N@+@[@X@f@߀@N@'@f9@}@hBA0@{@fP@@@)@?A(@׎@@H;@ʱ^@:k@y@礢@=@@H@]@s{@y@@!@Ɂ@@Ս@@9@@k@$/#AWA V{@x7@@A@Ƿ@@8@׾@ˆ@-J@1@P@b@d@J@S@@@A@9@V@N@Ύ@CAGH@u@P@pQ@@ 3@b@@˂Y@7Am@/ @OH@Na@/@쥦@BA*@@@4#@_@h)@8@q@Y@;@nd@UAy@]@0@n@ۭs@@8@g@@μ@\@@O@@*@@@ r@M@^\@ @b@Ʌ@Qy@؎@@Af&@̍@k@六@@|@J@@` @w@F@ć@˜@D@@@@_@AC@@@rz@A@l@#0@Β)@\@r@l@=@y@(@v@@@@ǫ@@ڦ@D@|@ے@a@@@h@k@@H@T@(@5N@ڬ@1E@َ@~@̈́@Q@5@@@j@ @@2O@_@@%@c&@}E@@@A2@m@V}@N@rb@~@r@@@,@K@׭@*@)@6@%@{@LL@qH@:@s#@@ވ$@k@p@*@i@.@@@4@I@@@;U@@ڏ@N@@@`@߁!@X@v@}@׊Q@ؔ @@w@@׬@P@@@魭@}@[R@Bp@@1@@ݒb@@:@A@c@@|@m@⤹@A A@^@1@@ @@+A @E@I@6^@-@z@uAA A U@sA4A ǩ@rj@"@@&AkA6 !@1*@;!@@|@@@ޠEA@@A@@K@X@Ц@xAvASd@H@]00A 9A A@z@@@@A%A A GA @@ī@@d@@ӷ(A^A|@@A@M>A @5A~A/)A @Mp@h@ӧAVA @hA3/@.@@;!AjR@NO@pAb>A"A|A 2AW@LA@MAiZA{@p@@@/@R@v@b @E@&@v@@H@Ʉd@@`*t@{@OAAҐAA\AZASAA>e5A5O A/gmA/i%A/jA/lA/nOA/pA/qA/svA/u.A/vA/xA/zSA/| A/}A/wA/.A/A/A/PA/A/A/pA/&A/A/A/EA/A/A/bA/A/A/~A/1?V?@f@9|@)@G?O|1@0@j*f@ @K?@濧4>ɛ@r`Nh@Q @"⟽C%,>+Q>[@sp:fy@P=@u:s?dU>Z@Z@r?5/? ]3gF+@8@>*0?@]@j!@rUQ$@%|@w?&zf@0HD@zJ>|5p?ذ7@]X?}?:5ۧ?s?r![Q="?pH?:D ?i?~?qC͏>؄oƿX= \r{c?:m@@Bpp lT:=@9?>?i 㽌>>ނi=G]@;@F:k?C?%>է?? [@[U*BgwU@l {K?񾋦?H@?V!V#-nܗ0?qj澡`p?d>{D(Qe8DPu꾩U>ՋR>*οke?=W>5ѿUo>@@J%*@ ]>=@tr?궲DA?{{ @%W?%(@?]W@@/p ?5'@(?>=@EP uշ@oF@$"? =?@?8?J6Z?, ]VW?C&x@T_2 ? bo?g=7, y vLM?a.9?2ƽB.?d-)?s d?@X>d?FڿBP=;>(޾a-v?8?$Y?+0==tj-?Z?%ka????P@=%$?(ѿp=!k?e.?>33J=-==eQ,H҅>t>O\NG{H=4ـ?Y>cC >T?>B?Fa?MLվ.'?[e@7?x=?6,O90P^`ȼ>L\>v=.v,>1=Щ?Rt?=x=?M;>|?F;ErG>F?U>z?ܿ?{?rS;z%K*>3??X]e[-?@\?Ϭ1?=Z?S?g,>f;?{MB?|ʭ>tq?v0=G?hrC? u9?SU?<>޽J=n??7k>;t2$Q?@K 6@nuMP_>~Br? ǎV=kϽ<&׾[*=?Lv?FT>0?< >U_"|k?e>P}˽2.TL=gF >^d4?0?nh>+~ּ?Q?f >¾C ?"??>ӊ >ǔ>9>TMr?k>@ A2?-%r/W?Yl'y4*>ܤE>X5>펺={#_>?p0=^p ?zy?g>@R6Q>Pe>͓? F9?}?4?A>L?Vt?+=g;r>>vK?#"?L4?%A>i¾=?KU>5>Rj?;Q$B~=y:eܕ>Ġ=ki>?r?!I>/@(?!>.?Z%?t#5?>@>3Ⱦk*6=hk?y?3<>L5?M?tX%tb>{?B?9Nq4c>= ޾f]?>i??D:r4?P?D>?ꑿy¿ C>yg&!?X<12u= >t>z=ڶ>&5F a=9= ? 
x锽qȶ?(0b>>\ʿU3?UQ>A}?Mp¾?qBw:1>E?,^?=[6f#6E=>v>Dz?4E`>'>::=N??$>h?X?/I?\?H>pX^ǽ?#L>'?A28?l>7>WKm1>«>;ǑQQC>L>aK#>.?ub\*>&,=8>?F=nڽڒ>y>P?Q{>:? g >*?}`K?!??Iឿ?׏?l>Y&326Z'Q>nڿ~'>T ?;>Wa><,>?Mv>?T뾍)? %=)'6z?p~>>-Y>{a?/>yvyi?]߿M7?'>_?f֑?ֿ8d>^7>u=ɒ?Y8l?p>:K@a>In?>o?~?f>8>,<ƿIt,m>M4?8>,^=x ??Z&?a$?64>Խ?N֭?/%?o?(=6>9&T>??V?^?>R较C?m=?>ne?3 >|?p?{{D='>?rT>7E2?Ӌ>J~? #?Pn=W]>D&>q??>?C?!:!l5u<Ћ?y?C;i02?=ph?1n>E>?co>+ޠ? @>7>q?t]M>ѻ >`> o:Z?mi>Dq>N?N1I>l+22?)X?$?80?tT>׽!bm>{p>p]?l +??o5>> ? 4=Jw 褌?).2>u>/?I ?mu?>K?@(?,?8?=ľx\>D?F"q>Ө|>h$=>o//> &>Jo>?)~>'=c_I>}D?8!.L>)gv?A ?U󽭕n? ѽ=H??9>^B?>?\ +>ӌ>SG+B?M>§>>)=Y6>k> ?na?r}?vV@??\|S>>,?<@@>.w??>ܟF?I>ª%?+VڋC6?=ke;?JZ?X\v>^>C'?c?>W> ?E{>>f=C!?RR8?BpUV =>TvrW>cS?AB?|y>@ ?|>2?<>>->Ә?92?>i?4B?>zDC>|?W>>=~>\>W==6$>I 3k=qTfJ-~>>$?Y= 3>^>򄽔?YkB?h?S2}.?Jo?P=~ns?2jO=o>~?Bd>=x >>ǔ?<>Ͻ|>>1>`?@?6x>>Z &>r>=Jv>a?c >5Mn=? z>A ;?!v==s]?s}?t_>V><jl>=V>\> 6>?3E?G0?M>s>?T>D?&쾯By>v?A8\>@>(9'?B?Ny?)??0>󒡽L6>=?>P%=l?&p>a==h>"x?%M?->?f>>>BA??39j?<8?k??:S==>BJ>F?"?$?5Bu>IP?H{=Zhw?B>a??W>o?_ͽy$Af>_?RB??*l>!?z:>ׯ?V? ==5(]? O?Dk?:8 ='>?$9W>?>uIF4?NP=?N)7>KV>?:A=>J?T?7t?=2:>|O?*>*?`I>[> >:<?iYj>?x?Yc pC<=|35;hMG??l>e>L>M>WM>?H?;Z?X>RQ ?*l?>JQ=>y2>ӎ ?6)>e\>?QJCB?jH?>P/>8?L >>>=罠=?I0>+u=FQ>grx?QE=>q>M(?'@?w[b=Vr;>@+>?)>R>}?yt?8=O+>JH?F?~> >H#=>݌>??NT?>7OB*?Ϳ @7Vb>"i>=9>ζ?hR[>B=DE>9??~j?%b?7>}?;Q?!ny?+w?Agy.>$?^轲[4?&=-=U<>m?^`i7fK=h?2r%>4͂?E :A?,C?6T>V>&>HC5o>:.Z>>.\,?H>2|o+E=)>?4?4\)>vJ>w?I?Yg?cx?F?Pv?">?>~><???A>Ɖ>?7z?.cG>?=K?!) ?s LqW@>p>>QIE>=??P?bX?0>?r?uQ>L8? h>[>L?[)?o>9?A>e,>s?Vy">*>?>Ϲ?X(?z? %[>m䨾K >O?\>up>L>&X&z>Y{?0>>خ=Ֆ=?r?6??Aԫ>5>iu?$j>?)׻>A>v>>S?x}z=u?P> 7??J?ݥ?>ا>*G?l?=@yK>W>Mk> j? =v>H?35?jV>."?5R?3^=>w?3?]o?Z>%N?Y? V?NnH? #3>[]>j >ѫ?O??>m?>?6?k>Z m>>>I?_ӕ?Y=v>j?M?V?LV=??6"VU ? ?2k?Yi>>K?R'? ??B)>>\>WE?3>B^>(>W?>>,q?+ ?D?}s>Uq5}??U=>\?D>?i?>W>[> ?{`>b>?$?%O?y>>?Z>_>>{> ?7>N?5=?,>?[?H,'>Id?,¾>>F?{??4?E">>~=¼=>4? ">\==@>R>>F>œ>U>>O]>`>/?Q>? ??>p??"?wM?;Y*?b?g/?@?9+??&?R$> ?<?A{7?*}>>p?$-?''r?[u>"'>+A>V?*?&=Q>?2>>?X?)7t=xs?%끼#??J=X>&>#b>>g>j?S?>)1=JV>4&>Ƚ{j>A? >?$>൩>??B_> >Y[>4u=a??>Lz>%t? f>w>mI^?&>f?j2?=>s?d>?_>sz?-==S>?!;? ?>(??o?L?\(? >IN?Rs>??In;?(3>8߲>z? >?R?.?{ّ?25?]? ??g?u??2-r?u? ?>??t?;z?y?/t>? ?"ǝ?gk?M?W$?SC?D>!>Z?q?[R?? fY?L>?^ ??>??7>v>];?"?v?5r?r?U?Xv0>h=a?C?\#?faS?6?|^?]>ye=ц?c?i?|?>l?n?$?U>@?6=}?im?z?>q<=Ŗ>s;?a=g(>?>E>%u >e.>?-?ySL>Á=#>n?/P8?? ?=?g??a=g?_?">k??R?]h>f?;?d>ɕ? ?+z~?aq >$8؍?*?aBaM>? j>T?a?eY?o?OY?+P8?V?KJ?e>P>?W??R>U}n?G?|8`?a鰯>&>t>?3r?ZUf>>I?TF>>4?C9>\?[??ֶ?|1?N??z\?#>?)?K>-?nn?6?a?L?K?s ?A?(1??oG/8?r?p&3?!?G?Q|=\&? ?1z`>=s:=>">GG?,?qD? S># ?79?jt?z??Hr?S?ƪ? L?5?M?#f?2 ? ?Np?Pe?P?h ?d??j?,???l?ߥ?8?r~R?[n?eE?s?Pn1>ȋ?0?,j?:*?>1>X?~?E?,?~?^Xl?|Ly>V,?$?Z?fG?(?|d?OJ?CX?S'?qل>;|=u>ڹ?+(?q?R>e??T?-X?wF?lZ?o;> p? ??R?R>{??N?p/?k?5>w?~?V?-?T9?Q?G?#? ?^??4>n?.?(?m->։?:i?'X?F]=>>a$?T?>.?+?U?b>?BR@lX@ y5A5A$+@6K??_?F?k?c,?9T?ku>>\5'?_V?N>ɮ?W?\4?t??"?h?d?z?ZD?.p&??[?n7S?t?9u?@?4?R?׮>?&b?Ն?w">˝Z>?hW??Y?4?WD>L??4?W"/?8?փI?2 ??pJ>q]?@?~+>v>?i?V,?Wc?;we>ŭ?0???d?">?D ?OK>??!?k??>?@=>?ȸn?e?ab??S`?_?hV?.L*?B?O??2?3n?N?!*D??N>?e?M?6?y?[?Z?q?AN?C?J<?1>>&?]>nm?Y?}QQ?;S?.*?3?Uu?Ux?>?)F?/L?;?$U???"2>z>J?Ǡ?F|?!N?>v?>1?IyC??G?2?T.?Ej?7?}a??ۄ?'?cI???Z?? >J?E}?۳?V?2H?v??? T> ?r7??b??T?<$?+??e?h >> ?y?j06=)?af?l?IY@? 8??%KA?ܑ?=?/?G?? ?ʋ??e?P8+?)?`:-?7??r?x??,?!$??*?I'5?|?;??)>I??n~?T?7?c>g>޷l?m??' ?T@?e=@*IW@n?;9v ??E@?8?(?j>}g??zf?q?+?r-?$=?T|??5'=h?j?AF?w7>r?0@As?: =L+%?&?HyM?$==Ay@S?I?\??߽?>?i9?^H ?!j?~u?/??B2?[v?l >ں?3y?n?|w>b??7?p??:??FV?BA?v?ɕ> "? ? ?]?i? ?A?q?6?? ?I?= ?tF?q?>w?Ǟ? 
??뾃Ul?B?*j ???0Q??>?6X?F??@-??N?K+p?X??f,?>@??Hrv??u?!\??O?T*;??2:?C?L/?]?<#??V?0#x?Z?b5?UE?xi>2|?E?UH???&x??Q??be?rd?Y(?`h?"?Q`?f4??@?pK??A?€1R?9@??Ԟx?̍?YN?76@b?6?u?¹4?D?zߘ?^ ??X?b??@??72@{?m?oq?΋?Q?Zo?(?9?es? y?S`?ˊW@ ?tn?t?H?Jhm?B"?dT?u=>h?8?$D?Z??]?Y?vӳ??_@8??lc?s6?gbu?7?zZ?\ 3?/?!??L?o)?ʜJ?y3?6?&x?/@ ??T p?,???Ҳ?A?r?߄? ?K?Q?*?f?sA?S3?Ȍ??m?n??/?׼X?"6?0/?y@F?[??Qg@ ?й;?@@ \?y?x?}u?6O@/?H??@7ܵ@:?P@??^>_@!?[??C @>?1p>`@@C?@ @@%fH??x???C?D??ݟ?x??}?÷@@(@ 9?1?K1?K@K?\z??@|@ R? @e?O?1??:@0Q@??֡?h?J?9??b?3?m$?Ƣ?2??H?#?f?!??M?S??n?? 0@ ??? ?,?һ|?Mi?R??'R??֭|?ڵc?%?W??@|??ο? ?oX?/?P?Z?5 ?l?x@??P??$@~???f?m?(4@-@un@@@ 9??Ѷ? ?Ϣ?@ H???@vD@ s?™??_(??ͷt?֢?۬? }??ī??PU?v9?&??A?E-?y?7?@ R??փ@G??m?JW??i??-?nX?t@?#?5?ά?@ٟ???e?u?ԁ?,?9$?x? @4@?#?18?R??Uu?^??v?c?P?????>??X?!?@8??0?,?P?@U?@Z?ۋ?mV?^?@ @?@ @j@?#@?2X@T@1R@>j@5Y@?l@ @ @@`?>?޽??E??yh?@#4?׋@6r@59?X_@j@$M?u@!@?@ \]@ @ @c@=O??@@ą@ @@|?R?Kb?X@?@ @y?r@$Vg@E??Ҙ"??u??ֱ*?@n? ?.@6@ ?ޑ?(m@)-@@@ģ?U@W@ ??=@ZN@@@s@K?`?&@m@?}@&@ q@6??Q@ Su@2rL@K@R@$"X@%H@b@b?@@~?K?j??6?X?{?@K?@{{?@pL@@@>|?됏?|i@3?Z? @h@ zd@O$?p?v@&@ ?:?? @@ X ?@?@,:@&v??Z?@@(@ <@@'?g@>@,@*>?x?}@\@!@{?@@%?gN@h@@E@o@+V?Ȁ@@'?%@ # @tm@ @;@?脣?@??Ђ@!?⯺@@g@r@-w@Z@+@  @؋@./:@1? ?'H?z?_?@b@6?p?ݿ@ @ Q@ `n@sA@ ??ޖ@@@ @5 ?#?L@ 0@[@ ?@@.ϙ@۶?@@@?+.@ @e?? )@ @I@@@7@=?ǰ:@i@@ք@?@*I? @7p@4?f@U?d@@[@ L?V?|@?\@ y@ t@ /@_d@B@'@H@}?l@+@@@#,o@3@@%~{@9I4@ n? @z#@,@H@ @F@ vB@+@`h@*S>J@Zx@8@'@?k@C @f@!@.z@J@@ c?KR?@ O@@@ ]@*y@ c@7@#@@HԠ@Wh@"@O!@/@!8Z@T$@D?r@#@3@`Y@0L@+@ ?x@h@@$M@ i@S??"@ ,@@#G@p?o@Q@- @'d@8@?\ @@?>Y@:@3g@ ک@X@ ?g-?6@@׫?۩@^R?o?[@V @4O@-?@I@$I?m@@'@LX@ ?UE?+@?@ ;@z@|@ա?@@(@ VG@ "+@? ?8@'[@-Q@@+(@@@t@@Be@)?N@n@*(@i@`@K?D@u@'@"̔@%@9?M2@$@ m@@;@&T@ Y@#?F?k@hi@"V@ ӡ@?X?>@G+?rO@#@D@*5U@ <@@ט@B@.@U}@h@0@9@;0@ ~@eh@;@(@v@?B@?@ @6?@?s@7q@@8V@@@u?仟? @$7@[?k@j@!@ ?@ @|@@m{@<}m@+w@ *?˦@ ko@* @˜?W@8 @"??Md?h@VS@3&@:+ @^@=~@*@M@@<@//~@7>@+@8C@&o@?@Mrw@.Xs@09@X?x?@ u@@H@@v @"Y@?@@ '@ @ F@'@?O@???̄?K@D@ y??(@x@,@@'9@?@ M@1-?Q@ Z@ r@8$@R@E?,@P?:@#@aa@@_@-m@!0@ R@K@a@x@q@0]@н@@/%@r?@?^@W =@z@@%SE@ @ @E@^?E@ @U?ݸ?z@)@7@)ZZ@L+@6@?@~@?@}@ @@F@ OA?@wX@k?@@pb?pܻeb@r@7@=/@_ҝ@q:@F@@@=1@@,@&.@"Y@@\@/k@G@"@7d ?O@R @'@r@ ?+@&l@H?nc@g @u@@6AZ@ *@'0???ǣ@YH@4E@ pP? @D@7^@>j@<*n@_O@TL@.`D@@;@@?u?$@ D@(S@a@*2}@8@@N)@@?h?@"@9D@5?@&@Or@@w@ @tG@=_??@@ N@N?@n=@f @BO@12@)@9@V@ ¯@@ +?q?K@U?Nj?ӈ?@,@Um@ 4Z@?m@U@(@Z]@@JY@q@"g@@!t@!N?@@@$@@%?H@ؓ???q?@#@G?5??'q?@a?@v??~?e?b?ĩ @5@ 7?I?(j@t@@@'@7K@@ Tw@ c?X@V @ @?X ?n@8@%'@4@̇?1@O@ l@@ ?y??;@w@ R@'@@ @E"?ok@@@ )@'%@2@@M@B@7@ G@ ؔ@@8/$@#e?%y@&@+@;@++@ Ʀ@*?@8@3d?z@S3U@c@V+@w@]e@-R@x@>G@)9@Y@9"u@IO@i@t @*?@?@$@/*f@P@j@.s@*@ *@v@ t@V@?@"@@ g@gAs^*Ԍ@@ul@U,A@ B@'W@"@-?=*@h@<@82q@@@;??[@c@_m@%@5d@_5@-@0?{?@0R@I?J@iP@5?[c@(l@>@M@ V~@ @@% ^?@9@$F@ @BF@@$OV@/0@3_@"b@~@z"@ka@@'@C4@@/t@o@_q'@(jO@@>@@@-m@P@5@^c@ .@oi@݋@@l@P@ :@o@Js@1@@10C@S@!DC@0@)@)k@/#1@_@ @j@@@B@$k!@>L@+P @+LZ@1IS@.ݝ@#΅@Fe@'-@@@1@(&@%@@6&@86@@&ޯ@8n??s)@7t@Rw@F.? @Um@Y@3:@NY@~@@8@}'@44@^@%@4@?@Dk@tت@M)@5/@y\@-Q@j@Fe@T@pk@61@BJ@e@0n@ <@?)i@D|@)tc@%5@?$@&@/Q@S@. 
@1@?܃@ @L@@L?U2AAHa@*@k8@h,@FV@@j^@@>d@-H?e!@ C@2a@$j@57A@*@7@]?n@}@s_@O@"@ @!Y@ @_@D@_@{X@e:m@=s@.kw@ X@%2@k5@:<@@x @n@@.<)@ @Fm@y@K@?4@,'?!@(@ @*e@;Q@Bn@MOn@uB{@]@;a%@D@>@bJ@HF@_@N9@`@f@]@(@B@0@&;?U@V@?@#@U?ϩ@W@lĒ@dɩ@Kn@6?\\?~C@.@c?S@h@Zx|@(@Kv@)@G@-@-@ {@(@2@@& @|l@H-@)`@:@R/@1?@Np@ @-*@&&D@w@,b@[D@1:w@9*@(o?͊@@3s?خ@Ŭ@9@@#;@ >@S7@4y@ @,Jf@K@(@z@R!@*P@.$@Zt@ U}@'N@6@/@(f@K@ @<@Pu@<@@:@*%@Ґ?@w@|@1@1l@v@܁@:8*@8)!@h@0+@F@N@A[@A<@WD@9@E<@-פ@0!@4@@A@Lq@z@ ,@2{@A@@W @:mD@m@)99@$P@?@.@O@0s@ @Af@$@I~Z@Cv@(I @&@!?&@/@-@@ X+@ @9@)+n@`@MC@X@@K@"@"@'@,@8@$S@4_@ T@$dk@!?@@W@!̝@_@@^@ A)/ASo@7[U@QAd=@@A'IoAoY@>@=r?2>? @@DV@[?CC@&@@h@<'@G@?@d@р@@ S?>0@C?xANAŬ@LU@N@V Z@'[@@:@T+z@"&@&c@@@QB@(@ǩ@O3@S@Y#@E6R@RaA@Ӟ@18@3[@+v@2@pB@$%~A +A8@q?$@ Ѷ@1}@I)@ ?@f?ό@u@?ֳ@!@m?@?A\;A#@sN@\@'@"@k @:q@>.@AA@W@5@Z@B@|@X}@'g@U@@O@ai@fv@8=@<(X@<:@@WO?˒@@,O@7@p@!_[@m\@|?ȷ@9@j@N@j@@$G@@9f@N@=@=@;-_@#?}@@C7@@p@+@z@Hw@D@Mo@>Y@8@S?~P@@<@@#'@?@7 @@ 3@LC3@V@0v@6Qi@2@ _@;O@YJ>@2=%@D@: @S@o?_@fU@@4@@"`?FG?6@@q@_. @ @67@3??N@l@@ p?@(V?@{ @]Mh@9[@GE@=@ @&<@!@?v?@-@d??@IE~@l+@Bw@*x@AY@S@DH@.@@nE@ ^?@Gk@@Q@(@@ G@)@*A?O$@?m3@Շ@W,?]@ @:%@j&@@]AX@< 6@F;@b\H@ @S1(@KA@2k@?>@%]@9֟@H?R@ @@I@>@.@;@iQ@5<@ $>@hD@ +@q@@x?J@FZA+Nw@t@]@4"@A=@m,?A/?@?s@'x@@q[??|q@07)@N?I @H@'@@P_?S@)3?R@Q:?Ɲ?̨@$v+? )!@ɡ?@"@["?p@XA$A%{~@@l@Cr@(X@$I@s@@ @@)aA@\?d?@)@-"?` ?w@!@IM@ dE?k1@0@؟AALv@@̿?-@rl@a@ ^?@&?5@X]@{JV@(P?Ԝ??fl@b?<ѻ@B @?@@]@?8V@ @E@uA!@@>I=V/@j_@7i@2?.H?fe@yh@@[?۷?@b@fZ9(J@"@ɴ@aG@Ֆ@/@v@NMi?o@#K@dM@_f٫@V@k?ɠ@w}@@=@!$A _A3@\a@F@@.w~C@iA6A0L6?zj?$@ F??ţ@r%@PA @@YLQ@?{p@KA7sA @@uZ@7@1o@ؿ@@v??@?@TK@@D;?lv?%@JtA 0@?% @c ?‘@͔@?A?!?>m@ALA տ_?"@2UA8 A8A8A8A8A8A8}A8eA8KA8 3A8"A8$A8%A8'A8)A8+A8-A8/fA81KA831A85A86A88A8:A8A8@qA8BTA8D9A8FA8GA8IA8KA&< n+?@\}@PM>=?!3?! GVVOR3>O?J?!?+9UF?@$'1?wfݿal·@;@5@ H??< #>?%1@Qf@4>mi@@]?C?K=5[P@aA@+3gC'CuS@\?޿H?'@*s?b9@CK;[#*1/g}?&ؿ_Կ>o??ܾ,p}?e??]:}W\ 8@!?*X>Qq?]?R~>Dy>e @ .Y>@? D=Un?D?[[9>wG?Mm+? @>@@"=p=>j߿o?(/?faR>>B߾b_?+ʼ'/8+>?>?!?+X>Ӯ?Xyr܎1?h?PU?"?zʿԾ5=2ҿ3>-@(P?]Q>4E6?+὿e}?Cv?[~p\@8nF>^V?/=+?oͿw>B? Q??!˽+=>>@"~?\l?*?B>N-o?*j?9W>U}>y?~X>:L?a?'?>9K;?@nnh?3i? Q7?g<>]=D [|?pl>Jũ@ %`@1EY?0>-6%>=?N`?V?+?G?(?x>;yU2qw?ԍ? ʵ?31>XvE?I?$|>@#6;?K?->ʿQi}>q?4Ȓc|cž7l??G\??6Ɣ? Y>Ks-?# D>P{?寏?>Α/? V? >"?9>Ag?cjV>K>o? ?ʾ?O$?,E=e>?@??nh?r?21?)=R???+(?>K>?з?f>uV?곾p?|v>A =o?(r>Q>Ns??l? C>]>e?f@>b>?E@uftM?R>d?j1A!>>ү(>~>ͿP*"r,j??[&? ?$2ؿ;H>=`6c? ?IU{A4,N[?2$>?@W?fs>6л?2>÷>7%>×. 0>#?4h>Pv>։??Q?Q>>P(=׹?@8w??$P{x>?\潸I?́ ? J5>~z?+)>3>x/оs??wJ o('=1?>J:.fKҾ{q>8루=$ >{>=? ϊ?s=t=?/^?>?wSI Ae[?g?e?$T;^k?#X>J?+=ӡxp?>+ێ>=]j4>??bV_??7yB>D}x>Q ??Q>Ó9>OPl=j=~?HĿ>F>Qֿо{%Z>4>y?|?=_V^>rZ6.>3Lɿ>J>&?X?~.bG6KAC>iO?!ڟ>h!Ͼ%>1=gi"<4k?,|?L?1>?>>;%@b?`?<[>!?_6b?Y?QH1???f?h]y?.?y6?'?Nl>(?tS=Z>C?_C>?#D?zy?Qٽ? ?l֧?!>Ix'Cɾom?Q>o`>>??@?? `?V>V?r?'?~?iO>y?=@l?;Nfp?q ?t?}?.P>M,??%?j>[|/X?u?d?bdX>z5?}i?[?u?>ǚ?;?j@~=??Zr> ???D?jS>=?pF ?i>e~>$? ?4\?\z?o0>,o??>"?&b?y1?m,?^?C?$ T?ju=`=N?N>M>?2?bug>g$=rf?(?; ?X?0b?'??"~?[y?c?ʴ?nJ?ɴ?Sf>jbL>z?S1?7>*?7#$?a2?V}?5 ?N??#+?!K?Q?s9??f(?E>W=>8>c?Ն?> m~="?;k?P[> >镾 >?>A>΍>ۢ??s-w?8==n?LZB>?Tx?>=D?1?>Y?P_,?s?r?za>,Ͼu>"?'?U?OK???>>?P?>ߩ#??z?3?d;?.?P ?P"??9Z?z??N??u|?`s?$? ???@I?4ǹ?}va? )>7n?e?07q?PG>Bp_>?$??">?dN?[?d>u?N?>̋>??3>HQ?>>;?Hϣ>3??Q?i>*m?NC>Z>?UvB?9j?7%?^?>MR?f?K?r^?³?>F>[>`?®?{?7g>>2?)?Y?&?>? $?d?T?$?9G?I=?B??6g?ƲM>E9?S? >?'X?"?>6?sx>1>??M^?>6i?73v?[y?]`_?A?>?w?3.>b%? ? ?j?O?-??m>?>(>?Zw?:; ?Q3?ݾB~? ?1?J%C?7??!a?*I?&X?ZC??>v+?EK-? |? >?x>*{?Ke?[?DE?v >H2? ?Y?H,?g? ?/>-?|6?\0?Q?ϰ?W>=???E'->>T??ɜ?>κ?hҽ۷=A?;?]<>wc*?Y72?=׽6\Q>K >*W>Qk?r@?=,???'o?s5>,?z?,~c?X5??1>9??V? 
e?I?u ?Ap?=?Mha?O>?L?NW?6?cr>m>KE?BJ??fe?2=??}?}??f?;>Ҧ>Z??I*?r0?r>qH??kD?*?g??E?ߒ>?[?`?p!?b1?>=j?Q?i}#?`?Dg?!@P?&V??~?2O?l?Z?X?Q???'?t?0O?>??3-? ~?8>U?:l>??(6'?8qT?fj?J??cL?AE?r1?C?hY? ?9.Y??a???zf??~?HQ?b?c?('m?/?t7?Г?e?~?-?H?|?|?$?,?@?o?>W? ?슛?e? 7? ?bP?k>4>>v?n2?+?o?u]??~S>pS?'?Z>ͧ?6?0?)5o>hz?JM?nC?>?~?5?e?Zq ?c?n?{? ?t#?;??us?Iy??X?/??q?%?RZ8?aO???"?ƣ?#?ޖ? ?W:?z?be?D p?8Q?j?h.?=??4<?>??3?[X?^c>?!?=?z>ƺ?-??h? ?W ?k?1>"\?J??7?c}?? ? ?~y???-?U?HF?qR?-?w?85?$l?I?rR?Jq?j??+?ݑ?Z:?p?%:?? ?SR?]?{?_?%??z\?B?y??ŕ?Xc}?? ?g ?;?P?]+?)Z??;A=*?2?H>??+?ߒ@4;?C??^7?B??s"??{?̱?d`?=E?"?B?;?{???ߺ>˨?" ?Af????\??Ǐ??F?'?;@?u?X9?zE?(z?Fr??;??M?8n0?^?ӏ?-?X(?.?????쩦?'?1?D?/??A7?'?~%??Jz?έ?Fj??’%?aQ??D?@#?8~?pU&?O@թ>XJ=r??|?_??K?:0?O`?ڲ?h6,??JK?PE?~?E?(?/?Gv?Y?w?bl??[???`g.??9@?L?R??¶ ?U?i;?E~?ף??҅~?|?"j?Ǭ?nZc?X?k?'?qN}?v?S?R ?v?,??PU?׬?g?\=?%?\?O??8X??`? N?K??L?T?fH?u?WL?2??O?g?rO?1?K?|??OO>??|,?w&_?o`?oM?_?{?Eo? ?]???{?H???{?=?U?Z??:??l?????*?? ??R"?F{?.?*?L?ʔ?9j??X??PHm?ED?}?vZ?ku?"?8?Uw?L?p"?H?g? ?f?5 ?{_%?m?s?]M??K#?(&?ʍ?5??Q4??f?;?7?w?Q{?P?#?#?% ???J?,?9ru@,K?ݴ'??h?w_? >ۣ?+?N[??܅?7?`;??? ?j?ҷ??|???ir?L? ?(?s?#?Q:?l_?N?OU??hH??볒?R ?A?0!??b@ 9?Ң?U?#R?9]?{?OJz?bZ? C?\ ??G$?4?ZQ?T?\W?Q??_ ?}?I?j?@ ?up?q?tct?S?\??ք?_?pl?+?>???ٙ?^?W ?h?;U? ??|L?U?A?A?q?ۻ?7??Z?+L?3?0?U-?Iv?L?Q?0?n?^?2?^kh?? ?P?̸?6??zŒ?E???M?ȉ?v??@5?Բ??q?s`?:???,?Lx?Ao??BM?rn?Ă1?0?h?U?K-?kW?7?h?? '?߇T?Ӝ?ھ???K?@ ?4??E:?: ??q??&[?0??H}???c?ز? ?3?я??O@@ac??DZM?sq??? ?ّ?.?d??Jt@-?ޥ'?s?&??:c@ +@?p?Ոx@S_@Y!?J?nA?@@ 2??w ??/C??qU?@i?8?b2?1?;@-(@@@ ?H?7g??[??厤?R??k??6?uM???{@y@}??s?ۣ?:|?`a?۪?i?D@ ?_?n?wY?"+?:?O?=?B@@(?@ ?q???@f?@&|@w@R@@??Z@?p?lj??k?(r?p-?Nz?z0?Y??p?)?B(?E?;T@5???=@ oq@ 6@Ih?p@?7'?@@ FJ@ @ @ Z?JN?@?@?x??@?m??A??ʰP?B?H?+z?Z?)?U?YU?p@?T?H?(?I9?]??Q??I?Ւ??@ #? ,?@@@?v ?9?T@?g?Z?b"??Ɯ#?u?(??#u??W?c???ߡ@?K2??!?r?b?̞?E"?ζ?Ӽb?@ f_?m'?7)?? ?{ ?????6?vl?u??8D?֔@V@@?6? \@ b@=@+o@ע??4?|v@ %@0[@.?/@'?Ӂ?⏣@@E? 
@ ?k4@ @J@@/@U @'@ˆ@ [s?@@@ͬ@@X?ܙ@??y!>ѥi??pp?<"??U?Z[?@6@?Q@/m!@@ @AN@ Z@@H9@@?c@ @n@)@Y@ L?x@6@C/W@1@D@1ú@f@G@J@7P@8}@%@)@p8@U@,E\@TP+@SH@)V@1g@G@VA@J&J@Ң@X1@L@>@S%@^S@AՊ@T@Ak@F>@\a@cd@n$@F4@C@N @LoY@S@z@e6@H@!q@@@@< h@HE@9@>@Y%;@h`@_(@=(@o@R@(@Cf@(E@A@9@J@T@8@>iv@Gf@>@;;@Gx@>@=@?l@>-@-@**@5@U%$@>l@0@V*@m@CZ@#J@8`@Q@O@J2@Jkj@4@8@F@]U@M@څ@,Ȭ@^$@@Y@I@MV3@&@X@@@M@J@>$@T|@ob@\]i@R@M@H@F9@C;:@\@5۔@'@'{@#I@W%@@+@1:{@;Q@[@5@ @7@jp@X2@8@Dn@?'@3@[`@lL@B@/@d@IQ@I@;@aO@:$@@NTR@J@as@d@+@C@Wv@,@ @7W@{@M#@Y.@Cm@F(@YF@_@>U@]'@/i@$=@"4"@S@7@d @/@~@h@5'}@3?D@W@@ @2H>?խ?B@1@Xt@0*@N@~fY@b(@\%P@K@n_@G@#n@Y@Wf@>@Df@'޲@9)@v@ @b@@@GR3@Q @oz@4@_@X0@95@Xh@P~@^R@Oy@BT@8g@0 <@Mv@Hv@/ @l@vU@w@+gK@#@-kO@M@X)p@?3@-2?A@/@D@/ @9M@9@h6@@D@~j@L>S@@w@W@] @w6@}{@X@v@cwN@CTF@Am&@P(x@Rd"@} @:ʍ@9_]@qf_@>@ ^@Z$@G@W@e@o@i\ @^@DZ@s@P@` @%,@Xxa@{a@b@H%@M@@R@`,5@9#@@@uv@w@KU@r@@b@K~@U@%e@EL?@w H@Gb@L@Lz@C6D@w @X@4@?9@Q|D@C+Z@_@s@a$8@D_@Bؒ@K@O@k@=+@(/@e@"p@@ͯ@WO@`V@Q1@9}@[@B@1k@@c@5@x>@>=@$y@3%0@(@$x@GZ@@2@p@6a@.6@\@S7Z@8@t_@Uw@."@niN@2@_@3@a@R@z6@e@l5@m@Nju@4@Qa@@@F$@pN@U@j]@l@@{@Z|@<@\@p@4b@M@9@u3@@g@PY@pʆ@k@@@S*@r/@"׺@JA@Lu@l@S@Y@@{ 2@X@@!@)@n~@h@gr@}=@Sc@z.4@q!r@i@@%<@`@^@nY@o@`h@Sl@O@M C@WS@Yr3@)uJ@"P@cB~@2U@D@Wi@f @f:S@]yv@Sa@_@{@f@@h3z@K@m~@P`@B{x@[@o@@|@^_0@Hp@r@N@@:W@@oW@i@l@y@pd@U@v@@'@kv@]@{U@}@a@@g]@@@@|@hR@oo@@K@1]@@~@@n@d؟@{h@@@`@V@?@)@-@@Plg@@j?@T@}i@ @i@@o@=7@G@W@@x@,@i@@D@@X@@@@Y@;@p@'@o@"@u@ m@m@o@@io@W@\@@q'@pS@coJ@^!@@Vz@@D@%@2@@@@0@@@O@D@@/@@xz@a@@zi@I@@`@@U=@b@@X4@m@\+@`c@t?@KA@l+w@M@R(@V@@V@@@|k@~@~C@@@mM@q@@]i@I@@@n @lY@[@ @i@x!@W@@!@'@j@9@x@[@-q@D@@ˏ@@@ j@@ӳ@@@if@@@@* @lA@@B@ A@_h@^Y@`s@^@q*@9@[@Yv@@@@r&@i@@%@@~@7@@Y@w@@j@|@l@]@@@@@r@ A@@|@ @R@ζ@];@ @*@3@@iA@k@@ZB@rk@k@=@'S@Dw@@@}@A@c@H@M@I@@s@w@ю@s@O@u7@@\@@zG@q@ U@@@F@@9@s@@V@7Y@ư@@@nd@L@c@@<@l@S@h @@3@@W@@@@f\@@a@t@@ѣ@s@n@$@2@@U@4@Զ@|@_*@ZD@=@;@x@@@_@h@@t@g@Q@1@@VW@@y@~@<@j@p@@( @\@@$@H@1@@I@*@@W%@@8#@s@ܥ@@e@@j@@d@>@`@@K@< @x@@,@M@6@@<@@+h@{@x$@@@O@@yA@(@\C@@3@@@@7@P-@@@3@@mi@J@.@@@8@@=B@?/@b@T@p;@ً@@8@A@@#@s@@n@@h@@I@>@@@Q@o/@@9@@@@e@-@R@)@R@X@6@F@y@N@K@y0@Xn@nE@W@@f@B@@|@z@6@V=@t_@xO@eB@r#@T.@@=@X@@w@AR@@@{@@(@@ 6@v@@@@H @a@@m@F@"@@xb@q@@ @@@ @N@d@P@i@@@@e@j@|@[@Tk@@|H@@@@PZ@@Z@@@d@@T@CX@@ ~@@Z@@@@s@E@>@Y;@@l@(@b@ @>@@O@G@@ @f@@@&@@ׯ@@w@W@@P@@{u@@@@@@p@6@Q@^@|@@@ћ@@ώ@@L@g@G@Vt@<@UJ@@n@z@a@ea@J@k@6d@>@@T@@@-(@Z@@@\v@ @v^@+@I@a@@@|@r@a@Y@"@-]@@@@H@ @@@5@@N@@CE@e@=@o@,@q@@q+@[@@)@@S@rK@@y,@@@!@~@+c@+@9@,@^@_@M@p@@p@ӫ@@l@x@K"@` @y@[@@W@@n@@U@@@h@@K@@@|@@W@{@@;@@YM@\@b@@L@z@M@r@@^@@@t@A@t@9@=@LU@u@@@Á@:@@@x @y@;@@d@'@@1t@@"@N@y@@@x@f@@wa@s@ @ш@Z@@@Z@@u!@[M@vj@z7@p!@?@]%@p @@}@K@`}@g@D@@@a@e@yh@k@d@cM@@@@@8 @}@.@@*0@x@@x@@:@@Y@8@@@@=@E*@J@%@@@F@g@_@j@t@@8>@G@C@@C@@@"@@l@l@@ʥ@*i@Mz@@K@M@@)@p@1A@@g@r|@@}@L@@@~uD@@T@a'@.@:@WL@u@2@E@@@@@]@:@9@S@ܕ@"k@na@d@$@5@@H@Ԇ@o@f@W@v@fe@Z!@?@t@@R@M@@V@p@s@R@.@@r@@d@}@#@@+@E@2@$@@P1@L@@@@7@>@t@x@@\@b@\+@@Pr@\@@y@@@MA@Jt@vc@{@Ұ@yP@ @j@i@@q@u>@:@@]@BK@@-p@ @=]@@M@]@J@,@y@^@!@f@o@p?@[[@@Q@s@B@3@@O@@@4/@2@(X@; @@O@@@9@0)@@t@#@=@@>*@@O@z@y@@q@@ūT@@T@ @%,@U@@ڒ@ .@@t@r@ Y@@[@A@@K@f@@h@LO@P@]@@@4F@u@29@@m@>@@@f@@B@"@^ @J@}@@@@9@ @@w@á8@/"@;@ S@U@@l@ȏ@s:@P׾@R@@=@p@?@8@'@v@@հ@@s@L@s@ˌ@@X@h1@@ @@@@4@@D@՚@h@@8@%c@*@oP@}M@@@N@,@F@@@ѫZ@o@@@@@+@@@@÷m@+@d`@@[@@c@Z@%@ @O@@Z@Ʌ@]@Ć @l@>@ @۬@@i@@f@@Z@@@ie@@te@@)@'@v4@@s @Gw@[{@@@"o@@Q@6"@!y@@W@@b@2@ӣe@I@D@K!@a@2@@@D @R@!@^@@@@@-@@Z@K@@8@,@h$@@-@@}@Ӭ@k@@ /@@N@ii@{W@e@@@A@@eu@@@Z@@w@I@G@@@@`3@0@@‘@@@r@@@@E@ @@_@)@è@(@D@+@S@@z@)@hT@+@:@0@4I@@N@@)@@c@@|@ @ё@@@ٷ@D@@P@,$@5@rO@\9@'@N@(@*@Ji@YE@ r@@@u@@Z@H@5@̩@,@l@݁@עP@1@̒@̱@<@@@f@@)@c@-@ @v@z@@7@>j@0@@@@(@# @q+@ 
[binary test data omitted]
pydl-0.7.0/pydl/pydlspec2d/tests/t/test_template_metadata.par0000644000076500000240000004407413064020340024747 0ustar weaverstaff00000000000000#
# Input spectra for computing Galaxy templates.
#
object gal          # Type of object.
method hmf          # Computation method, pca or hmf.
wavemin 1850        # Minimum rest-frame wavelength.
wavemax 10000       # Maximum rest-frame wavelength.
snmax 100           # Maximum S/N.
niter 20            # Number of iterations. For PCA, typically 10.
                    # For HMF, depends on the specific method.
nkeep 4             # Number of final templates.
minuse 10           # Used to mask and fill in bad values.
aesthetics mean     # Used to fill in bad or extrapolated values
                    # in combine1fiber.
run2d v5_7_0        # Used to set the RUN2D environment variable.
run1d v5_7_0        # Used to set the RUN1D environment variable.
epsilon -1.0        # For HMF, set the epsilon parameter.
nonnegative 0       # For HMF, set to 1 for non-negative.
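The keyword/value header above, together with the EIGENOBJ table that follows, is what ``template_metadata`` in ``pydl.pydlspec2d.spec1d`` consumes (see ``pydl/pydlspec2d/tests/test_spec1d.py`` later in this archive). A minimal sketch of that call; the file path is a placeholder for a local copy of this parameter file, and only the assertions taken from the test are guaranteed:

from pydl.pydlspec2d.spec1d import template_metadata

# Parse the Yanny-style parameter file: the header keywords are returned as a
# metadata dictionary, the EIGENOBJ rows as the list of input spectra (slist).
# 'test_template_metadata.par' is assumed to be a local copy of the file above.
slist, metadata = template_metadata('test_template_metadata.par')

assert metadata['object'] == 'gal'   # same check as in test_spec1d.py
assert not metadata['nonnegative']   # 0 in the file comes back as False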
typedef struct { int plate; int mjd; int fiberid; double zfit; } EIGENOBJ; EIGENOBJ 3587 55182 186 0.3595674932 EIGENOBJ 3587 55182 220 0.455476969481 EIGENOBJ 3587 55182 304 0.330913722515 EIGENOBJ 3587 55182 328 0.189603984356 EIGENOBJ 3587 55182 610 0.0599232017994 EIGENOBJ 3587 55182 678 0.231336131692 EIGENOBJ 3587 55182 682 0.0605525933206 EIGENOBJ 3587 55182 730 0.307728677988 EIGENOBJ 3587 55182 736 0.526875853539 EIGENOBJ 3587 55182 756 0.171211704612 EIGENOBJ 3587 55182 764 0.258239895105 EIGENOBJ 3587 55182 766 0.0138538554311 EIGENOBJ 3587 55182 768 0.601705789566 EIGENOBJ 3587 55182 780 0.204909265041 EIGENOBJ 3588 55184 208 0.785968124866 EIGENOBJ 3588 55184 236 0.598300695419 EIGENOBJ 3588 55184 340 0.0299468748271 EIGENOBJ 3588 55184 432 0.548194587231 EIGENOBJ 3588 55184 536 0.30774384737 EIGENOBJ 3588 55184 572 0.09692350775 EIGENOBJ 3588 55184 592 0.0804488956928 EIGENOBJ 3588 55184 604 0.014511690475 EIGENOBJ 3588 55184 752 0.0582232065499 EIGENOBJ 3588 55184 966 0.476659476757 EIGENOBJ 3589 55186 82 0.126676037908 EIGENOBJ 3589 55186 368 0.0191847495735 EIGENOBJ 3589 55186 374 0.0139490524307 EIGENOBJ 3589 55186 406 0.0819376334548 EIGENOBJ 3589 55186 478 0.0423591732979 EIGENOBJ 3589 55186 540 0.106723308563 EIGENOBJ 3589 55186 555 0.104113548994 EIGENOBJ 3589 55186 956 0.287074357271 EIGENOBJ 3590 55201 6 0.599426329136 EIGENOBJ 3590 55201 62 0.263417065144 EIGENOBJ 3590 55201 102 0.462622344494 EIGENOBJ 3590 55201 108 0.41305822134 EIGENOBJ 3590 55201 394 0.356303632259 EIGENOBJ 3590 55201 407 0.121644556522 EIGENOBJ 3590 55201 449 0.671972036362 EIGENOBJ 3590 55201 451 0.239336982369 EIGENOBJ 3590 55201 474 0.234340757132 EIGENOBJ 3590 55201 498 0.110753938556 EIGENOBJ 3590 55201 539 0.422447562218 EIGENOBJ 3590 55201 626 0.105540692806 EIGENOBJ 3590 55201 647 0.361006826162 EIGENOBJ 3590 55201 658 0.124056331813 EIGENOBJ 3590 55201 676 0.670863866806 EIGENOBJ 3590 55201 898 0.138964474201 EIGENOBJ 3590 55201 930 0.0725083276629 EIGENOBJ 3590 55201 954 0.0703747346997 EIGENOBJ 3606 55182 54 0.743202328682 EIGENOBJ 3606 55182 116 0.291299402714 EIGENOBJ 3606 55182 172 0.0422144904733 EIGENOBJ 3606 55182 180 0.162159234285 EIGENOBJ 3606 55182 210 0.242266073823 EIGENOBJ 3606 55182 224 0.261272788048 EIGENOBJ 3606 55182 255 0.243586674333 EIGENOBJ 3606 55182 546 0.0429921299219 EIGENOBJ 3606 55182 662 0.36246278882 EIGENOBJ 3606 55182 690 0.143404364586 EIGENOBJ 3606 55182 786 0.130908712745 EIGENOBJ 3606 55182 874 0.133746698499 EIGENOBJ 3609 55201 138 0.676006138325 EIGENOBJ 3609 55201 214 0.292647778988 EIGENOBJ 3609 55201 274 0.498313724995 EIGENOBJ 3609 55201 334 0.191940352321 EIGENOBJ 3609 55201 376 0.0415928885341 EIGENOBJ 3609 55201 386 0.0782536864281 EIGENOBJ 3609 55201 440 0.242778137326 EIGENOBJ 3609 55201 444 0.253037691116 EIGENOBJ 3609 55201 452 0.366684436798 EIGENOBJ 3609 55201 460 0.599260628223 EIGENOBJ 3609 55201 500 0.439788401127 EIGENOBJ 3609 55201 526 0.686094582081 EIGENOBJ 3609 55201 540 0.541150391102 EIGENOBJ 3609 55201 554 0.361750036478 EIGENOBJ 3609 55201 556 0.590501368046 EIGENOBJ 3609 55201 580 0.563466131687 EIGENOBJ 3609 55201 600 0.433407515287 EIGENOBJ 3609 55201 704 0.0415181703866 EIGENOBJ 3609 55201 712 0.498827934265 EIGENOBJ 3609 55201 790 0.0604337230325 EIGENOBJ 3609 55201 812 0.456652373075 EIGENOBJ 3609 55201 872 0.019769448787 EIGENOBJ 3609 55201 902 0.449065119028 EIGENOBJ 3615 55179 44 0.0544760040939 EIGENOBJ 3615 55179 94 0.233552291989 EIGENOBJ 3615 55179 234 0.379028499126 EIGENOBJ 3615 55179 286 0.0741298794746 
EIGENOBJ 3615 55179 328 0.0962169840932 EIGENOBJ 3615 55179 350 0.106441766024 EIGENOBJ 3615 55179 364 0.096172042191 EIGENOBJ 3615 55179 366 0.0955287441611 EIGENOBJ 3615 55179 378 0.573811411858 EIGENOBJ 3615 55179 384 0.072173371911 EIGENOBJ 3615 55179 398 0.405490189791 EIGENOBJ 3615 55179 464 0.501173973083 EIGENOBJ 3615 55179 526 0.113256692886 EIGENOBJ 3615 55179 529 0.346740663052 EIGENOBJ 3615 55179 540 0.188029557467 EIGENOBJ 3615 55179 607 0.305754303932 EIGENOBJ 3615 55179 613 0.164086073637 EIGENOBJ 3615 55179 808 0.164550572634 EIGENOBJ 3615 55179 854 0.698832035065 EIGENOBJ 3615 55179 912 0.485710799694 EIGENOBJ 3615 55208 42 0.0544746592641 EIGENOBJ 3615 55208 90 0.233550190926 EIGENOBJ 3615 55208 236 0.378973722458 EIGENOBJ 3615 55208 286 0.0741228759289 EIGENOBJ 3615 55208 336 0.0963061973453 EIGENOBJ 3615 55208 360 0.106406986713 EIGENOBJ 3615 55208 368 0.0961831361055 EIGENOBJ 3615 55208 374 0.573574900627 EIGENOBJ 3615 55208 378 0.0955336019397 EIGENOBJ 3615 55208 388 0.0721848532557 EIGENOBJ 3615 55208 396 0.405000418425 EIGENOBJ 3615 55208 468 0.501009106636 EIGENOBJ 3615 55208 525 0.346526145935 EIGENOBJ 3615 55208 532 0.113420948386 EIGENOBJ 3615 55208 540 0.187923535705 EIGENOBJ 3615 55208 607 0.305730313063 EIGENOBJ 3615 55208 609 0.164106249809 EIGENOBJ 3615 55208 808 0.164531633258 EIGENOBJ 3615 55208 852 0.698943674564 EIGENOBJ 3615 55208 904 0.486084878445 EIGENOBJ 3615 55445 48 0.0544515512884 EIGENOBJ 3615 55445 90 0.233550205827 EIGENOBJ 3615 55445 234 0.378881752491 EIGENOBJ 3615 55445 286 0.0741271078587 EIGENOBJ 3615 55445 324 0.0962622761726 EIGENOBJ 3615 55445 352 0.106376633048 EIGENOBJ 3615 55445 370 0.0961777791381 EIGENOBJ 3615 55445 372 0.0955613777041 EIGENOBJ 3615 55445 380 0.573788762093 EIGENOBJ 3615 55445 386 0.072188384831 EIGENOBJ 3615 55445 398 0.40524327755 EIGENOBJ 3615 55445 468 0.500844061375 EIGENOBJ 3615 55445 525 0.3466180861 EIGENOBJ 3615 55445 534 0.113392196596 EIGENOBJ 3615 55445 540 0.18774369359 EIGENOBJ 3615 55445 607 0.305750668049 EIGENOBJ 3615 55445 609 0.16414950788 EIGENOBJ 3615 55445 814 0.164589613676 EIGENOBJ 3615 55445 850 0.698752582073 EIGENOBJ 3615 55445 918 0.485884308815 EIGENOBJ 3615 55856 48 0.0544443912804 EIGENOBJ 3615 55856 92 0.233502492309 EIGENOBJ 3615 55856 234 0.378970891237 EIGENOBJ 3615 55856 284 0.0741146057844 EIGENOBJ 3615 55856 324 0.096275806427 EIGENOBJ 3615 55856 350 0.106433361769 EIGENOBJ 3615 55856 364 0.0961973890662 EIGENOBJ 3615 55856 368 0.0955100283027 EIGENOBJ 3615 55856 378 0.573617100716 EIGENOBJ 3615 55856 386 0.0721956193447 EIGENOBJ 3615 55856 396 0.405270129442 EIGENOBJ 3615 55856 468 0.500708460808 EIGENOBJ 3615 55856 527 0.346781492233 EIGENOBJ 3615 55856 532 0.187910825014 EIGENOBJ 3615 55856 534 0.113416872919 EIGENOBJ 3615 55856 609 0.305739820004 EIGENOBJ 3615 55856 611 0.164117679 EIGENOBJ 3615 55856 808 0.164597392082 EIGENOBJ 3615 55856 860 0.698767185211 EIGENOBJ 3615 55856 912 0.485851228237 EIGENOBJ 3615 56219 50 0.0544512979686 EIGENOBJ 3615 56219 94 0.233471944928 EIGENOBJ 3615 56219 234 0.378968924284 EIGENOBJ 3615 56219 294 0.0741345509887 EIGENOBJ 3615 56219 324 0.0962787494063 EIGENOBJ 3615 56219 354 0.106380023062 EIGENOBJ 3615 56219 368 0.0961789414287 EIGENOBJ 3615 56219 374 0.0955201238394 EIGENOBJ 3615 56219 376 0.57372289896 EIGENOBJ 3615 56219 392 0.0721727535129 EIGENOBJ 3615 56219 400 0.405341535807 EIGENOBJ 3615 56219 462 0.501057267189 EIGENOBJ 3615 56219 531 0.34678041935 EIGENOBJ 3615 56219 532 0.113428339362 EIGENOBJ 3615 56219 540 0.188013747334 
EIGENOBJ 3615 56219 607 0.305740833282 EIGENOBJ 3615 56219 609 0.164116919041 EIGENOBJ 3615 56219 667 0.5529743433 EIGENOBJ 3615 56219 810 0.164573282003 EIGENOBJ 3615 56219 856 0.698858261108 EIGENOBJ 3615 56219 910 0.486025452614 EIGENOBJ 3615 56544 48 0.0544789992273 EIGENOBJ 3615 56544 98 0.233525261283 EIGENOBJ 3615 56544 236 0.378944069147 EIGENOBJ 3615 56544 282 0.0741275027394 EIGENOBJ 3615 56544 324 0.0962580889463 EIGENOBJ 3615 56544 358 0.106429852545 EIGENOBJ 3615 56544 374 0.095530629158 EIGENOBJ 3615 56544 376 0.096171207726 EIGENOBJ 3615 56544 380 0.57365334034 EIGENOBJ 3615 56544 382 0.405382812023 EIGENOBJ 3615 56544 398 0.0721548870206 EIGENOBJ 3615 56544 468 0.500856816769 EIGENOBJ 3615 56544 542 0.113405421376 EIGENOBJ 3615 56544 555 0.346680551767 EIGENOBJ 3615 56544 560 0.187925457954 EIGENOBJ 3615 56544 621 0.164129272103 EIGENOBJ 3615 56544 633 0.305758774281 EIGENOBJ 3615 56544 647 0.553033947945 EIGENOBJ 3615 56544 826 0.164600163698 EIGENOBJ 3615 56544 878 0.698783814907 EIGENOBJ 3615 56544 910 0.485903769732 EIGENOBJ 3639 55205 24 0.0420226864517 EIGENOBJ 3639 55205 142 0.240575179458 EIGENOBJ 3639 55205 192 0.522146821022 EIGENOBJ 3639 55205 336 0.642267942429 EIGENOBJ 3639 55205 435 0.221920639277 EIGENOBJ 3639 55205 462 0.381433427334 EIGENOBJ 3639 55205 478 0.294745951891 EIGENOBJ 3639 55205 492 0.415953218937 EIGENOBJ 3639 55205 512 0.405309557915 EIGENOBJ 3639 55205 635 0.057222712785 EIGENOBJ 3639 55205 678 0.0545327216387 EIGENOBJ 3639 55205 712 0.284469932318 EIGENOBJ 3639 55205 724 0.0793297663331 EIGENOBJ 3639 55205 782 0.302917361259 EIGENOBJ 3639 55205 954 0.280054599047 EIGENOBJ 3647 55181 4 0.0544512495399 EIGENOBJ 3647 55181 54 0.233564212918 EIGENOBJ 3647 55181 236 0.379076600075 EIGENOBJ 3647 55181 286 0.0741154775023 EIGENOBJ 3647 55181 324 0.096211515367 EIGENOBJ 3647 55181 328 0.0961828753352 EIGENOBJ 3647 55181 350 0.10645198077 EIGENOBJ 3647 55181 368 0.0955170169473 EIGENOBJ 3647 55181 378 0.573761582375 EIGENOBJ 3647 55181 384 0.0721629187465 EIGENOBJ 3647 55181 394 0.405251771212 EIGENOBJ 3647 55181 462 0.50101852417 EIGENOBJ 3647 55181 480 0.188002750278 EIGENOBJ 3647 55181 526 0.346876859665 EIGENOBJ 3647 55181 535 0.113394767046 EIGENOBJ 3647 55181 604 0.305763989687 EIGENOBJ 3647 55181 608 0.164066821337 EIGENOBJ 3647 55181 808 0.164546087384 EIGENOBJ 3647 55181 850 0.698788642883 EIGENOBJ 3647 55181 908 0.485820025206 EIGENOBJ 3647 55241 8 0.0544696263969 EIGENOBJ 3647 55241 44 0.233543619514 EIGENOBJ 3647 55241 234 0.379065155983 EIGENOBJ 3647 55241 292 0.0741230770946 EIGENOBJ 3647 55241 332 0.0961823388934 EIGENOBJ 3647 55241 334 0.0962535813451 EIGENOBJ 3647 55241 344 0.10646687448 EIGENOBJ 3647 55241 368 0.0955124124885 EIGENOBJ 3647 55241 380 0.574059784412 EIGENOBJ 3647 55241 388 0.072167955339 EIGENOBJ 3647 55241 398 0.40549710393 EIGENOBJ 3647 55241 468 0.501337528229 EIGENOBJ 3647 55241 478 0.187981829047 EIGENOBJ 3647 55241 527 0.113337554038 EIGENOBJ 3647 55241 530 0.346751838923 EIGENOBJ 3647 55241 608 0.305751949549 EIGENOBJ 3647 55241 618 0.164077952504 EIGENOBJ 3647 55241 802 0.164566159248 EIGENOBJ 3647 55241 852 0.698825418949 EIGENOBJ 3647 55241 916 0.485761404037 EIGENOBJ 3647 55476 50 0.0544798858464 EIGENOBJ 3647 55476 88 0.233482867479 EIGENOBJ 3647 55476 236 0.379029661417 EIGENOBJ 3647 55476 284 0.0741332322359 EIGENOBJ 3647 55476 328 0.096176572144 EIGENOBJ 3647 55476 330 0.0962399840355 EIGENOBJ 3647 55476 348 0.106455460191 EIGENOBJ 3647 55476 364 0.095522634685 EIGENOBJ 3647 55476 378 0.573968589306 
EIGENOBJ 3647 55476 386 0.0721787139773 EIGENOBJ 3647 55476 400 0.405301004648 EIGENOBJ 3647 55476 478 0.18796429038 EIGENOBJ 3647 55476 480 0.500939011574 EIGENOBJ 3647 55476 552 0.346654355526 EIGENOBJ 3647 55476 557 0.113386534154 EIGENOBJ 3647 55476 624 0.305738508701 EIGENOBJ 3647 55476 630 0.1640997082 EIGENOBJ 3647 55476 826 0.164559110999 EIGENOBJ 3647 55476 864 0.698706269264 EIGENOBJ 3647 55476 910 0.485848486423 EIGENOBJ 3647 55827 48 0.0544545687735 EIGENOBJ 3647 55827 86 0.233516439795 EIGENOBJ 3647 55827 238 0.379057914019 EIGENOBJ 3647 55827 294 0.0741111934185 EIGENOBJ 3647 55827 322 0.0961670428514 EIGENOBJ 3647 55827 330 0.0962402820587 EIGENOBJ 3647 55827 352 0.106417857111 EIGENOBJ 3647 55827 366 0.0955108627677 EIGENOBJ 3647 55827 378 0.573973596096 EIGENOBJ 3647 55827 384 0.0721623376012 EIGENOBJ 3647 55827 396 0.405115574598 EIGENOBJ 3647 55827 462 0.500810086727 EIGENOBJ 3647 55827 480 0.187819138169 EIGENOBJ 3647 55827 530 0.346762537956 EIGENOBJ 3647 55827 533 0.11344178766 EIGENOBJ 3647 55827 604 0.305756896734 EIGENOBJ 3647 55827 608 0.164104238153 EIGENOBJ 3647 55827 808 0.164581939578 EIGENOBJ 3647 55827 852 0.698828399181 EIGENOBJ 3647 55827 910 0.485992103815 EIGENOBJ 3647 55945 8 0.0544647276402 EIGENOBJ 3647 55945 54 0.233553379774 EIGENOBJ 3647 55945 248 0.0741603076458 EIGENOBJ 3647 55945 274 0.37901481986 EIGENOBJ 3647 55945 284 0.0961729586124 EIGENOBJ 3647 55945 286 0.0962182283401 EIGENOBJ 3647 55945 328 0.0955245494843 EIGENOBJ 3647 55945 338 0.57392603159 EIGENOBJ 3647 55945 388 0.106475025415 EIGENOBJ 3647 55945 430 0.0721363797784 EIGENOBJ 3647 55945 436 0.405330955982 EIGENOBJ 3647 55945 490 0.501153171062 EIGENOBJ 3647 55945 500 0.187872990966 EIGENOBJ 3647 55945 506 0.346810698509 EIGENOBJ 3647 55945 515 0.113380581141 EIGENOBJ 3647 55945 572 0.1640881598 EIGENOBJ 3647 55945 576 0.305742025375 EIGENOBJ 3647 55945 768 0.164556711912 EIGENOBJ 3647 55945 812 0.698926091194 EIGENOBJ 3647 55945 946 0.485912948847 EIGENOBJ 3647 56219 44 0.0544597767293 EIGENOBJ 3647 56219 94 0.233532503247 EIGENOBJ 3647 56219 236 0.378963977098 EIGENOBJ 3647 56219 284 0.074140176177 EIGENOBJ 3647 56219 326 0.0961814373732 EIGENOBJ 3647 56219 328 0.0962324365973 EIGENOBJ 3647 56219 348 0.106373049319 EIGENOBJ 3647 56219 368 0.0955220013857 EIGENOBJ 3647 56219 376 0.573952913284 EIGENOBJ 3647 56219 384 0.0721831172705 EIGENOBJ 3647 56219 398 0.405394881964 EIGENOBJ 3647 56219 466 0.500729382038 EIGENOBJ 3647 56219 480 0.187687277794 EIGENOBJ 3647 56219 530 0.346789568663 EIGENOBJ 3647 56219 535 0.113536700606 EIGENOBJ 3647 56219 606 0.305758267641 EIGENOBJ 3647 56219 608 0.164115414023 EIGENOBJ 3647 56219 672 0.552938878536 EIGENOBJ 3647 56219 804 0.164585739374 EIGENOBJ 3647 56219 850 0.698761641979 EIGENOBJ 3647 56219 906 0.485815316439 EIGENOBJ 3647 56568 42 0.0544924847782 EIGENOBJ 3647 56568 88 0.233482077718 EIGENOBJ 3647 56568 234 0.379016786814 EIGENOBJ 3647 56568 288 0.0741531774402 EIGENOBJ 3647 56568 324 0.0962629765272 EIGENOBJ 3647 56568 332 0.096181191504 EIGENOBJ 3647 56568 356 0.106399953365 EIGENOBJ 3647 56568 368 0.0955087244511 EIGENOBJ 3647 56568 374 0.573679447174 EIGENOBJ 3647 56568 382 0.0721667483449 EIGENOBJ 3647 56568 400 0.405023157597 EIGENOBJ 3647 56568 466 0.187932342291 EIGENOBJ 3647 56568 470 0.501118600368 EIGENOBJ 3647 56568 525 0.113382048905 EIGENOBJ 3647 56568 534 0.346748709679 EIGENOBJ 3647 56568 604 0.305744081736 EIGENOBJ 3647 56568 608 0.164115265012 EIGENOBJ 3647 56568 668 0.553064227104 EIGENOBJ 3647 56568 692 0.477744013071 
EIGENOBJ 3647 56568 806 0.164614424109 EIGENOBJ 3647 56568 856 0.698960840702 EIGENOBJ 3647 56568 908 0.485892236233 EIGENOBJ 3647 56596 42 0.0544780306518 EIGENOBJ 3647 56596 96 0.233589068055 EIGENOBJ 3647 56596 236 0.379069149494 EIGENOBJ 3647 56596 282 0.0741503462195 EIGENOBJ 3647 56596 322 0.0961421951652 EIGENOBJ 3647 56596 332 0.0962692275643 EIGENOBJ 3647 56596 350 0.106383480132 EIGENOBJ 3647 56596 366 0.0955390483141 EIGENOBJ 3647 56596 376 0.574318349361 EIGENOBJ 3647 56596 390 0.0721655935049 EIGENOBJ 3647 56596 392 0.405283272266 EIGENOBJ 3647 56596 468 0.500954151154 EIGENOBJ 3647 56596 480 0.187990143895 EIGENOBJ 3647 56596 527 0.113437004387 EIGENOBJ 3647 56596 528 0.34671330452 EIGENOBJ 3647 56596 604 0.30575928092 EIGENOBJ 3647 56596 608 0.164102524519 EIGENOBJ 3647 56596 686 0.477754086256 EIGENOBJ 3647 56596 814 0.164571017027 EIGENOBJ 3647 56596 850 0.698882818222 EIGENOBJ 3647 56596 912 0.48597240448 EIGENOBJ 3650 55244 308 0.0388687700033 EIGENOBJ 3650 55244 340 0.0386969409883 EIGENOBJ 3650 55244 536 0.782500088215 EIGENOBJ 3650 55244 600 0.24555696547 EIGENOBJ 3651 55247 483 0.384492307901 EIGENOBJ 3651 55247 515 0.276422828436 EIGENOBJ 3651 55247 527 0.216948509216 EIGENOBJ 3651 55247 537 0.296560764313 EIGENOBJ 3735 55209 10 0.0741894543171 EIGENOBJ 3735 55209 84 0.228548258543 EIGENOBJ 3735 55209 186 0.502200067043 EIGENOBJ 3735 55209 224 0.295845091343 EIGENOBJ 3735 55209 291 0.245919138193 EIGENOBJ 3735 55209 497 0.691471099854 EIGENOBJ 3735 55209 498 0.54198038578 EIGENOBJ 3735 55209 500 0.221394315362 EIGENOBJ 3735 55209 553 0.201369032264 EIGENOBJ 3735 55209 600 0.840697169304 EIGENOBJ 3735 55209 656 0.119228735566 EIGENOBJ 3735 55209 772 0.298009634018 EIGENOBJ 3735 55209 785 0.64361333847 EIGENOBJ 3735 55209 844 0.564572691917 EIGENOBJ 3735 55209 882 0.153093069792 EIGENOBJ 3735 55209 902 0.121742628515 EIGENOBJ 3735 55209 994 0.267185181379 EIGENOBJ 3736 55214 58 0.176775500178 EIGENOBJ 3736 55214 112 0.0443794690073 EIGENOBJ 3736 55214 140 0.109699942172 EIGENOBJ 3736 55214 144 0.0764242857695 EIGENOBJ 3736 55214 168 0.0389341637492 EIGENOBJ 3736 55214 352 0.0336177386343 EIGENOBJ 3736 55214 396 0.199927717447 EIGENOBJ 3736 55214 398 0.664975821972 EIGENOBJ 3736 55214 422 0.640958607197 EIGENOBJ 3736 55214 440 0.0452782809734 EIGENOBJ 3736 55214 482 0.339782655239 EIGENOBJ 3736 55214 588 0.10633675009 EIGENOBJ 3736 55214 712 0.465647995472 EIGENOBJ 3736 55214 719 0.413004368544 EIGENOBJ 3736 55214 946 0.0328998342156 EIGENOBJ 3736 55214 958 0.064796641469 EIGENOBJ 3744 55209 234 0.493817925453 EIGENOBJ 3744 55209 284 0.0546493791044 EIGENOBJ 3744 55209 316 0.361012130976 EIGENOBJ 3744 55209 368 0.0549891069531 EIGENOBJ 3744 55209 370 0.366891771555 EIGENOBJ 3744 55209 494 0.335478156805 EIGENOBJ 3744 55209 498 0.247286885977 EIGENOBJ 3744 55209 506 0.198596671224 EIGENOBJ 3744 55209 522 0.477695584297 EIGENOBJ 3744 55209 523 0.0715539082885 EIGENOBJ 3744 55209 544 0.305706828833 EIGENOBJ 3744 55209 561 0.0252247527242 EIGENOBJ 3744 55209 616 0.396595001221 EIGENOBJ 3744 55209 802 0.814275026321 EIGENOBJ 3744 55209 852 0.493357479572 EIGENOBJ 3745 55234 266 0.0443090349436 EIGENOBJ 3745 55234 302 0.261240541935 EIGENOBJ 3745 55234 366 0.0996090769768 EIGENOBJ 3745 55234 440 0.548297226429 EIGENOBJ 3745 55234 609 0.0677964612842 EIGENOBJ 3745 55234 728 0.612909734249 EIGENOBJ 3745 55234 764 0.388154536486 EIGENOBJ 3745 55234 800 0.0280731134117 pydl-0.7.0/pydl/pydlspec2d/tests/t/filter_thru_idl_data.txt0000644000076500000240000000260512632466352024455 
0ustar weaverstaff00000000000000# Test data for use with test_filter_thru. 14.1224, 0.127368, 0.508305, 0.293359, 0.381092, 2.51200, -0.0306516, 0.298414, 0.185329, 1.11159, 0.166036, 1.08602, 0.162873, 0.529303, 0.290425, -0.0399243, 0.209576, 0.778746, 0.578854, 0.429100 18.3209, 0.0740831, 0.531655, 0.943787, 0.558309, 7.58337, 0.209387, 0.861768, 0.275096, 1.88644, 0.307924, 4.10726, 0.276175, 0.732941, 0.336644, 0.0456389, 0.725353, 2.25312, 0.413612, 1.08603 14.4588, -0.0113344, 1.15476, 3.76573, 1.35594, 17.3931, 0.647237, 2.96833, 1.00745, 1.46273, 0.815733, 11.2539, 0.957882, 2.57075, 0.821274, 0.162416, 2.08590, 5.34149, 1.32573, 3.67932 10.4319, -0.0250982, 1.83043, 5.57500, 1.64166, 19.9340, 1.18961, 3.89914, 1.83561, 0.884010, 1.22140, 13.7641, 1.66350, 3.35640, 1.41506, 0.105699, 3.04802, 6.47299, 2.31518, 5.10352 7.59458, 0.0395837, 2.31145, 6.42070, 2.04900, 20.3303, 1.56850, 4.25935, 2.38152, 0.984143, 1.51820, 14.4626, 2.01851, 3.74417, 1.86966, 0.133837, 3.53355, 6.86765, 2.98000, 5.82073 pydl-0.7.0/pydl/pydlspec2d/tests/test_spec2d.py0000644000076500000240000000670513064020340022056 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np import os from astropy.io import fits from astropy.tests.helper import raises from astropy.utils.data import get_pkg_data_filename from ..spec2d import aesthetics, combine1fiber, filter_thru from .. import Pydlspec2dException class TestSpec2d(object): """Test the functions in pydl.pydlspec2d.spec2d. """ def setup(self): self.env = {'BOSS_SPECTRO_REDUX': '/boss/spectro/redux', 'SPECTRO_REDUX': '/sdss/spectro/redux', 'RUN2D': 'v1_2_3', 'RUN1D': 'v1_2_3'} self.original_env = dict() for key in self.env: if key in os.environ: self.original_env[key] = os.environ[key] else: self.original_env[key] = None os.environ[key] = self.env[key] def teardown(self): for key in self.original_env: if self.original_env[key] is None: del os.environ[key] else: os.environ[key] = self.original_env[key] def test_aesthetics(self): np.random.seed(137) flux = np.random.rand(100) ivar = np.random.rand(100) # # No bad # f = aesthetics(flux, ivar) assert (f == flux).all() # # Bad points # ivar[ivar < 0.1] = 0.0 # # Bad method # with raises(Pydlspec2dException): f = aesthetics(flux, ivar, 'badmethod') # # Nothing # f = aesthetics(flux, ivar, 'nothing') assert (f == flux).all() def test_combine1fiber(self): pass def test_filter_thru(self): fname = get_pkg_data_filename('t/spPlate-4055-55359-0020.fits') with fits.open(fname) as hdulist: flux = hdulist[0].data npix = hdulist[0].header['NAXIS1'] ntrace = hdulist[0].header['NAXIS2'] crval1 = hdulist[0].header['COEFF0'] cd1_1 = hdulist[0].header['COEFF1'] assert flux.shape == (ntrace, npix) loglam0 = crval1 + cd1_1*np.arange(npix, dtype=flux.dtype) waveimg = 10**(np.tile(loglam0, 20).reshape(flux.shape)) assert waveimg.shape == flux.shape f = filter_thru(flux, waveimg=waveimg) idl_data_file = get_pkg_data_filename('t/filter_thru_idl_data.txt') idl_data = np.loadtxt(idl_data_file, dtype='f', delimiter=',').T assert f.shape == (20, 5) assert np.allclose(f, idl_data, atol=1.0e-6) # # Test bad input. # with raises(ValueError): f = filter_thru(flux) with raises(ValueError): f = filter_thru(flux, waveimg=waveimg, filter_prefix='sdss') return def prepare_data(): """Convert full spPlate file into a test-sized version. 
""" nTrace = 20 spPlate = os.path.join(os.getenv('HOME'), 'Downloads', 'spPlate-4055-55359.fits') spPlateOut = os.path.join(os.path.dirname(__file__), 't', 'spPlate-4055-55359-0020.fits') with fits.open(spPlate) as hdulist: newhdu = fits.PrimaryHDU(hdulist[0].data[0:20, :], header=hdulist[0].header) newhdulist = fits.HDUList([newhdu]) newhdulist.writeto(spPlateOut) return 0 if __name__ == '__main__': from sys import exit exit(prepare_data()) pydl-0.7.0/pydl/pydlspec2d/tests/test_spec1d.py0000644000076500000240000001056413434104050022055 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np import os from astropy.tests.helper import raises from astropy.utils.data import get_pkg_data_filename from .. import Pydlspec2dException from ..spec1d import (HMF, findspec, log, spec_append, spec_path, template_metadata, wavevector) class TestSpec1d(object): """Test the functions in pydl.pydlspec2d.spec1d. """ def setup(self): self.env = {'BOSS_SPECTRO_REDUX': '/boss/spectro/redux', 'SPECTRO_REDUX': '/sdss/spectro/redux', 'RUN2D': 'v1_2_3', 'RUN1D': 'v1_2_3'} self.original_env = dict() for key in self.env: if key in os.environ: self.original_env[key] = os.environ[key] else: self.original_env[key] = None os.environ[key] = self.env[key] def teardown(self): for key in self.original_env: if self.original_env[key] is None: del os.environ[key] else: os.environ[key] = self.original_env[key] def test_findspec(self): """This is just a placeholder for now. """ # slist = findspec(infile='file.in', sdss=True) assert True def test_hmf_init(self): """Test initialization of HMF object """ spec = np.random.random((20, 100)) invvar = np.random.random((20, 100)) hmf = HMF(spec, invvar) assert hmf.K == 4 assert log.level == 20 # INFO hmf = HMF(spec, invvar, K=6, verbose=True) assert hmf.K == 6 assert log.level == 10 # DEBUG def test_spec_append(self): spec1 = np.array([[1, 1, 1, 1, 1], [1, 1, 1, 1, 1]]) spec2 = np.array([[2, 2, 2, 2, 2], [2, 2, 2, 2, 2]]) s = spec_append(spec1, spec2) assert (s == np.array([[1, 1, 1, 1, 1], [1, 1, 1, 1, 1], [2, 2, 2, 2, 2], [2, 2, 2, 2, 2]])).all() spec2 = np.array([[2, 2, 2, 2], [2, 2, 2, 2]]) s = spec_append(spec1, spec2) assert (s == np.array([[1, 1, 1, 1, 1], [1, 1, 1, 1, 1], [2, 2, 2, 2, 0], [2, 2, 2, 2, 0]])).all() s = spec_append(spec1, spec2, 1) assert (s == np.array([[1, 1, 1, 1, 1], [1, 1, 1, 1, 1], [0, 2, 2, 2, 2], [0, 2, 2, 2, 2]])).all() spec1 = np.array([[1, 1, 1], [1, 1, 1]]) spec2 = np.array([[2, 2, 2, 2, 2], [2, 2, 2, 2, 2]]) s = spec_append(spec1, spec2, -2) assert (s == np.array([[0, 0, 1, 1, 1], [0, 0, 1, 1, 1], [2, 2, 2, 2, 2], [2, 2, 2, 2, 2]])).all() def test_spec_path(self): bsr = self.env['BOSS_SPECTRO_REDUX'] run2d = self.env['RUN2D'] p = spec_path(123) assert p[0] == os.path.join(bsr, run2d, '0123') p = spec_path(1234) assert p[0] == os.path.join(bsr, run2d, '1234') p = spec_path(1234, topdir=bsr, run2d=run2d) assert p[0] == os.path.join(bsr, run2d, '1234') p = spec_path(np.array([1234, 5678]), topdir=bsr, run2d=run2d) assert p[0] == os.path.join(bsr, run2d, '1234') assert p[1] == os.path.join(bsr, run2d, '5678') p = spec_path(1234, path=bsr) assert p[0] == bsr def test_template_metadata(self): with raises(Pydlspec2dException): slist, metadata = template_metadata('/no/such/file.par') inputfile = get_pkg_data_filename('t/test_template_metadata.par') slist, metadata = template_metadata(inputfile) assert metadata['object'] == 'gal' assert not metadata['nonnegative'] def 
test_wavevector(self): l = wavevector(3, 4, binsz=0.1) ll = np.array([3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4.0]) assert np.allclose(l, ll) l = wavevector(3, 4, wavemin=3, binsz=0.1) ll = np.array([3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4.0]) assert np.allclose(l, ll) pydl-0.7.0/pydl/pydlspec2d/__init__.py0000644000076500000240000000123212671060606020236 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This subpackage implements functions from the idlspec2d package. """ # # Define this early on so that submodules can use it # from .. import PydlException import astropy.utils.exceptions as aue class Pydlspec2dException(PydlException): """Exceptions raised by :mod:`pydl.pydlspec2d` that don't fit into a standard exception class like :exc:`ValueError`. """ pass class Pydlspec2dUserWarning(aue.AstropyUserWarning): """Class for warnings issued by :mod:`pydl.pydlspec2d`. """ pass __all__ = ['Pydlspec2dException', 'Pydlspec2dUserWarning'] pydl-0.7.0/pydl/_astropy_init.py0000644000076500000240000000374713434104050017314 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst __all__ = ['__version__', '__githash__'] # this indicates whether or not we are in the package's setup.py try: _ASTROPY_SETUP_ except NameError: from sys import version_info if version_info[0] >= 3: import builtins else: import __builtin__ as builtins builtins._ASTROPY_SETUP_ = False try: from .version import version as __version__ except ImportError: __version__ = '' try: from .version import githash as __githash__ except ImportError: __githash__ = '' if not _ASTROPY_SETUP_: # noqa import os from warnings import warn from astropy.config.configuration import ( update_default_config, ConfigurationDefaultMissingError, ConfigurationDefaultMissingWarning) # Create the test function for self test from astropy.tests.runner import TestRunner test = TestRunner.make_test_runner_in(os.path.dirname(__file__)) test.__test__ = False __all__ += ['test'] # add these here so we only need to cleanup the namespace at the end config_dir = None if not os.environ.get('ASTROPY_SKIP_CONFIG_UPDATE', False): config_dir = os.path.dirname(__file__) config_template = os.path.join(config_dir, __package__ + ".cfg") if os.path.isfile(config_template): try: update_default_config( __package__, config_dir, version=__version__) except TypeError as orig_error: try: update_default_config(__package__, config_dir) except ConfigurationDefaultMissingError as e: wmsg = (e.args[0] + " Cannot install default profile. If you are " "importing from source, this is expected.") warn(ConfigurationDefaultMissingWarning(wmsg)) del e except Exception: raise orig_error pydl-0.7.0/pydl/tests/0000755000076500000240000000000013434104632015214 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/tests/__init__.py0000644000076500000240000000020212135272325017321 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This is the pydl/tests directory. """ pydl-0.7.0/pydl/tests/t/0000755000076500000240000000000013434104632015457 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/tests/t/pcomp_data.txt0000644000076500000240000000367712664715445020362 0ustar weaverstaff00000000000000# Data for testing the pcomp class. 
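``pcomp_data.txt`` below bundles a 20 by 4 input array with reference outputs (derived values, coefficients, eigenvalues and variances) computed with the IDL version of PCOMP; ``pydl/tests/test_pydl.py`` further down compares ``pydl.pcomp.pcomp`` against them. A minimal usage sketch, with random stand-in data of the same shape rather than the values in the file:

import numpy as np
from pydl.pcomp import pcomp

# Stand-in for the 20 observations of 4 variables stored in pcomp_data.txt.
data = np.random.random((20, 4))

# Subtract the column means, then run the covariance-based decomposition,
# mirroring the calls made in test_pydl.py.
centered = data - np.tile(data.mean(0), 20).reshape(data.shape)
pc = pcomp(centered, covariance=True)

print(pc.derived.shape)       # (20, 4): data projected onto the components
print(pc.coefficients.shape)  # (4, 4): principal-component coefficients
print(pc.eigenvalues)         # eigenvalues of the covariance matrix
print(pc.variance)            # fraction of the total variance per component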
# input data 19.5, 43.1, 29.1, 11.9 24.7, 49.8, 28.2, 22.8 30.7, 51.9, 37.0, 18.7 29.8, 54.3, 31.1, 20.1 19.1, 42.2, 30.9, 12.9 25.6, 53.9, 23.7, 21.7 31.4, 58.5, 27.6, 27.1 27.9, 52.1, 30.6, 25.4 22.1, 49.9, 23.2, 21.3 25.5, 53.5, 24.8, 19.3 31.1, 56.6, 30.0, 25.4 30.4, 56.7, 28.3, 27.2 18.7, 46.5, 23.0, 11.7 19.7, 44.2, 28.6, 17.8 14.6, 42.7, 21.3, 12.8 29.5, 54.4, 30.1, 23.9 27.7, 55.3, 25.7, 22.6 30.2, 58.6, 24.6, 25.4 22.7, 48.2, 27.1, 14.8 25.2, 51.0, 27.5, 21.1 # derived values -107.377 , 13.3986 , -1.40521 , -0.0324538 3.20342, 0.699964, 5.95396 , -0.0154266 32.4969 , 38.6573 , -3.87102 , 0.00906474 40.8858 , 13.7870 , -4.98349 , -0.00512299 -107.236 , 19.3568 , 1.76589 , 0.0231152 18.4284 , -17.1468 , -1.46805 , -0.00318355 99.8885 , -6.22849 , 0.130080 , 0.0166338 45.3808 , 8.11078 , 6.53065 , -0.0114850 -21.3101 , -18.3147 , 3.75132 , -0.0133497 5.54411, -11.1707 , -4.51918 , 0.0224445 83.1352 , 4.96944 , 0.0912485, 0.0104372 87.1055 , -3.16057 , 2.81444 , 0.00460935 -101.319 , -11.7835 , -6.11614 , 0.00658332 -73.0708 , 6.23879 , 6.60536 , 0.0215680 -137.017 , -19.0988 , 1.32969 , 0.0120680 57.1149 , 6.95618 , 0.839940 , -0.00950313 42.1264 , -10.0721 , -2.14432 , 0.00948831 83.3046 , -16.6907 , -2.71600 , -0.0132752 -54.1269 , 2.55509 , -4.21364 , -0.0251960 2.84273, -1.06368 , 1.62462 , -0.00701279 # coefficients 4.87988 , 5.05684 , 1.02824 , 4.79357 1.01466 , -0.954475 , 3.48852 , -0.774333 -0.618291 , -0.955430 , 0.269045 , 1.57962 -0.0900205, 0.0751850, 0.0472409, 0.00219369 # eigenvalues 73.4205, 14.7100, 3.86270, 0.0159930 # variance 0.797969, 0.159875, 0.0419817, 0.000173819 # eigenvalues of standardized data 2.99966981, 1.05397516, 1.56189601e-01, 6.91743392e-04 # variance of standardized data 7.12421581e-01, 2.50319100e-01, 3.70950302e-02, 1.64289056e-04 pydl-0.7.0/pydl/tests/t/this-file-contains-42-lines.txt.gz0000644000076500000240000000016412135272325023675 0ustar weaverstaff000000000000007Qthis-file-contains-42-lines.txt 0MmB@„0\ 5:hE-.,l?_49d%,]hyupydl-0.7.0/pydl/tests/t/this-file-contains-42-lines.txt0000644000076500000240000000016512135272325023257 0ustar weaverstaff000000000000001 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 pydl-0.7.0/pydl/tests/t/this-file-contains-137-lines.txt0000644000076500000240000000067012135272325023345 0ustar weaverstaff000000000000001 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125 126 127 128 129 130 131 132 133 134 135 136 137 pydl-0.7.0/pydl/tests/t/this-file-contains-137-lines.txt.gz0000644000076500000240000000041312135272325023757 0ustar weaverstaff00000000000000Q7Qthis-file-contains-137-lines.txtAY;+@$NfX7/F8DE>>=>_dBt)].He=B9q<].6+,Ppydl-0.7.0/pydl/tests/t/smooth_data.txt0000644000076500000240000000163012632466352020533 0ustar weaverstaff000000000000002.30610 -0.606922 -0.495611 0.319969 -0.852573 -0.718530 -0.822457 -0.914747 -2.06190 -0.367282 -1.07346 0.333589 -0.171405 1.29557 1.04241 0.745161 1.12171 -1.68132 1.33361 1.09108 1.16230 -0.663734 0.819509 -0.0675587 -1.30818 1.65162 1.54414 -0.801843 1.55333 -0.0529722 -0.535589 0.424731 -1.01041 1.78723 0.625269 -0.770034 0.837806 -1.29704 1.84804 0.331175 -0.0432724 
0.430578 0.785313 -0.243586 -0.874319 -1.18494 0.204993 -0.620158 1.51457 1.12941 0.776386 2.25176 1.75678 0.327156 0.530501 0.648502 0.0279251 -0.395409 -0.341345 0.339561 0.414343 -0.513183 1.23809 -1.44090 -1.94348 -0.443420 0.483545 -0.408526 -0.0689184 -0.568676 1.89727 -1.54532 1.16730 0.898135 -1.21551 -0.525567 -0.0766497 1.47355 -0.735265 0.772411 0.143568 0.0425026 1.27208 0.499760 0.718399 0.577398 1.79356 -0.237896 -1.87300 1.95981 -0.0884865 -0.872654 0.133105 -1.24190 0.465836 -0.210268 0.675163 -1.54661 -0.993406 -0.0257759 pydl-0.7.0/pydl/tests/t/this-file-contains-1-lines.txt0000644000076500000240000000000212135272325023160 0ustar weaverstaff000000000000001 pydl-0.7.0/pydl/tests/t/README.rst0000644000076500000240000000011012135272325017140 0ustar weaverstaff00000000000000This directory contains files used by the tests in the directory above. pydl-0.7.0/pydl/tests/t/this-file-is-empty.txt0000644000076500000240000000000012135272325021641 0ustar weaverstaff00000000000000pydl-0.7.0/pydl/tests/t/this-file-contains-1-lines.txt.gz0000644000076500000240000000006512135272325023610 0ustar weaverstaff000000000000007Qthis-file-contains-1-lines.txt3SQgpydl-0.7.0/pydl/tests/coveragerc0000644000076500000240000000127712403670051017264 0ustar weaverstaff00000000000000[run] source = pydl omit = pydl/_astropy_init* pydl/conftest* pydl/cython_version* pydl/setup_package* pydl/*/setup_package* pydl/*/*/setup_package* pydl/tests/* pydl/*/tests/* pydl/*/*/tests/* pydl/version* [report] exclude_lines = # Have to re-enable the standard pragma pragma: no cover # Don't complain about packages we have installed except ImportError # Don't complain if tests don't hit assertions raise AssertionError raise NotImplementedError # Don't complain about script hooks def main\(.*\): # Ignore branches that don't pertain to this version of Python pragma: py{ignore_python_version} six.PY{ignore_python_version} pydl-0.7.0/pydl/tests/test_pydl.py0000644000076500000240000002001213272413172017572 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np import glob try: from astropy.tests.compat import assert_allclose except ImportError: from numpy.testing.utils import assert_allclose from astropy.tests.helper import raises from astropy.utils.data import get_pkg_data_filename from os.path import basename, dirname, join from ..file_lines import file_lines from ..median import median from ..pcomp import pcomp from ..rebin import rebin from ..smooth import smooth from ..uniq import uniq class TestPydl(object): """Test the top-level pydl functions. 
""" def setup(self): pass def teardown(self): pass def test_file_lines(self): # # Find the test files # line_numbers = (1, 42, 137) plainfiles = [get_pkg_data_filename('t/this-file-contains-{0:d}-lines.txt'.format(l)) for l in line_numbers] gzfiles = [get_pkg_data_filename('t/this-file-contains-{0:d}-lines.txt.gz'.format(l)) for l in line_numbers] for i, p in enumerate(plainfiles): n = file_lines(p) assert n == line_numbers[i] for i, p in enumerate(gzfiles): n = file_lines(p, compress=True) assert n == line_numbers[i] # # Test list passing # n = file_lines(plainfiles) assert tuple(n) == line_numbers n = file_lines(gzfiles, compress=True) assert tuple(n) == line_numbers # # Make sure empty files work # n = file_lines(get_pkg_data_filename('t/this-file-is-empty.txt')) assert n == 0 def test_median(self): odd_data = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13], dtype=np.float32) even_data = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], dtype=np.float32) assert median(odd_data) == 7 assert median(odd_data, even=True) == 7 assert median(even_data) == 7 assert median(even_data, even=True) == 6.5 assert (median(odd_data, 3) == odd_data).all() with raises(ValueError): foo = median(np.ones((9, 9, 9)), 3) odd_data2 = np.vstack((odd_data, odd_data, odd_data, odd_data, odd_data)) assert (median(odd_data2, 3) == odd_data2).all() assert (median(odd_data2, axis=0) == odd_data).all() assert (median(odd_data2, axis=1) == 7*np.ones((odd_data2.shape[0],), dtype=odd_data2.dtype)).all() def test_pcomp(self): test_data_file = get_pkg_data_filename('t/pcomp_data.txt') test_data = np.loadtxt(test_data_file, dtype='d', delimiter=',') with raises(ValueError): foo = pcomp(np.arange(10)) pcomp_data = test_data[0:20, :] m = 4 n = 20 means = np.tile(pcomp_data.mean(0), n).reshape(pcomp_data.shape) newarray = pcomp_data - means foo = pcomp(newarray, covariance=True) # # This array is obtained from the IDL version of PCOMP. # It is only accurate up to an overall sign on each column. # derived = test_data[20:40, :] for k in range(m): assert_allclose(abs(foo.derived[:, k]), abs(derived[:, k]), 1e-4) coefficients = test_data[40:44, :] coefficientsT = coefficients.T for k in range(m): assert_allclose(abs(foo.coefficients[:, k]), abs(coefficientsT[:, k]), 1e-4) eigenvalues = test_data[44, :] assert_allclose(foo.eigenvalues, eigenvalues, 1e-4) variance = test_data[45, :] assert_allclose(foo.variance, variance, 1e-4) # # Test the standardization. # foo = pcomp(pcomp_data, standardize=True, covariance=True) # for k in range(m): # assert_allclose(abs(foo.derived[:, k]), abs(derived[:, k]), 1e-4) # for k in range(m): # assert_allclose(abs(foo.coefficients[:, k]), # abs(coefficientsT[:, k]), # 1e-4) eigenvalues = test_data[46, :] assert_allclose(foo.eigenvalues, eigenvalues, 1e-4) variance = test_data[47, :] assert_allclose(foo.variance, variance, 1e-4) # assert_allclose(foo.derived[0, :], np.array([-1.64153312, # -9.12322038, # 1.41790708, # -8.29359322])) # # Make sure correlation is working at least. 
# foo = pcomp(pcomp_data, standardize=True) assert_allclose(foo.eigenvalues, np.array([2.84968632e+00, 1.00127640e+00, 1.48380121e-01, 6.57156222e-04])) assert_allclose(foo.variance, np.array([7.12421581e-01, 2.50319100e-01, 3.70950302e-02, 1.64289056e-04])) def test_rebin(self): x = np.arange(40) with raises(ValueError): r = rebin(x, d=(10, 10)) with raises(ValueError): r = rebin(x, d=(70,)) with raises(ValueError): r = rebin(x, d=(30,)) x = np.array([[1.0, 2.0], [2.0, 3.0]]) rexpect = np.array([[1.0, 2.0], [1.5, 2.5], [2.0, 3.0], [2.0, 3.0]]) r = rebin(x, d=(4, 2)) assert np.allclose(r, rexpect) rexpect = np.array([[1.0, 1.5, 2.0, 2.0], [2.0, 2.5, 3.0, 3.0]]) r = rebin(x, d=(2, 4)) assert np.allclose(r, rexpect) rexpect = np.array([[1.0, 2.0], [1.0, 2.0], [2.0, 3.0], [2.0, 3.0]]) r = rebin(x, d=(4, 2), sample=True) assert np.allclose(r, rexpect) rexpect = np.array([[1.0, 1.0, 2.0, 2.0], [2.0, 2.0, 3.0, 3.0]]) r = rebin(x, d=(2, 4), sample=True) assert np.allclose(r, rexpect) x = np.arange(10) rexpect = np.array([0.0, 2.0, 4.0, 6.0, 8.0]) r = rebin(x, d=(5,), sample=True) assert np.allclose(r, rexpect) x = np.array([[1.0, 2.0, 3.0, 4.0], [2.0, 3.0, 4.0, 5.0], [3.0, 4.0, 5.0, 6.0], [4.0, 5.0, 6.0, 7.0]]) rexpect = np.array([[2.0, 4.0], [4.0, 6.0]]) r = rebin(x, d=(2, 2)) assert np.allclose(r, rexpect) rexpect = np.array([[1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 4.5], [3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 6.5]]) r = rebin(x, d=(2, 8)) assert np.allclose(r, rexpect) def test_smooth(self): test_data_file = get_pkg_data_filename('t/smooth_data.txt') noise = np.loadtxt(test_data_file, dtype='d') # # Test smooth function # x = 8.0*np.arange(100)/100.0 - 4.0 y = np.sin(x) + 0.1*noise s = smooth(y, 5) assert s.shape == (100,) s_edge = smooth(y, 5, True) assert s_edge.shape == (100,) s_w = smooth(y, 1) assert (s_w == y).all() def test_uniq(self): items = np.array([1, 2, 3, 1, 5, 6, 1, 7, 3, 2, 5, 9, 11, 1]) items_sorted = np.sort(items) items_argsorted = np.argsort(items) # # Test pre-sorted array. # u1 = uniq(items_sorted) assert (u1 == np.array([3, 5, 7, 9, 10, 11, 12, 13])).all() # # Test arg-sorted array. # u2 = uniq(items, items_argsorted) assert (u2 == np.array([13, 9, 8, 10, 5, 7, 11, 12])).all() assert (items_sorted[u1] == items[u2]).all() # # Test degenerate case of all identical items. # identical_items = np.ones((10,), dtype=items.dtype) u = uniq(identical_items) assert (u == np.array([9])).all() u = uniq(identical_items, np.arange(10, dtype=items.dtype)) assert (u == np.array([9])).all() pydl-0.7.0/pydl/__init__.py0000644000076500000240000000330313301650031016152 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ ==== pydl ==== Python replacements for functions that are part of the IDL_ built-in library, or part of astronomical IDL libraries. The emphasis is on reproducing results of the astronomical library functions. Only the bare minimum of IDL_ built-in functions are implemented to support this. .. _IDL: http://www.harrisgeospatial.com/SoftwareTechnology/IDL.aspx """ # Packages may add whatever they like to this file, but # should keep this content at the top. # ---------------------------------------------------------------------------- from ._astropy_init import * # ---------------------------------------------------------------------------- # Enforce Python version check during package import. 
# This is the same check as the one at the top of setup.py import sys class UnsupportedPythonError(Exception): pass if sys.version_info < tuple((int(val) for val in "2.7".split('.'))): raise UnsupportedPythonError("PyDL does not support Python < {}".format(2.7)) if not _ASTROPY_SETUP_: # For egg_info test builds to pass, put package imports here. from .file_lines import file_lines from .median import median from .pcomp import pcomp from .rebin import rebin from .smooth import smooth from .uniq import uniq # Workaround: Numpy 1.14.x changes the way arrays are printed. try: from numpy import set_printoptions set_printoptions(legacy='1.13') except Exception: pass class PydlException(Exception): """Base class for exceptions raised in PyDL functions. """ pass __all__ = ['file_lines', 'median', 'pcomp', 'rebin', 'smooth', 'uniq', 'PydlException'] pydl-0.7.0/pydl/uniq.py0000644000076500000240000000327413301650031015376 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- def uniq(x, index=None): """Replicates the IDL ``UNIQ()`` function. Returns the *subscripts* of the unique elements of an array. The elements must actually be *sorted* before being passed to this function. This can be done by sorting `x` explicitly or by passing the array subscripts that sort `x` as a second parameter. Parameters ---------- x : array-like Search this array for unique items. index : array-like, optional This array provides the array subscripts that sort `x`. Returns ------- array-like The subscripts of `x` that are the unique elements of `x`. Notes ----- Given a sorted array, and assuming that there is a set of adjacent identical items, ``uniq()`` will return the subscript of the *last* unique item. This charming feature is retained for reproducibility. References ---------- http://www.harrisgeospatial.com/docs/uniq.html Examples -------- >>> import numpy as np >>> from pydl import uniq >>> data = np.array([ 1, 2, 3, 1, 5, 6, 1, 7, 3, 2, 5, 9, 11, 1 ]) >>> print(uniq(np.sort(data))) [ 3 5 7 9 10 11 12 13] """ from numpy import array, roll if index is None: indicies = (x != roll(x, -1)).nonzero()[0] if indicies.size > 0: return indicies else: return array([x.size - 1, ]) else: q = x[index] indicies = (q != roll(q, -1)).nonzero()[0] if indicies.size > 0: return index[indicies] else: return array([q.size - 1, ], dtype=index.dtype) pydl-0.7.0/pydl/pydlutils/0000755000076500000240000000000013434104632016103 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlutils/misc.py0000644000076500000240000003071513434104050017410 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the misc directory in idlutils. """ from __future__ import print_function import numpy as np from . import PydlutilsException def decode_mixed(x): """Convert bytes in Numpy arrays into strings. Leave other stuff alone. Parameters ---------- x : object Input object. Returns ------- object If `x` has a ``decode()`` method, ``x.decode()`` will be returned. Otherwise `x` will be returned unchanged. """ try: return x.decode() except: return x def djs_laxisgen(dims, iaxis=0): """Returns an integer array where each element of the array is set equal to its index number along the specified axis. Parameters ---------- dims : :class:`list` Dimensions of the array to return. iaxis : :class:`int`, optional Index along this dimension. 
Returns ------- :class:`numpy.ndarray` An array of indexes with ``dtype=int32``. Raises ------ :exc:`ValueError` If `iaxis` is greater than or equal to the number of dimensions. Notes ----- For two or more dimensions, there is no difference between this routine and :func:`~pydl.pydlutils.misc.djs_laxisnum`. Examples -------- >>> from pydl.pydlutils.misc import djs_laxisgen >>> print(djs_laxisgen([4,4])) [[0 0 0 0] [1 1 1 1] [2 2 2 2] [3 3 3 3]] """ ndimen = len(dims) if ndimen == 1: return np.arange(dims[0], dtype='i4') return djs_laxisnum(dims, iaxis) def djs_laxisnum(dims, iaxis=0): """Returns an integer array where each element of the array is set equal to its index number in the specified axis. Parameters ---------- dims : :class:`list` Dimensions of the array to return. iaxis : :class:`int`, optional Index along this dimension. Returns ------- :class:`numpy.ndarray` An array of indexes with ``dtype=int32``. Raises ------ :exc:`ValueError` If `iaxis` is greater than or equal to the number of dimensions, or if number of dimensions is greater than three. Notes ----- For two or more dimensions, there is no difference between this routine and :func:`~pydl.pydlutils.misc.djs_laxisgen`. Examples -------- >>> from pydl.pydlutils.misc import djs_laxisnum >>> print(djs_laxisnum([4,4])) [[0 0 0 0] [1 1 1 1] [2 2 2 2] [3 3 3 3]] """ ndimen = len(dims) result = np.zeros(dims, dtype='i4') if ndimen == 1: pass elif ndimen == 2: if iaxis == 0: for k in range(dims[0]): result[k, :] = k elif iaxis == 1: for k in range(dims[1]): result[:, k] = k else: raise ValueError("Bad value for iaxis: {0:d}".format(iaxis)) elif ndimen == 3: if iaxis == 0: for k in range(dims[0]): result[k, :, :] = k elif iaxis == 1: for k in range(dims[1]): result[:, k, :] = k elif iaxis == 2: for k in range(dims[2]): result[:, :, k] = k else: raise ValueError("Bad value for iaxis: {0:d}".format(iaxis)) else: raise ValueError("{0:d} dimensions not supported.".format(ndimen)) return result def hogg_iau_name(ra, dec, prefix='SDSS', precision=1): """Properly format astronomical source names to the IAU convention. Parameters ---------- ra : :class:`float` or :class:`numpy.ndarray` Right ascencion in decimal degrees dec : :class:`float` or :class:`numpy.ndarray` Declination in decimal degrees. prefix : :class:`str`, optional Add this prefix to the string, defaults to 'SDSS'. precision : :class:`int`, optional Display this many digits of precision on seconds, default 1. Returns ------- :class:`str` or :class:`list` The IAU name for the coordinates. Examples -------- >>> from pydl.pydlutils.misc import hogg_iau_name >>> hogg_iau_name(354.120375,-0.544777778) 'SDSS J233628.89-003241.2' """ # # Promote scalar values to arrays. 
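    # Note that the sexagesimal pieces below are truncated (via np.floor)
    # rather than rounded, following the usual convention for IAU-style
    # coordinate names.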
# if isinstance(ra, float): ra = np.array([ra]) if isinstance(dec, float): dec = np.array([dec]) h = ra/15.0 rah = np.floor(h) ram = np.floor(60.0*(h-rah)) ras = 60.0*(60.0*(h-rah) - ram) ras = np.floor(ras*10.0**(precision+1))/10.0**(precision+1) rasformat = "{{2:0{0:d}.{1:d}f}}".format(precision+4, precision+1) rah = rah.astype(np.int32) ram = ram.astype(np.int32) desgn = np.array(list('+'*len(dec))) desgn[dec < 0] = '-' adec = np.absolute(dec) ded = np.floor(adec) dem = np.floor(60.0*(adec-ded)) des = 60.0*(60.0*(adec-ded) - dem) des = np.floor(des*10.0**precision)/10.0**precision desformat = "{{6:0{0:d}.{1:d}f}}".format(precision+3, precision) if precision == 0: desformat = "{6:02d}" des = des.astype(np.int32) ded = ded.astype(np.int32) dem = dem.astype(np.int32) adformat = "{{0:02d}}{{1:02d}}{ras}{{3:s}}{{4:02d}}{{5:02d}}{des}".format( ras=rasformat, des=desformat) adstr = [adformat.format(*x) for x in zip( rah, ram, ras, desgn, ded, dem, des)] if prefix == '': jstr = 'J' else: jstr = ' J' name = ["{0}{1}{2}".format(prefix, jstr, x) for x in adstr] if len(ra) == 1: return name[0] else: return name def hogg_iau_name_main(): # pragma: no cover from argparse import ArgumentParser parser = ArgumentParser(description='Properly format astronomical ' + 'source names to the IAU convention.') parser.add_argument('-P', '--precision', dest='precision', action='store', metavar='N', default=1, type=int, help='Digits of precision to add to the declination.') parser.add_argument('-p', '--prefix', dest='prefix', action='store', metavar='STR', default='SDSS', help='Add this prefix to the name.') parser.add_argument('ra', metavar='RA', type=float, help='Right Ascension.') parser.add_argument('dec', metavar='Dec', type=float, help='Declination.') options = parser.parse_args() print(hogg_iau_name(options.ra, options.dec, prefix=options.prefix, precision=options.precision)) return 0 def struct_print(array, filename=None, formatcodes=None, alias=None, fdigit=5, ddigit=7, html=False, no_head=False, silent=False): """Print a NumPy record array (analogous to an IDL structure) in a nice way. Parameters ---------- array : :class:`numpy.ndarray` A record array to print. filename : :class:`str` or file-like, optional If supplied, write to this file. formatcodes : :class:`dict`, optional If supplied, use explicit format for certain columns. alias : :class:`dict`, optional If supplied, use this mapping of record array column names to printed column names. fdigit : :class:`int`, optional Width of 32-bit floating point columns, default 5. ddigit : :class:`int`, optional Width of 64-bit floating point columns, default 7. html : :class:`bool`, optional If ``True``, print an html table. no_head : :class:`bool`, optional If ``True``, *don't* print a header line. silent : :class:`bool`, optional If ``True``, do not print the table, just return it. Returns ------- :func:`tuple` A tuple containing a list of the lines in the table. If `html` is ``True``, also returns a list of lines of CSS for formatting the table. 
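
    Notes
    -----
    If `filename` is a file-like object, it should be opened for *binary*
    writing, because the assembled table is encoded to UTF-8 bytes before
    it is written.
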
Examples -------- >>> import numpy as np >>> from pydl.pydlutils.misc import struct_print >>> struct_print(np.array([(1,2.34,'five'),(2,3.456,'seven'),(3,4.5678,'nine')],dtype=[('a','i4'),('bb','f4'),('ccc','S5')]),silent=True) (['a bb ccc ', '- ----------- -----', '1 2.34 five ', '2 3.456 seven', '3 4.5678 nine '], []) """ if html: headstart = '' headsep = '' headend = '' colstart = '' colsep = '' colend = '' css = [''] else: headstart = '' headsep = ' ' headend = '' colstart = '' colsep = ' ' colend = '' css = list() # # Alias should be a dictionary that maps structure names to column names # if alias is None: # # Create a dummy alias dictionary # alias = dict(list(zip(array.dtype.names, array.dtype.names))) else: # # Fill in any missing values of the alias dictionary # for tag in array.dtype.names: if tag not in alias: alias[tag] = tag # # Formatcodes allows an override for certain columns. # if formatcodes is None: formatcodes = dict() # # This dictionary will hold the number of characters in each column # nchar = dict() # # Construct format codes for each column # for k, tag in enumerate(array.dtype.names): if tag in formatcodes: thiscode = formatcodes[tag] thisn = len(thiscode.format(array[tag][0])) else: d = array.dtype.fields[tag][0] if d.kind == 'i' or d.kind == 'u': thisn = max(max(len(str(array[tag].min())), len(str(array[tag].max()))), len(tag)) thiscode = "{{{0:d}:{1:d}d}}".format(k, thisn) elif d.kind == 'f': if d.itemsize == 8: prec = ddigit else: prec = fdigit thisn = prec + 6 if array[tag].min() < 0: thisn += 1 thiscode = "{{{0:d}:{1:d}.{2:d}g}}".format(k, thisn, prec) elif d.kind == 'S' or d.kind == 'U': thisn = max(d.itemsize, len(tag)) thiscode = "{{{0:d}:{1:d}s}}".format(k, thisn) else: raise PydlutilsException( "Unsupported kind: {0}".format(d.kind)) formatcodes[tag] = thiscode nchar[tag] = thisn # # Start building an array of lines # lines = list() # # Construct header lines # if html: lines.append('') hdr1 = (headstart + headsep.join([alias[tag] for tag in array.dtype.names]) + headend) lines.append(hdr1) else: if not no_head: hdr1 = (headstart + headsep.join([("{{0:{0:d}s}}".format( nchar[tag])).format(alias[tag]) for tag in array.dtype.names]) + headend) hdr2 = (headstart + headsep.join(['-' * nchar[tag] for tag in array.dtype.names]) + headend) lines.append(hdr1) lines.append(hdr2) # # Create a format string for the data from the individual format codes # rowformat = (colstart + colsep.join([formatcodes[tag] for tag in array.dtype.names]) + colend) for k in range(array.size): lines.append(rowformat.format( *([decode_mixed(l) for l in array[k].tolist()]))) if html: lines.append('
') f = None # This variable will store a file handle close_file = False if filename is not None: if hasattr(filename, 'write'): f = filename else: f = open(filename, 'w+b') close_file = True if f is None: if not silent: # pragma: no cover print("\n".join(lines)+"\n") else: f.write(("\n".join(lines)+"\n").encode('utf-8')) if close_file: f.close() return (lines, css) pydl-0.7.0/pydl/pydlutils/trace.py0000644000076500000240000004072013434104050017550 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the trace directory in idlutils. """ import numpy as np from scipy.special import chebyt from astropy.io.fits.fitsrec import FITS_rec from . import PydlutilsException from .math import djs_reject from .misc import djs_laxisgen from ..goddard.math import flegendre def fchebyshev(x, m): """Compute the first `m` Chebyshev polynomials. Parameters ---------- x : array-like Compute the Chebyshev polynomials at these abscissa values. m : :class:`int` The number of Chebyshev polynomials to compute. For example, if :math:`m = 3`, :math:`T_0 (x)`, :math:`T_1 (x)` and :math:`T_2 (x)` will be computed. Returns ------- :class:`numpy.ndarray` """ if isinstance(x, np.ndarray): n = x.size else: n = 1 if m < 1: raise ValueError('Order of Chebyshev polynomial must be at least 1.') try: dt = x.dtype except AttributeError: dt = np.float64 leg = np.ones((m, n), dtype=dt) if m >= 2: leg[1, :] = x if m >= 3: for k in range(2, m): leg[k, :] = np.polyval(chebyt(k), x) return leg def fchebyshev_split(x, m): """Compute the first `m` Chebyshev polynomials, but modified to allow a split in the baseline at :math:`x=0`. The intent is to allow a model fit where a constant term is different for positive and negative `x`. Parameters ---------- x : array-like Compute the Chebyshev polynomials at these abscissa values. m : :class:`int` The number of Chebyshev polynomials to compute. For example, if :math:`m = 3`, :math:`T_0 (x)`, :math:`T_1 (x)` and :math:`T_2 (x)` will be computed. Returns ------- :class:`numpy.ndarray` """ if isinstance(x, np.ndarray): n = x.size else: n = 1 if m < 2: raise ValueError('Order of polynomial must be at least 2.') try: dt = x.dtype except AttributeError: dt = np.float64 leg = np.ones((m, n), dtype=dt) try: leg[0, :] = (x >= 0).astype(x.dtype) except AttributeError: leg[0, :] = np.double(x >= 0) if m > 2: leg[2, :] = x if m > 3: for k in range(3, m): leg[k, :] = 2.0 * x * leg[k-1, :] - leg[k-2, :] return leg def fpoly(x, m): """Compute the first `m` simple polynomials. Parameters ---------- x : array-like Compute the simple polynomials at these abscissa values. m : :class:`int` The number of simple polynomials to compute. For example, if :math:`m = 3`, :math:`x^0`, :math:`x^1` and :math:`x^2` will be computed. Returns ------- :class:`numpy.ndarray` """ if isinstance(x, np.ndarray): n = x.size else: n = 1 if m < 1: raise ValueError('Order of polynomial must be at least 1.') try: dt = x.dtype except AttributeError: dt = np.float64 leg = np.ones((m, n), dtype=dt) if m >= 2: leg[1, :] = x if m >= 3: for k in range(2, m): leg[k, :] = leg[k-1, :] * x return leg def func_fit(x, y, ncoeff, invvar=None, function_name='legendre', ia=None, inputans=None, inputfunc=None): """Fit `x`, `y` positions to a functional form. Parameters ---------- x : array-like X values (independent variable). y : array-like Y values (dependent variable). ncoeff : :class:`int` Number of coefficients to fit. 
invvar : array-like, optional Weight values; inverse variance. function_name : :class:`str`, optional Function name, default 'legendre'. ia : array-like, optional An array of bool of length `ncoeff` specifying free (``True``) and fixed (``False``) parameters. inputans : array-like, optional An array of values of length `ncoeff` specifying the values of the fixed parameters. inputfunc : array-like, optional Multiply the function fit by these values. Returns ------- :func:`tuple` of array-like Fit coefficients, length `ncoeff`; fitted values. Raises ------ :exc:`KeyError` If an invalid function type is selected. :exc:`ValueError` If input dimensions do not agree. """ if x.shape != y.shape: raise ValueError('Dimensions of X and Y do not agree!') if invvar is None: invvar = np.ones(x.shape, dtype=x.dtype) else: if invvar.shape != x.shape: raise ValueError('Dimensions of X and invvar do not agree!') if ia is None: ia = np.ones((ncoeff,), dtype=np.bool) if not ia.all(): if inputans is None: inputans = np.zeros((ncoeff,), dtype=x.dtype) # # Select unmasked points # igood = (invvar > 0).nonzero()[0] ngood = len(igood) res = np.zeros((ncoeff,), dtype=x.dtype) yfit = np.zeros(x.shape, dtype=x.dtype) if ngood == 0: pass elif ngood == 1: res[0] = y[igood[0]] yfit += y[igood[0]] else: ncfit = min(ngood, ncoeff) function_map = { 'legendre': flegendre, 'flegendre': flegendre, 'chebyshev': fchebyshev, 'fchebyshev': fchebyshev, 'chebyshev_split': fchebyshev_split, 'fchebyshev_split': fchebyshev_split, 'poly': fpoly, 'fpoly': fpoly } try: legarr = function_map[function_name](x, ncfit) except KeyError: raise KeyError('Unknown function type: {0}'.format(function_name)) if inputfunc is not None: if inputfunc.shape != x.shape: raise ValueError('Dimensions of X and inputfunc do not agree!') legarr *= np.tile(inputfunc, ncfit).reshape(ncfit, x.shape[0]) yfix = np.zeros(x.shape, dtype=x.dtype) nonfix = ia[0:ncfit].nonzero()[0] nparams = len(nonfix) fixed = (~ia[0:ncfit]).nonzero()[0] if len(fixed) > 0: yfix = np.dot(legarr.T, inputans * (1 - ia)) ysub = y - yfix finalarr = legarr[nonfix, :] else: finalarr = legarr ysub = y # extra2 = finalarr * np.outer(np.ones((nparams,), dtype=x.dtype), # (invvar > 0)) extra2 = finalarr * np.outer(np.ones((nparams,), dtype=x.dtype), invvar) alpha = np.dot(finalarr, extra2.T) # assert alpha.dtype == x.dtype if nparams > 1: # beta = np.dot(ysub * (invvar > 0), finalarr.T) beta = np.dot(ysub * invvar, finalarr.T) assert beta.dtype == x.dtype # uu,ww,vv = np.linalg.svd(alpha, full_matrices=False) res[nonfix] = np.linalg.solve(alpha, beta) else: # res[nonfix] = (ysub * (invvar > 0) * finalarr).sum()/alpha res[nonfix] = (ysub * invvar * finalarr).sum()/alpha if len(fixed) > 0: res[fixed] = inputans[fixed] yfit = np.dot(legarr.T, res[0:ncfit]) return (res, yfit) class TraceSet(object): """Implements the idea of a trace set. Attributes ---------- func : :class:`str` Name of function type used to fit the trace set. xmin : float-like Minimum x value. xmax : float-like Maximum x value. coeff : array-like Coefficients of the trace set fit. nTrace : :class:`int` Number of traces in the object. ncoeff : :class:`int` Number of coefficients of the trace set fit. xjumplo : float-like Jump value, for BOSS readouts. xjumphi : float-like Jump value, for BOSS readouts. xjumpval : float-like Jump value, for BOSS readouts. outmask : array-like When initialized with x,y positions, this contains the rejected points. 
yfit : array-like When initialized with x,y positions, this contains the fitted y values. """ _func_map = {'poly': fpoly, 'legendre': flegendre, 'chebyshev': fchebyshev} def __init__(self, *args, **kwargs): """This class can be initialized either with a set of xy positions, or with a trace set HDU from a FITS file. """ if len(args) == 1 and isinstance(args[0], FITS_rec): # # Initialize with FITS data # self.func = args[0]['FUNC'][0] self.xmin = args[0]['XMIN'][0] self.xmax = args[0]['XMAX'][0] self.coeff = args[0]['COEFF'][0] self.nTrace = self.coeff.shape[0] self.ncoeff = self.coeff.shape[1] if 'XJUMPLO' in args[0].dtype.names: self.xjumplo = args[0]['XJUMPLO'][0] self.xjumphi = args[0]['XJUMPHI'][0] self.xjumpval = args[0]['XJUMPVAL'][0] else: self.xjumplo = None self.xjumphi = None self.xjumpval = None self.outmask = None self.yfit = None elif len(args) == 2: # # Initialize with x, y positions. # xpos = args[0] ypos = args[1] self.nTrace = xpos.shape[0] if 'invvar' in kwargs: invvar = kwargs['invvar'] else: invvar = np.ones(xpos.shape, dtype=xpos.dtype) if 'func' in kwargs: self.func = kwargs['func'] else: self.func = 'legendre' if 'ncoeff' in kwargs: self.ncoeff = int(kwargs['ncoeff']) else: self.ncoeff = 3 if 'xmin' in kwargs: self.xmin = np.float64(kwargs['xmin']) else: self.xmin = xpos.min() if 'xmax' in kwargs: self.xmax = np.float64(kwargs['xmax']) else: self.xmax = xpos.max() if 'maxiter' in kwargs: maxiter = int(kwargs['maxiter']) else: maxiter = 10 if 'inmask' in kwargs: inmask = kwargs['inmask'] else: inmask = np.ones(xpos.shape, dtype=np.bool) do_jump = False if 'xjumplo' in kwargs: do_jump = True self.xjumplo = np.float64(kwargs['xjumplo']) else: self.xjumplo = None if 'xjumphi' in kwargs: self.xjumphi = np.float64(kwargs['xjumphi']) else: self.xjumphi = None if 'xjumpval' in kwargs: self.xjumpval = np.float64(kwargs['xjumpval']) else: self.xjumpval = None self.coeff = np.zeros((self.nTrace, self.ncoeff), dtype=xpos.dtype) self.outmask = np.zeros(xpos.shape, dtype=np.bool) self.yfit = np.zeros(xpos.shape, dtype=xpos.dtype) for iTrace in range(self.nTrace): xvec = self.xnorm(xpos[iTrace, :], do_jump) iIter = 0 qdone = False tempivar = (invvar[iTrace, :] * inmask[iTrace, :].astype(invvar.dtype)) thismask = tempivar > 0 while (not qdone) and (iIter <= maxiter): res, ycurfit = func_fit(xvec, ypos[iTrace, :], self.ncoeff, invvar=tempivar, function_name=self.func) thismask, qdone = djs_reject(ypos[iTrace, :], ycurfit, invvar=tempivar) iIter += 1 self.yfit[iTrace, :] = ycurfit self.coeff[iTrace, :] = res self.outmask[iTrace, :] = thismask else: raise PydlutilsException("Wrong number of arguments to TraceSet!") def xy(self, xpos=None, ignore_jump=False): """Convert from a trace set to an array of x,y positions. Parameters ---------- xpos : array-like, optional If provided, evaluate the trace set at these positions. Otherwise the positions will be constructed from the trace set object iself. ignore_jump : :class:`bool`, optional If ``True``, ignore any jump information in the `tset` object Returns ------- :func:`tuple` of array-like The x, y positions. 
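
        Notes
        -----
        A minimal sketch of evaluating a trace set, assuming ``xpos`` and
        ``ypos`` are ``(nTrace, nx)`` arrays of measured positions (these
        names are illustrative, not part of the API)::

            tset = TraceSet(xpos, ypos, ncoeff=4)
            xgrid, yfit = tset.xy()          # evaluate on the default grid
            _, ycustom = tset.xy(xpos=xpos)  # or at caller-supplied positions

        Both returned arrays have the same shape as the evaluation grid.
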
""" do_jump = self.has_jump and (not ignore_jump) if xpos is None: xpos = djs_laxisgen([self.nTrace, self.nx], iaxis=1) + self.xmin ypos = np.zeros(xpos.shape, dtype=xpos.dtype) for iTrace in range(self.nTrace): xvec = self.xnorm(xpos[iTrace, :], do_jump) legarr = self._func_map[self.func](xvec, self.ncoeff) ypos[iTrace, :] = np.dot(legarr.T, self.coeff[iTrace, :]) return (xpos, ypos) @property def has_jump(self): """``True`` if jump conditions are set. """ return self.xjumplo is not None @property def xRange(self): """Range of x values. """ return self.xmax - self.xmin @property def nx(self): """Number of x values. """ return int(self.xRange + 1) @property def xmid(self): """Midpoint of x values. """ return 0.5 * (self.xmin + self.xmax) def xnorm(self, xinput, jump): """Convert input x coordinates to normalized coordinates suitable for input to special polynomials. Parameters ---------- xinput : array-like Input coordinates. jump : :class:`bool` Set to ``True`` if there is a jump. Returns ------- array-like Normalized coordinates. """ if jump: # Vector specifying what fraction of the jump has passed: jfrac = np.minimum(np.maximum(((xinput - self.xjumplo) / (self.xjumphi - self.xjumplo)), 0.), 1.) # Conversion to "natural" x baseline: xnatural = xinput + jfrac * self.xjumpval else: xnatural = xinput return 2.0 * (xnatural - self.xmid)/self.xRange def traceset2xy(tset, xpos=None, ignore_jump=False): """Convert from a trace set to an array of x,y positions. Parameters ---------- tset : :class:`TraceSet` A :class:`TraceSet` object. xpos : array-like, optional If provided, evaluate the trace set at these positions. Otherwise the positions will be constructed from the trace set object iself. ignore_jump : bool, optional If ``True``, ignore any jump information in the `tset` object Returns ------- :func:`tuple` of array-like The x, y positions. """ return tset.xy(xpos, ignore_jump) def xy2traceset(xpos, ypos, **kwargs): """Convert from x,y positions to a trace set. Parameters ---------- xpos, ypos : array-like X,Y positions corresponding as [nx,Ntrace] arrays. invvar : array-like, optional Inverse variances for fitting. func : :class:`str`, optional Function type for fitting; defaults to 'legendre'. ncoeff : :class:`int`, optional Number of coefficients to fit. Defaults to 3. xmin, xmax : :class:`float`, optional Explicitly set minimum and maximum values, instead of computing them from `xpos`. maxiter : :class:`int`, optional Maximum number of rejection iterations; set to 0 for no rejection; default to 10. inmask : array-like, optional Mask set to 1 for good points and 0 for rejected points; same dimensions as `xpos`, `ypos`. Points rejected by `inmask` are always rejected from the fits (the rejection is "sticky"), and will also be marked as rejected in the outmask attribute. ia, inputans, inputfunc : array-like, optional These arguments will be passed to :func:`func_fit`. xjumplo : :class:`float`, optional x position locating start of an x discontinuity xjumphi : :class:`float`, optional x position locating end of that x discontinuity xjumpval : :class:`float`, optional magnitude of the discontinuity "jump" between those bounds (previous 3 keywords motivated by BOSS 2-phase readout) Returns ------- :class:`TraceSet` A :class:`TraceSet` object. 
""" return TraceSet(xpos, ypos, **kwargs) pydl-0.7.0/pydl/pydlutils/bspline.py0000644000076500000240000005672313434104050020120 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the bspline directory in idlutils. """ from warnings import warn import numpy as np from numpy.linalg.linalg import LinAlgError from scipy.linalg import cholesky_banded, cho_solve_banded from . import PydlutilsUserWarning from .math import djs_reject from .trace import fchebyshev from .. import uniq from ..goddard.math import flegendre class bspline(object): """B-spline class. Functions in the idlutils bspline library are implemented as methods on this class. Parameters ---------- x : :class:`numpy.ndarray` The data. nord : :class:`int`, optional The order of the B-spline. Default is 4, which is cubic. npoly : :class:`int`, optional Polynomial order to fit over 2nd variable, if supplied. If not supplied the order is 1. bkpt : :class:`numpy.ndarray`, optional To be documented. bkspread : :class:`float`, optional To be documented. Attributes ---------- breakpoints To be documented. nord To be documented. npoly To be documented. mask To be documented. coeff To be documented. icoeff To be documented. xmin To be documented. xmax To be documented. funcname To be documented. """ def __init__(self, x, nord=4, npoly=1, bkpt=None, bkspread=1.0, **kwargs): """Init creates an object whose attributes are similar to the structure returned by the ``create_bsplineset()`` function. """ # # Set the breakpoints. # if bkpt is None: startx = x.min() rangex = x.max() - startx if 'placed' in kwargs: w = ((kwargs['placed'] >= startx) & (kwargs['placed'] <= startx+rangex)) if w.sum() < 2: bkpt = np.arange(2, dtype='f') * rangex + startx else: bkpt = kwargs['placed'][w] elif 'bkspace' in kwargs: nbkpts = int(rangex/kwargs['bkspace']) + 1 if nbkpts < 2: nbkpts = 2 tempbkspace = rangex/float(nbkpts-1) bkpt = np.arange(nbkpts, dtype='f')*tempbkspace + startx elif 'nbkpts' in kwargs: nbkpts = kwargs['nbkpts'] if nbkpts < 2: nbkpts = 2 tempbkspace = rangex/float(nbkpts-1) bkpt = np.arange(nbkpts, dtype='f') * tempbkspace + startx elif 'everyn' in kwargs: npkpts = max(nx/kwargs['everyn'], 1) if nbkpts == 1: xspot = [0] else: xspot = int(nx/(nbkpts-1)) * np.arange(nbkpts, dtype='i4') bkpt = x[xspot].astype('f') else: raise ValueError('No information for bkpts.') imin = bkpt.argmin() imax = bkpt.argmax() if x.min() < bkpt[imin]: warn('Lowest breakpoint does not cover lowest x value: changing.', PydlutilsUserWarning) bkpt[imin] = x.min() if x.max() > bkpt[imax]: warn('Highest breakpoint does not cover highest x value: changing.', PydlutilsUserWarning) bkpt[imax] = x.max() nshortbkpt = bkpt.size fullbkpt = bkpt.copy() if nshortbkpt == 1: bkspace = np.float32(bkspread) else: bkspace = (bkpt[1] - bkpt[0]) * np.float32(bkspread) for i in np.arange(1, nord, dtype=np.float32): fullbkpt = np.insert(fullbkpt, 0, bkpt[0]-bkspace*i) fullbkpt = np.insert(fullbkpt, fullbkpt.shape[0], bkpt[nshortbkpt-1] + bkspace*i) # # Set the attributes # nc = fullbkpt.size - nord self.breakpoints = fullbkpt self.nord = nord self.npoly = npoly self.mask = np.ones((fullbkpt.size,), dtype='bool') if npoly > 1: self.coeff = np.zeros((npoly, nc), dtype='d') self.icoeff = np.zeros((npoly, nc), dtype='d') else: self.coeff = np.zeros((nc,), dtype='d') self.icoeff = np.zeros((nc,), dtype='d') self.xmin = 0.0 self.xmax = 1.0 self.funcname = 'legendre' return def fit(self, xdata, ydata, 
invvar, x2=None): """Calculate a B-spline in the least-squares sense. Fit is based on two variables: `xdata` which is sorted and spans a large range where breakpoints are required `ydata` which can be described with a low order polynomial. Parameters ---------- xdata : :class:`numpy.ndarray` Independent variable. ydata : :class:`numpy.ndarray` Dependent variable. invvar : :class:`numpy.ndarray` Inverse variance of `ydata`. x2 : :class:`numpy.ndarray`, optional Orthogonal dependent variable for 2d fits. Returns ------- :func:`tuple` A tuple containing an integer error code, and the evaluation of the b-spline at the input values. An error code of -2 is a failure, -1 indicates dropped breakpoints, 0 is success, and positive integers indicate ill-conditioned breakpoints. """ goodbk = self.mask[self.nord:] nn = goodbk.sum() if nn < self.nord: yfit = np.zeros(ydata.shape, dtype='f') return (-2, yfit) nfull = nn * self.npoly bw = self.npoly * self.nord a1, lower, upper = self.action(xdata, x2=x2) foo = np.tile(invvar, bw).reshape(bw, invvar.size).transpose() a2 = a1 * foo alpha = np.zeros((bw, nfull+bw), dtype='d') beta = np.zeros((nfull+bw,), dtype='d') bi = np.arange(bw, dtype='i4') bo = np.arange(bw, dtype='i4') for k in range(1, bw): bi = np.append(bi, np.arange(bw-k, dtype='i4')+(bw+1)*k) bo = np.append(bo, np.arange(bw-k, dtype='i4')+bw*k) for k in range(nn-self.nord+1): itop = k*self.npoly ibottom = min(itop, nfull) + bw - 1 ict = upper[k] - lower[k] + 1 if ict > 0: work = np.dot(a1[lower[k]:upper[k]+1, :].T, a2[lower[k]:upper[k]+1, :]) wb = np.dot(ydata[lower[k]:upper[k]+1], a2[lower[k]:upper[k]+1, :]) alpha.T.flat[bo+itop*bw] += work.flat[bi] beta[itop:ibottom+1] += wb min_influence = 1.0e-10 * invvar.sum() / nfull errb = cholesky_band(alpha, mininf=min_influence) if isinstance(errb[0], int) and errb[0] == -1: a = errb[1] else: yfit, foo = self.value(xdata, x2=x2, action=a1, upper=upper, lower=lower) return (self.maskpoints(errb[0]), yfit) sol = cholesky_solve(a, beta) if self.npoly > 1: self.icoeff[:, goodbk] = np.array(a[0, 0:nfull].reshape(self.npoly, nn), dtype=a.dtype) self.coeff[:, goodbk] = np.array(sol[0:nfull].reshape(self.npoly, nn), dtype=sol.dtype) else: self.icoeff[goodbk] = np.array(a[0, 0:nfull], dtype=a.dtype) self.coeff[goodbk] = np.array(sol[0:nfull], dtype=sol.dtype) yfit, foo = self.value(xdata, x2=x2, action=a1, upper=upper, lower=lower) return (0, yfit) def action(self, x, x2=None): """Construct banded B-spline matrix, with dimensions [ndata, bandwidth]. Parameters ---------- x : :class:`numpy.ndarray` Independent variable. x2 : :class:`numpy.ndarray`, optional Orthogonal dependent variable for 2d fits. Returns ------- :func:`tuple` A tuple containing the B-spline action matrix; the 'lower' parameter, a list of pixel positions, each corresponding to the first occurence of position greater than breakpoint indx; and 'upper', Same as lower, but denotes the upper pixel positions. 
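
        Notes
        -----
        A rough sketch of how the pieces are used together, assuming
        ``sset`` is a ``bspline`` object and ``x`` is sorted (illustrative
        names only)::

            a, lower, upper = sset.action(x)
            # Each row of a holds the non-zero basis values for the
            # corresponding element of x; rows lower[k] through upper[k]
            # belong to breakpoint segment k.
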
""" nx = x.size nbkpt = self.mask.sum() if nbkpt < 2*self.nord: return (-2, 0, 0) n = nbkpt - self.nord gb = self.breakpoints[self.mask] bw = self.npoly*self.nord lower = np.zeros((n - self.nord + 1,), dtype='i4') upper = np.zeros((n - self.nord + 1,), dtype='i4') - 1 indx = self.intrv(x) bf1 = self.bsplvn(x, indx) action = bf1 aa = uniq(indx, np.arange(indx.size, dtype='i4')) upper[indx[aa]-self.nord+1] = aa rindx = indx[::-1] bb = uniq(rindx, np.arange(rindx.size, dtype='i4')) lower[rindx[bb]-self.nord+1] = nx - bb - 1 if x2 is not None: if x2.size != nx: raise ValueError('Dimensions of x and x2 do not match.') x2norm = 2.0 * (x2 - self.xmin) / (self.xmax - self.xmin) - 1.0 if self.funcname == 'poly': temppoly = np.ones((nx, self.npoly), dtype='f') for i in range(1, self.npoly): temppoly[:, i] = temppoly[:, i-1] * x2norm elif self.funcname == 'poly1': temppoly = np.tile(x2norm, self.npoly).reshape(nx, self.npoly) for i in range(1, self.npoly): temppoly[:, i] = temppoly[:, i-1] * x2norm elif self.funcname == 'chebyshev': temppoly = fchebyshev(x2norm, self.npoly) elif self.funcname == 'legendre': temppoly = flegendre(x2norm, self.npoly) else: raise ValueError('Unknown value of funcname.') action = np.zeros((nx, bw), dtype='d') counter = -1 for ii in range(self.nord): for jj in range(self.npoly): counter += 1 action[:, counter] = bf1[:, ii]*temppoly[:, jj] return (action, lower, upper) def intrv(self, x): """Find the segment between breakpoints which contain each value in the array `x`. The minimum breakpoint is ``nbkptord - 1``, and the maximum is ``nbkpt - nbkptord - 1``. Parameters ---------- x : :class:`numpy.ndarray` Data values, assumed to be monotonically increasing. Returns ------- :class:`numpy.ndarray` Position of array elements with respect to breakpoints. """ gb = self.breakpoints[self.mask] n = gb.size - self.nord indx = np.zeros((x.size,), dtype='i4') ileft = self.nord - 1 for i in range(x.size): while x[i] > gb[ileft+1] and ileft < n - 1: ileft += 1 indx[i] = ileft return indx def bsplvn(self, x, ileft): """Calculates the value of all possibly nonzero B-splines at `x` of a certain order. Parameters ---------- x : :class:`numpy.ndarray` Independent variable. ileft : :class:`int` Breakpoint segements that contain `x`. Returns ------- :class:`numpy.ndarray` B-spline values. """ bkpt = self.breakpoints[self.mask] vnikx = np.zeros((x.size, self.nord), dtype=x.dtype) deltap = vnikx.copy() deltam = vnikx.copy() j = 0 vnikx[:, 0] = 1.0 while j < self.nord - 1: ipj = ileft+j+1 deltap[:, j] = bkpt[ipj] - x imj = ileft-j deltam[:, j] = x - bkpt[imj] vmprev = 0.0 for l in range(j+1): vm = vnikx[:, l]/(deltap[:, l] + deltam[:, j-l]) vnikx[:, l] = vm*deltap[:, l] + vmprev vmprev = vm*deltam[:, j-l] j += 1 vnikx[:, j] = vmprev return vnikx def value(self, x, x2=None, action=None, lower=None, upper=None): """Evaluate a B-spline at specified values. Parameters ---------- x : :class:`numpy.ndarray` Independent variable. x2 : :class:`numpy.ndarray`, optional Orthogonal dependent variable for 2d fits. action : :class:`numpy.ndarray`, optional Action matrix to use. If not supplied it is calculated. lower : :class:`numpy.ndarray`, optional If the action parameter is supplied, this parameter must also be supplied. upper : :class:`numpy.ndarray`, optional If the action parameter is supplied, this parameter must also be supplied. Returns ------- :func:`tuple` A tuple containing the results of the bspline evaluation and a mask indicating where the evaluation was good. 
""" xsort = x.argsort() xwork = x[xsort] if x2 is not None: x2work = x2[xsort] else: x2work = None if action is not None: if lower is None or upper is None: raise ValueError('Must specify lower and upper if action is set.') else: action, lower, upper = self.action(xwork, x2=x2work) yfit = np.zeros(x.shape, dtype=x.dtype) bw = self.npoly * self.nord spot = np.arange(bw, dtype='i4') goodbk = self.mask.nonzero()[0] coeffbk = self.mask[self.nord:].nonzero()[0] n = self.mask.sum() - self.nord if self.npoly > 1: goodcoeff = self.coeff[:, coeffbk] else: goodcoeff = self.coeff[coeffbk] # maskthis = np.zeros(xwork.shape,dtype=xwork.dtype) for i in range(n-self.nord+1): ict = upper[i] - lower[i] + 1 if ict > 0: yfit[lower[i]:upper[i]+1] = np.dot( action[lower[i]:upper[i]+1, :], goodcoeff[i*self.npoly+spot]) yy = yfit.copy() yy[xsort] = yfit mask = np.ones(x.shape, dtype='bool') gb = self.breakpoints[goodbk] outside = ((x < gb[self.nord-1]) | (x > gb[n])) if outside.any(): mask[outside] = False hmm = ((np.diff(goodbk) > 2).nonzero())[0] for jj in range(hmm.size): inside = ((x >= self.breakpoints[goodbk[hmm[jj]]]) & (x <= self.breakpoints[goodbk[hmm[jj]+1]-1])) if inside.any(): mask[inside] = False return (yy, mask) def maskpoints(self, err): """Perform simple logic of which breakpoints to mask. Parameters ---------- err : :class:`numpy.ndarray` The list of indexes returned by the cholesky routines. Returns ------- :class:`int` An integer indicating the results of the masking. -1 indicates that the error points were successfully masked. -2 indicates failure; the calculation should be aborted. Notes ----- The mask attribute is modified, assuming it is possible to create the mask. """ nbkpt = self.mask.sum() if nbkpt <= 2*self.nord: return -2 hmm = err[np.unique(err/self.npoly)]/self.npoly n = nbkpt - self.nord if np.any(hmm >= n): return -2 test = np.zeros(nbkpt, dtype='bool') for jj in range(-np.ceil(nord/2.0), nord/2.0): foo = np.where((hmm+jj) > 0, hmm+jj, np.zeros(hmm.shape, dtype=hmm.dtype)) inside = np.where((foo+nord) < n-1, foo+nord, np.zeros(hmm.shape, dtype=hmm.dtype)+n-1) test[inside] = True if test.any(): reality = self.mask[test] if self.mask[reality].any(): self.mask[reality] = False return -1 else: return -2 else: return -2 def cholesky_band(l, mininf=0.0): """Compute *lower* Cholesky decomposition of a banded matrix. This function provides informative error messages to pass back to the :class:`~pydl.pydlutils.bspline.bspline` machinery; the actual computation is delegated to :func:`scipy.linalg.cholesky_banded`. Parameters ---------- l : :class:`numpy.ndarray` A matrix on which to perform the Cholesky decomposition. The matrix must be in a special, *lower* form described in :func:`scipy.linalg.cholesky_banded`. In addition, the input must be padded. If the original, square matrix has size :math:`N \\times N`, and the width of the band is :math:`b`, `l` must be :math:`b \\times (N + b)`. mininf : :class:`float`, optional Entries in the `l` matrix are considered negative if they are less than this value (default 0.0). Returns ------- :func:`tuple` If problems were detected, the first item will be the index or indexes where the problem was detected, and the second item will simply be the input matrix. If no problems were detected, the first item will be -1, and the second item will be the Cholesky decomposition. 
""" bw, nn = l.shape n = nn - bw negative = l[0, 0:n] <= mininf if negative.any() or not np.all(np.isfinite(l)): warn('Bad entries: ' + str(negative.nonzero()[0]), PydlutilsUserWarning) return (negative.nonzero()[0], l) try: lower = cholesky_banded(l[:, 0:n], lower=True) except LinAlgError: # # Figure out where the error is. # lower = l.copy() kn = bw - 1 spot = np.arange(kn, dtype='i4') + 1 for j in range(n): lower[0, j] = np.sqrt(lower[0, j]) lower[spot, j] /= lower[0, j] x = lower[spot, j] if not np.all(np.isfinite(x)): warn('NaN found in cholesky_band.', PydlutilsUserWarning) return (j, l) # # Restore padding. # L = np.zeros(l.shape, dtype=l.dtype) L[:, 0:n] = lower return (-1, L) def cholesky_solve(a, bb): """Solve the equation :math:`A x = b` where `a` is a *lower* Cholesky-banded matrix. In the :class:`~pydl.pydlutils.bspline.bspline` machinery, `a` needs to be padded. This function should only used with the output of :func:`~pydl.pydlutils.bspline.cholesky_band`, to ensure the proper padding on `a`. Otherwise the computation is delegated to :func:`scipy.linalg.cho_solve_banded`. Parameters ---------- a : :class:`numpy.ndarray` *Lower* Cholesky decomposition of :math:`A` in :math:`A x = b`. bb : :class:`numpy.ndarray` :math:`b` in :math:`A x = b`. Returns ------- :class:`numpy.ndarray` The solution, padded to be the same shape as `bb`. """ bw = a.shape[0] n = bb.shape[0] - bw x = np.zeros(bb.shape, dtype=bb.dtype) x[0:n] = cho_solve_banded((a[:, 0:n], True), bb[0:n]) return x def iterfit(xdata, ydata, invvar=None, upper=5, lower=5, x2=None, maxiter=10, **kwargs): """Iteratively fit a B-spline set to data, with rejection. Parameters ---------- xdata : :class:`numpy.ndarray` Independent variable. ydata : :class:`numpy.ndarray` Dependent variable. invvar : :class:`numpy.ndarray`, optional Inverse variance of `ydata`. If not set, it will be calculated based on the standard deviation. upper : :class:`int` or :class:`float`, optional Upper rejection threshold in units of sigma, defaults to 5 sigma. lower : :class:`int` or :class:`float`, optional Lower rejection threshold in units of sigma, defaults to 5 sigma. x2 : :class:`numpy.ndarray`, optional Orthogonal dependent variable for 2d fits. maxiter : :class:`int`, optional Maximum number of rejection iterations, default 10. Set this to zero to disable rejection. Returns ------- :func:`tuple` A tuple containing the fitted bspline object and an output mask. 
""" nx = xdata.size if ydata.size != nx: raise ValueError('Dimensions of xdata and ydata do not agree.') if invvar is not None: if invvar.size != nx: raise ValueError('Dimensions of xdata and invvar do not agree.') else: # # This correction to the variance makes it the same # as IDL's variance() # var = ydata.var()*(float(nx)/float(nx-1)) if var == 0: var = 1.0 invvar = np.ones(ydata.shape, dtype=ydata.dtype)/var if x2 is not None: if x2.size != nx: raise ValueError('Dimensions of xdata and x2 do not agree.') yfit = np.zeros(ydata.shape, dtype=ydata.dtype) if invvar.size == 1: outmask = True else: outmask = np.ones(invvar.shape, dtype='bool') xsort = xdata.argsort() maskwork = (outmask & (invvar > 0))[xsort] if 'oldset' in kwargs: sset = kwargs['oldset'] sset.mask = True sset.coeff = 0 else: if not maskwork.any(): raise ValueError('No valid data points.') if 'fullbkpt' in kwargs: fullbkpt = kwargs['fullbkpt'] else: sset = bspline(xdata[xsort[maskwork]], **kwargs) if maskwork.sum() < sset.nord: warn('Number of good data points fewer than nord.', PydlutilsUserWarning) return (sset, outmask) if x2 is not None: if 'xmin' in kwargs: xmin = kwargs['xmin'] else: xmin = x2.min() if 'xmax' in kwargs: xmax = kwargs['xmax'] else: xmax = x2.max() if xmin == xmax: xmax = xmin + 1 sset.xmin = xmin sset.xmax = xmax if 'funcname' in kwargs: sset.funcname = kwargs['funcname'] xwork = xdata[xsort] ywork = ydata[xsort] invwork = invvar[xsort] if x2 is not None: x2work = x2[xsort] else: x2work = None iiter = 0 error = 0 qdone = -1 while (error != 0 or qdone == -1) and iiter <= maxiter: goodbk = sset.mask.nonzero()[0] if maskwork.sum() <= 1 or not sset.mask.any(): sset.coeff = 0 iiter = maxiter + 1 else: if 'requiren' in kwargs: i = 0 while xwork[i] < sset.breakpoints[goodbk[sset.nord]] and i < nx-1: i += 1 ct = 0 for ileft in range(sset.nord, sset.mask.sum()-sset.nord+1): while (xwork[i] >= sset.breakpoints[goodbk[ileft]] and xwork[i] < sset.breakpoints[goodbk[ileft+1]] and i < nx-1): ct += invwork[i]*maskwork[i] > 0 i += 1 if ct >= kwargs['requiren']: ct = 0 else: sset.mask[goodbk[ileft]] = False error, yfit = sset.fit(xwork, ywork, invwork*maskwork, x2=x2work) iiter += 1 inmask = maskwork if error == -2: return (sset, outmask) elif error == 0: maskwork, qdone = djs_reject(ywork, yfit, invvar=invwork, inmask=inmask, outmask=maskwork, upper=upper, lower=lower) else: pass outmask[xsort] = maskwork temp = yfit yfit[xsort] = temp return (sset, outmask) pydl-0.7.0/pydl/pydlutils/cooling.py0000644000076500000240000000403113434104050020077 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the cooling directory in idlutils. """ from numpy import interp from astropy.io import ascii from astropy.utils.data import get_pkg_data_contents def read_ds_cooling(fname, logT=None): """Read in the `Sutherland & Dopita (1993) `_ cooling function. Parameters ---------- fname : { 'm-00.cie', 'm-05.cie', 'm+05.cie', 'm-10.cie', 'm-15.cie', 'm-20.cie', 'm-30.cie', 'mzero.cie' } Name of the data file to read. logT : :class:`numpy.ndarray`, optional If provided, values will be interpolated to the provided values. If not provided, the values in the data files will be returned. Returns ------- :func:`tuple` A tuple containing `logT` and `loglambda`, respectively. Raises ------ :exc:`ValueError` If the input file name is invalid. 
Notes ----- The data have been retrieved from http://www.mso.anu.edu.au/~ralph/data/cool/ and stored in the package. Examples -------- >>> from pydl.pydlutils.cooling import read_ds_cooling >>> logT, loglambda = read_ds_cooling('m-15.cie') >>> logT[0:5] # doctest: +NORMALIZE_WHITESPACE array([ 4. , 4.05, 4.1 , 4.15, 4.2 ]) >>> loglambda[0:5] # doctest: +NORMALIZE_WHITESPACE array([-26. , -24.66, -23.52, -22.62, -22.11]) """ if fname not in ('m-00.cie', 'm-05.cie', 'm+05.cie', 'm-10.cie', 'm-15.cie', 'm-20.cie', 'm-30.cie', 'mzero.cie'): raise ValueError('Invalid value for data file: {0}'.format(fname)) coolingfile = get_pkg_data_contents('data/cooling/' + fname) data = ascii.read(coolingfile.split('\n')[2:], delimiter='\t') if logT is None: return (data['log(T)'].data, data['log(lambda net)'].data) else: loglambda = interp(logT, data['log(T)'].data, data['log(lambda net)'].data) return (logT, loglambda) pydl-0.7.0/pydl/pydlutils/yanny.py0000644000076500000240000013645113434104050017617 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the yanny directory in idlutils. This is a Python library for reading & writing yanny files. :class:`yanny` is an object-oriented interface to SDSS Parameter files (a.k.a. "FTCL" or "yanny" files) following these specifications_. Parameter files typically have and can be recognized by the extension ``.par``. These files may also be recognized if the first line of the file is:: #%yanny This is not part of the standard specification, but it suffices to identify, *e.g.*, files that have not yet been written to disk, but only exist as file objects. Because Parameter files can contain multiple tables, as well as metadata, there is no simple, one-to-one correspondence between these files and, say, an astropy :class:`~astropy.table.Table` object. Thus, :class:`yanny` objects are implemented as a subclass of :class:`~collections.OrderedDict` (to remember the order of keyword-value pairs), and the actual data are values accessed by keyword. Still, it is certainly possible to *write* a table-like object to a yanny file. Given the caveats above, we have introduced *experimental* support for reading and writing yanny files directly to/from :class:`~astropy.table.Table` objects. Because of the experimental nature of this support, it must be activated "by hand" by including this snippet in your code:: from astropy.table import Table from astropy.io.registry import (register_identifier, register_reader, register_writer) from pydl.pydlutils.yanny import (is_yanny, read_table_yanny, write_table_yanny) register_identifier('yanny', Table, is_yanny) register_reader('yanny', Table, read_table_yanny) register_writer('yanny', Table, write_table_yanny) Currently multidimensional arrays are only supported for type ``char``, and a close reading of the specifications indicates that multidimensional arrays were only ever intended to be supported for type ``char``. So no multidimensional arrays, sorry. .. _specifications: https://www.sdss.org/dr14/software/par/ """ import re import os import datetime import warnings from collections import OrderedDict import six import numpy as np from astropy.table import Table # from astropy.io.registry import register_identifier, register_writer from . import PydlutilsException, PydlutilsUserWarning class yanny(OrderedDict): """An object interface to a yanny file. Create a yanny object using a yanny file, `filename`. 
If the file exists, it is read, & the dict structure of the object will be basically the same as that returned by ``read_yanny()`` in the efftickle package. If the file does not exist, or if no filename is given, a blank structure is returned. Other methods allow for subsequent writing to the file. Parameters ---------- filename : :class:`str` or file-like, optional The name of a yanny file or a file-like object representing a yanny file. raw : :class:`bool`, optional If ``True``, data in a yanny file will *not* be converted into astropy :class:`~astropy.table.Table` objects, but will instead be retained as raw Python lists. Attributes ---------- raw : :class:`bool` If ``True``, data in a yanny file will *not* be converted into astropy :class:`~astropy.table.Table` objects, but will instead be retained as raw Python lists. filename : :class:`str` The name of a yanny parameter file. If a file-like object was used to initialize the object, this will have the value 'in_memory.par'. _symbols : :class:`dict` A dictionary containing the metadata describing the tables. _contents : :class:`str` The complete contents of a yanny parameter file. _struct_type_caches : :class:`dict` A dictionary of dictionaries, one dictionary for every structure definition in a yanny parameter file. Contains the types of each column _struct_isarray_caches : :class:`dict` A dictionary of dictionaries, one dictionary for every structure definition in a yanny parameter file. Contains a boolean value for every column. _enum_cache : :class:`dict` Initially ``None``, this attribute is initialized the first time the :meth:`isenum` method is called. The keyword is the name of the enum type, the value is a list of the possible values of that type. """ @staticmethod def get_token(string): """Removes the first 'word' from string. If the 'word' is enclosed in double quotes, it returns the contents of the double quotes. If the 'word' is enclosed in braces, it returns the contents of the braces, but does not attempt to split the array. If the 'word' is the last word of the string, remainder is set equal to the empty string. This is basically a wrapper on some convenient regular expressions. Parameters ---------- string : :class:`str` A string containing words. Returns ------- :func:`tuple` A tuple containing the first word and the remainder of the string. Examples -------- >>> from pydl.pydlutils.yanny import yanny >>> yanny.get_token("The quick brown fox") ('The', 'quick brown fox') """ if string[0] == '"': (word, remainder) = re.search(r'^"([^"]*)"\s*(.*)', string).groups() elif string[0] == '{': (word, remainder) = re.search(r'^\{\s*([^}]*)\s*\}\s*(.*)', string).groups() else: try: (word, remainder) = re.split(r'\s+', string, 1) except ValueError: (word, remainder) = (string, '') # if remainder is None: # remainder = '' return (word, remainder) @staticmethod def protect(x): """Used to appropriately quote string that might contain whitespace. This method is mostly for internal use by the yanny object. Parameters ---------- x : :class:`str` The data to protect. Returns ------- :class:`str` The data with white space protected by quotes. 
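#
# Editor's illustrative sketch: get_token() can be applied repeatedly to
# tokenize a whole data line, honoring quotes and braces.  The input line
# below is hypothetical.
#
from pydl.pydlutils.yanny import yanny

line = 'mystruct 1234 "two words" {a b c}'
tokens = []
while line:
    token, line = yanny.get_token(line)
    tokens.append(token)
# tokens == ['mystruct', '1234', 'two words', 'a b c']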
Examples -------- >>> from pydl.pydlutils.yanny import yanny >>> yanny.protect('This string contains whitespace.') '"This string contains whitespace."' >>> yanny.protect('This string contains a #hashtag.') '"This string contains a #hashtag."' """ if isinstance(x, np.bytes_): s = x.decode() else: s = str(x) if len(s) == 0 or s.find('#') >= 0 or re.search(r'\s+', s) is not None: return '"' + s + '"' else: return s @staticmethod def trailing_comment(line): """Identify a trailing comment and strip it. This routine works on the theory that a properly quoted comment mark will be surrounted by an odd number of double quotes, & we can skip to searching for the last one in the line. Parameters ---------- line : :class:`str` A line from a yanny file potentially containing trailing comments. Returns ------- :class:`str` The line with any trailing comment and any residual white space trimmed off. Notes ----- This may fail in certain pathological cases, for example if a real trailing comment contains a single double-quote:: # a 'pathological" trailing comment or if someone is over-enthusiastically commenting:: # # # # # I like # characters. Examples -------- >>> from pydl.pydlutils.yanny import yanny >>> yanny.trailing_comment('mystruct 1234 "#hashtag" # a comment.') 'mystruct 1234 "#hashtag"' >>> yanny.trailing_comment('mystruct 1234 "#hashtag" # a "comment".') 'mystruct 1234 "#hashtag"' """ lastmark = line.rfind('#') if lastmark >= 0: # # Count the number of double quotes in the remainder of the line # if (len([c for c in line[lastmark:] if c == '"']) % 2) == 0: # # Even number of quotes # return line[0:lastmark].rstrip() return line @staticmethod def dtype_to_struct(dt, structname='mystruct', enums=None): """Convert a NumPy dtype object describing a record array to a typedef struct statement. The second argument is the name of the structure. If any of the columns are enum types, enums must be a dictionary with the keys the column names, and the values are a tuple containing the name of the enum type as the first item and a tuple or list of possible values as the second item. Parameters ---------- dt : :class:`numpy.dtype` The dtype of a NumPy record array. structname : :class:`str`, optional The name to give the structure in the yanny file. Defaults to 'MYSTRUCT'. enums : :class:`dict`, optional A dictionary containing enum information. See details above. Returns ------- :class:`dict` A dictionary suitable for setting the 'symbols' dictionary of a new yanny object. 
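#
# Editor's illustrative sketch: dtype_to_struct() turns a NumPy dtype into
# the 'symbols' metadata needed by a new yanny object.  The dtype below is
# hypothetical.
#
import numpy as np
from pydl.pydlutils.yanny import yanny

dt = np.dtype([('mag', 'f4', (5,)), ('name', 'S20')])
symbols = yanny.dtype_to_struct(dt, structname='mytable')
# symbols['MYTABLE'] lists the column names and symbols['struct'][0] holds
# the corresponding "typedef struct { ... } MYTABLE;" definition.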
Examples -------- """ dtmap = {'i2': 'short', 'i4': 'int', 'i8': 'long', 'f4': 'float', 'f8': 'double'} returnenums = list() if enums is None: enums = dict() else: for e in enums: lines = list() lines.append('typedef enum {') for n in enums[e][1]: lines.append(" {0},".format(n)) lines[-1] = lines[-1].strip(',') lines.append('}} {0};'.format(enums[e][0].upper())) returnenums.append("\n".join(lines)) # lines.append('') lines = list() lines.append('typedef struct {') for c in dt.names: if dt[c].kind == 'V': t = dt[c].subdtype[0].str[1:] l = dt[c].subdtype[1][0] s = dt[c].subdtype[0].itemsize else: t = dt[c].str[1:] l = 0 s = dt[c].itemsize line = ' ' if t[0] in 'SU': if c in enums: line += enums[c][0].upper() else: line += 'char' else: line += dtmap[t] line += ' {0}'.format(c) if l > 0: line += "[{0:d}]".format(l) if t[0] in 'SU' and c not in enums: line += "[{0:d}]".format(s) line += ';' lines.append(line) lines.append('}} {0};'.format(structname.upper())) return {structname.upper(): list(dt.names), 'enum': returnenums, 'struct': ["\n".join(lines)]} def __init__(self, filename=None, raw=False): """Create a yanny object using a yanny file. """ super(yanny, self).__init__() # # The symbol hash is inherited from the old read_yanny # self._symbols = dict() # # Create special attributes that contain the internal status of the # object. This should prevent overlap with keywords in the data files. # self.filename = '' self._contents = '' # # Since the re is expensive, cache the structure types keyed by # the field. Create a dictionary for each structure found. # self._struct_type_caches = dict() self._struct_isarray_caches = dict() self._enum_cache = None # # Optionally convert numeric data into NumPy arrays # self.raw = raw # # If the file exists, read it # if filename is not None: # # Handle file-like objects # if isinstance(filename, six.string_types): if os.access(filename, os.R_OK): self.filename = filename with open(filename, 'r') as f: self._contents = f.read() else: # # Assume file-like # self.filename = 'in_memory.par' contents = filename.read() if 'b' in filename.mode: contents = contents.decode('ascii') self._contents = contents self._parse() return def __str__(self): """Implement the ``str()`` function for yanny objects. Simply prints the current contents of the yanny file. """ return self._contents __repr__ = __str__ def __eq__(self, other): """Test two yanny objects for equality. Two yanny objects are assumed to be equal if their contents are equal. """ if isinstance(other, yanny): return self._contents == other._contents return NotImplemented def __ne__(self, other): """Test two yanny objects for inequality. Two yanny objects are assumed to be unequal if their contents are unequal. """ if isinstance(other, yanny): return self._contents != other._contents return NotImplemented def __bool__(self): """Give a yanny object a definite truth value. A yanny object is considered ``True`` if its contents are non-zero. """ return len(self._contents) > 0 # `__nonzero__` is needed for Python 2. # Python 3 uses `__bool__`. # http://stackoverflow.com/a/2233850/498873 __nonzero__ = __bool__ def type(self, structure, variable): """Returns the type of a variable defined in a structure. Returns ``None`` if the structure or the variable is undefined. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column whose type you want. Returns ------- :class:`str` The type of the variable. 
""" if structure not in self: return None if variable not in self.columns(structure): return None # # Added code to cache values to speed up parsing large files. # 2009.05.11 / Demitri Muna, NYU # Find (or create) the cache for this structure. # try: cache = self._struct_type_caches[structure] except KeyError: self._struct_type_caches[structure] = dict() # cache for one struct type cache = self._struct_type_caches[structure] # # Lookup (or create) the value for this variable # try: var_type = cache[variable] except KeyError: defl = [x for x in self._symbols['struct'] if x.find(structure.lower()) > 0] defu = [x for x in self._symbols['struct'] if x.find(structure.upper()) > 0] if len(defl) != 1 and len(defu) != 1: return None elif len(defl) == 1: definition = defl else: definition = defu typere = re.compile( r'(\S+)\s+{0}([\[<].*[\]>]|);'.format(variable)) (typ, array) = typere.search(definition[0]).groups() var_type = typ + array.replace('<', '[').replace('>', ']') cache[variable] = var_type return var_type def basetype(self, structure, variable): """Returns the bare type of a variable, stripping off any array information. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column whose type you want. Returns ------- :class:`str` The type of the variable, stripped of array information. """ typ = self.type(structure, variable) try: return typ[0:typ.index('[')] except ValueError: return typ def isarray(self, structure, variable): """Returns ``True`` if the variable is an array type. For character types, this means a two-dimensional array, *e.g.*: ``char[5][20]``. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column to check for array type. Returns ------- :class:`bool` ``True`` if the variable is an array. """ try: cache = self._struct_isarray_caches[structure] except KeyError: self._struct_isarray_caches[structure] = dict() cache = self._struct_isarray_caches[structure] try: result = cache[variable] except KeyError: typ = self.type(structure, variable) character_array = re.compile(r'char[\[<]\d*[\]>][\[<]\d*[\]>]') if ((character_array.search(typ) is not None) or (typ.find('char') < 0 and (typ.find('[') >= 0 or typ.find('<') >= 0))): cache[variable] = True else: cache[variable] = False result = cache[variable] return result def isenum(self, structure, variable): """Returns true if a variable is an enum type. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column to check for enum type. Returns ------- :class:`bool` ``True`` if the variable is enum type. """ if self._enum_cache is None: self._enum_cache = dict() if 'enum' in self._symbols: for e in self._symbols['enum']: m = re.search(r'typedef\s+enum\s*\{([^}]+)\}\s*(\w+)\s*;', e).groups() self._enum_cache[m[1]] = re.split(r',\s*', m[0].strip()) else: return False return self.basetype(structure, variable) in self._enum_cache def array_length(self, structure, variable): """Returns the length of an array type or 1 if the variable is not an array. For character types, this is the length of a two-dimensional array, *e.g.*, ``char[5][20]`` has length 5. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column to check for array length. 
Returns ------- :class:`int` The length of the array variable """ if self.isarray(structure, variable): typ = self.type(structure, variable) return int(typ[typ.index('[')+1:typ.index(']')]) else: return 1 def char_length(self, structure, variable): """Returns the length of a character field. *e.g.* ``char[5][20]`` is an array of 5 strings of length 20. Returns ``None`` if the variable is not a character type. If the length is not specified, *i.e.* ``char[]``, it returns the length of the largest string. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column to check for char length. Returns ------- :class:`int` or None The length of the char variable. """ typ = self.type(structure, variable) if typ.find('char') < 0: return None try: return int(typ[typ.rfind('[')+1:typ.rfind(']')]) except ValueError: if self.isarray(structure, variable): return max([max([len(x) for x in r]) for r in self[structure][variable]]) else: return max([len(x) for x in self[structure][variable]]) def dtype(self, structure): """Returns a NumPy dtype object suitable for describing a table as a record array. Treats enums as string, which is what the IDL reader does. Parameters ---------- structure : :class:`str` The name of the structure. Returns ------- :class:`numpy.dtype` A dtype object suitable for describing the yanny structure as a record array. """ dt = list() dtmap = {'short': 'i2', 'int': 'i4', 'long': 'i8', 'float': 'f', 'double': 'd'} for c in self.columns(structure): typ = self.basetype(structure, c) if typ == 'char': d = "S{0:d}".format(self.char_length(structure, c)) elif self.isenum(structure, c): d = "S{0:d}".format(max([len(x) for x in self._enum_cache[typ]])) else: d = dtmap[typ] if self.isarray(structure, c): dt.append((str(c), str(d), (self.array_length(structure, c),))) else: dt.append((str(c), str(d))) dt = np.dtype(dt) return dt def convert(self, structure, variable, value): """Converts value into the appropriate (Python) type. * ``short`` & ``int`` are converted to Python :class:`int`. * ``long`` is converted to Python :class:`long`. * ``float`` & ``double`` are converted to Python :class:`float`. * Other types are not altered. There may be further conversions into NumPy types, but this is the first stage. Parameters ---------- structure : :class:`str` The name of the structure that contains `variable`. variable : :class:`str` The name of the column undergoing conversion. value : :class:`str` The value contained in a particular row of `variable`. Returns ------- :class:`int`, :class:`long`, :class:`float` or :class:`str` `value` converted to a Python numerical type. """ if six.PY3: intTypes = set(['short', 'int', 'long']) longTypes = set() mylong = int else: intTypes = set(['short', 'int']) longTypes = set(['long']) mylong = long floatTypes = set(['float', 'double']) typ = self.basetype(structure, variable) if typ in intTypes: if self.isarray(structure, variable): return [int(v) for v in value] else: return int(value) if typ in longTypes: if self.isarray(structure, variable): return [mylong(v) for v in value] else: return mylong(value) if typ in floatTypes: if self.isarray(structure, variable): return [float(v) for v in value] else: return float(value) return value def tables(self): """Returns a list of all the defined structures. This is just the list of keys of the object with the 'internal' keys removed. 
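#
# Editor's illustrative sketch: dtype() assembles a NumPy dtype for one
# structure so that it can be viewed as a record array.  The file and
# structure names here are hypothetical.
#
from pydl.pydlutils.yanny import yanny

par = yanny('mydata.par')
dt = par.dtype('MYTABLE')
# For a structure declared as "float mag[5]; char name[20];" this is
# equivalent to numpy.dtype([('mag', 'f', (5,)), ('name', 'S20')]).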
""" foo = list() for k in self._symbols.keys(): if k not in ('struct', 'enum'): foo.append(k) return foo def columns(self, table): """Returns an ordered list of column names associated with a particular table. The order is the same order as they are defined in the yanny file. Parameters ---------- table : :class:`str` The table whose columns are desired. Returns ------- :class:`list` The list of column names. """ foo = list() if table in self._symbols: return self._symbols[table] return foo def size(self, table): """Returns the number of rows in a table. Parameters ---------- table : :class:`str` The table whose size desired. Returns ------- :class:`int` The number of rows in `table`. """ foo = self.columns(table) return len(self[table][foo[0]]) def pairs(self): """Returns a list of keys to keyword/value pairs. Equivalent to doing ``self.keys()``, but with all the data tables & other control structures stripped out. """ p = list() foo = self.tables() for k in self.keys(): if k not in foo: p.append(k) return p def row(self, table, index): """Returns a list containing a single row from a specified table in column order. If index is out of range, it returns an empty list. If the yanny object instance is set up for NumPy record arrays, then a single row can be obtained with:: row0 = par['TABLE'][0] Parameters ---------- table : :class:`str` The table whose row is desired. index : :class:`int` The number of the row to return. Returns ------- :class:`list` A row from `table`. """ datarow = list() if table in self and index >= 0 and index < self.size(table): for c in self.columns(table): datarow.append(self[table][c][index]) return datarow def list_of_dicts(self, table): """Construct a list of dictionaries. Takes a table from the yanny object and constructs a list object containing one row per entry. Each item in the list is a dictionary keyed by the struct value names. If the yanny object instance is set up for NumPy record arrays, then the same functionality can be obtained with:: foo = par['TABLE'][0]['column'] Parameters ---------- table : :class:`str` The table to convert Returns ------- :class:`list` A list containing the rows of `table` converted to :class:`dict`. """ return_list = list() d = dict() # I'm assuming these are in order... struct_fields = self.columns(table) for i in range(self.size(table)): one_row = self.row(table, i) # one row as a list j = 0 for key in struct_fields: d[key] = one_row[j] j = j + 1 return_list.append(dict(d)) # append a new dict (copy of d) return return_list def new_dict_from_pairs(self): """Returns a new dictionary of keyword/value pairs. The new dictionary (*i.e.*, not a yanny object) contains the keys that :meth:`pairs` returns. There are two reasons this is convenient: * the key 'symbols' that is part of the yanny object will not be present * a simple yanny file can be read with no further processing Returns ------- :class:`~collections.OrderedDict` A dictionary of the keyword-value pairs that remembers the order in which they were defined in the file. 
Examples -------- Read a yanny file and return only the pairs:: >>> from os.path import dirname >>> from pydl.pydlutils.yanny import yanny >>> new_dict = yanny(dirname(__file__)+'/tests/t/test.par').new_dict_from_pairs() >>> new_dict['mjd'] '54579' >>> new_dict['alpha'] 'beta gamma delta' added: Demitri Muna, NYU 2009-04-28 """ new_dictionary = OrderedDict() for key in self.pairs(): new_dictionary[key] = self[key] return new_dictionary def write(self, newfile=None, comments=None): """Write a yanny object to a file. This assumes that the filename used to create the object was not that of a pre-existing file. If a file of the same name is detected, this method will *not* attempt to overwrite it, but will print a warning. This also assumes that the special 'symbols' key has been properly created. This will not necessarily make the file very human-readable, especially if the data lines are long. If the name of a new file is given, it will write to the new file (assuming it doesn't exist). If the writing is successful, the data in the object will be updated. Parameters ---------- newfile : :class:`str`, optional The name of the file to write. comments : :class:`str` or :class:`list` of :class:`str`, optional Comments that will be placed at the head of the file. If a single string is passed, it will be written out verbatim, although a '#' character will be added if it does not already have one. If a list of strings is passed, comment characters will be added and the strings will be joined together. """ if newfile is None: if len(self.filename) > 0: newfile = self.filename else: raise ValueError("No filename specified!") if os.access(newfile, os.F_OK): raise PydlutilsException( "{0} exists, aborting write!".format(newfile)) if comments is None: basefile = os.path.basename(newfile) timestamp = datetime.datetime.utcnow().strftime( '%Y-%m-%d %H:%M:%S UTC') comments = "#\n# {0}\n#\n# Created by pydl.pydlutils.yanny.yanny\n#\n# {1}\n#\n".format(basefile, timestamp) else: if isinstance(comments, six.string_types): if not comments.startswith('#'): comments = '# ' + comments if not comments.endswith('\n'): comments += '\n' else: comments = ("\n".join(["# {0}".format(c) for c in comments]) + "\n") contents = "#%yanny\n" + comments # # Print any key/value pairs # for key in self.pairs(): contents += "{0} {1}\n".format(key, self[key]) # # Print out enum definitions # if len(self._symbols['enum']) > 0: contents += "\n" + "\n\n".join(self._symbols['enum']) + "\n" # # Print out structure definitions # if len(self._symbols['struct']) > 0: contents += "\n" + "\n\n".join(self._symbols['struct']) + "\n" contents += "\n" # # Print out the data tables # for sym in self.tables(): columns = self.columns(sym) for k in range(self.size(sym)): line = list() line.append(sym) for col in columns: if self.isarray(sym, col): datum = ('{' + ' '.join([self.protect(x) for x in self[sym][col][k]]) + '}') else: datum = self.protect(self[sym][col][k]) line.append(datum) contents += "{0}\n".format(' '.join(line)) # # Actually write the data to file # with open(newfile, 'w') as f: f.write(contents) self._contents = contents self.filename = newfile self._parse() return def append(self, datatable): """Appends data to an existing FTCL/yanny file. Tries as much as possible to preserve the ordering & format of the original file. The datatable should adhere to the format of the yanny object, but it is not necessary to reproduce the 'symbols' dictionary. It will not try to append data to a file that does not exist. 
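#
# Editor's illustrative sketch: append() takes a plain dict whose keys are
# table names (and/or new keyword-value pairs) and whose values hold the
# new rows.  The file, table and column names are hypothetical.
#
from pydl.pydlutils.yanny import yanny

par = yanny('mydata.par')   # the file must already exist on disk
new_rows = {'MYTABLE': {'mag': [[1.0, 2.0, 3.0, 4.0, 5.0]],
                        'name': ['extra_row']}}
par.append(new_rows)        # appends one row and re-parses the object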
If the append is successful, the data in the object will be updated. Parameters ---------- datatable : :class:`dict` The data to append. """ if len(self.filename) == 0: raise ValueError("No filename is set for this object. " + "Use the filename attribute to set the filename!") if not isinstance(datatable, dict): raise ValueError("Data to append is not of the correct type. " + "Use a dict!") timestamp = datetime.datetime.utcnow().strftime( '%Y-%m-%d %H:%M:%S UTC') contents = '' # # Print any key/value pairs # for key in datatable.keys(): if key.upper() in self.tables() or key == 'symbols': continue contents += "{0} {1}\n".format(key, datatable[key]) # # Print out the data tables # for sym in self.tables(): if sym.lower() in datatable: datasym = sym.lower() else: datasym = sym if datasym in datatable: columns = self.columns(sym) for k in range(len(datatable[datasym][columns[0]])): line = list() line.append(sym) for col in columns: if self.isarray(sym, col): datum = ('{' + ' '.join([self.protect(x) for x in datatable[datasym][col][k]]) + '}') else: datum = self.protect(datatable[datasym][col][k]) line.append(datum) contents += "{0}\n".format(' '.join(line)) # # Actually write the data to file # if len(contents) > 0: contents = ("# Appended by yanny.py at {0}.\n".format(timestamp) + contents) if os.access(self.filename, os.W_OK): with open(self.filename, 'a') as f: f.write(contents) self._contents += contents self._parse() else: raise PydlutilsException(self.filename + " does not exist, aborting append!") else: warnings.warn("Nothing to be appended!", PydlutilsUserWarning) return def _parse(self): """Converts text into tables that users can use. This method is for use internally by the yanny object. It is not meant to be called by users. Parsing proceeds in this order: #. Lines that end with a backslash character ``\`` are reattached to following lines. #. Structure & enum definitions are identified, saved into the 'symbols' dictionary & stripped from the contents. #. Structure definitions are interpreted. #. At this point, the remaining lines of the original file can only contain these things: * 'blank' lines, including lines that only contain comments * keyword/value pairs * structure rows #. The remaining lines are scanned sequentially. #. 'Blank' lines are identified & ignored. #. Whitespace & comments are stripped from non-blank lines. #. Empty double braces ``{{}}`` are converted into empty double quotes ``""``. #. If the first word on a line matches the name of a structure, the line is broken up into tokens & each token or set of tokens (for arrays) is converted to the appropriate Python type. #. If the first word on a line does not match the name of a structure, it must be a keyword, so this line is interpreted as a keyword/value pair. No further processing is done to the value. #. At the conclusion of parsing, if ``self.raw`` is ``False``, the structures are converted into NumPy record arrays. """ # # there are five things we might find # 1. 'blank' lines including comments # 2. keyword/value pairs (which may have trailing comments) # 3. enumeration definitions # 4. structure definitions # 5. 
data # lines = self._contents # # Reattach lines ending with \ # lines = re.sub(r'\\\s*\n', ' ', lines) # # Find structure & enumeration definitions & strip them out # self._symbols['struct'] = re.findall( r'typedef\s+struct\s*\{[^}]+\}\s*\w+\s*;', lines) self._symbols['enum'] = re.findall( r'typedef\s+enum\s*\{[^}]+\}\s*\w+\s*;', lines) lines = re.sub(r'typedef\s+struct\s*\{[^}]+\}\s*\w+\s*;', '', lines) lines = re.sub(r'typedef\s+enum\s*\{[^}]+\}\s*\w+\s*;', '', lines) # # Interpret the structure definitions # typedefre = re.compile(r'typedef\s+struct\s*\{([^}]+)\}\s*(\w*)\s*;') for typedef in self._symbols['struct']: typedefm = typedefre.search(typedef) (definition, name) = typedefm.groups() self[name.upper()] = dict() self._symbols[name.upper()] = list() definitions = re.findall(r'\S+\s+\S+;', definition) for d in definitions: d = d.replace(';', '') (datatype, column) = re.split(r'\s+', d) column = re.sub(r'[\[<].*[\]>]$', '', column) self._symbols[name.upper()].append(column) self[name.upper()][column] = list() # Remove lines containing only comments comments = re.compile(r'^\s*#') # Remove lines containing only whitespace blanks = re.compile(r'^\s*$') # # Remove trailing comments, but not if they are enclosed in quotes. # # trailing_comments = re.compile(r'\s*\#.*$') # trailing_comments = re.compile(r'\s*\#[^"]+$') # # Double empty braces get replaced with empty quotes # double_braces = re.compile(r'\{\s*\{\s*\}\s*\}') if len(lines) > 0: for line in lines.split('\n'): if len(line) == 0: continue if comments.search(line) is not None: continue if blanks.search(line) is not None: continue # # Remove leading & trailing blanks & comments # line = line.strip() line = self.trailing_comment(line) # line = trailing_comments.sub('',line) line = double_braces.sub('""', line) # # Now if the first word on the line does not match a # structure definition it is a keyword/value pair # (key, value) = self.get_token(line) uckey = key.upper() if uckey in self._symbols: # # Structure data # for column in self._symbols[uckey]: if len(value) > 0 and blanks.search(value) is None: (data, value) = self.get_token(value) if self.isarray(uckey, column): # # An array value # if it's character data, it won't be # delimited by {} unless it is a multidimensional # string array. It may or may not be delimited # by double quotes # # Note, we're assuming here that the only # multidimensional arrays are string arrays # arraydata = list() while len(data) > 0: (token, data) = self.get_token(data) arraydata.append(token) self[uckey][column].append( self.convert(uckey, column, arraydata)) else: # # A single value # self[uckey][column].append( self.convert(uckey, column, data)) else: break else: # # Keyword/value pair # self[key] = value # # If self.raw is False, convert tables into NumPy record arrays # if not self.raw: for t in self.tables(): record = np.zeros((self.size(t),), dtype=self.dtype(t)) for c in self.columns(t): record[c] = self[t][c] self[t] = record.view(np.recarray) return def write_ndarray_to_yanny(filename, datatables, structnames=None, enums=None, hdr=None, comments=None): """Converts a NumPy record array into a new FTCL/yanny file. Returns a new yanny object corresponding to the file. Parameters ---------- filename : :class:`str` The name of a parameter file. datatables : :class:`numpy.ndarray`, :class:`numpy.recarray` or :class:`list` of these. A NumPy record array containing data that can be copied into a `yanny` object. 
structnames : :class:`str` or :class:`list` of :class:`str`, optional The name(s) to give the structure(s) in the yanny file. Defaults to 'MYSTRUCT0'. enums : :class:`dict`, optional A dictionary containing enum information. See the documentation for the :meth:`~pydl.pydlutils.yanny.yanny.dtype_to_struct` method of the yanny object. hdr : :class:`dict`, optional A dictionary containing keyword/value pairs for the 'header' of the yanny file. comments : :class:`str` or :class:`list` of :class:`str`, optional A string containing comments that will be added to the start of the new file. Returns ------- :class:`yanny` The `yanny` object resulting from writing the file. Raises ------ PydlutilsException If `filename` already exists, or if the metadata are incorrect. """ par = yanny(filename) if par: # # If the file already exists # raise PydlutilsException( "Apparently {0} already exists.".format(filename)) if isinstance(datatables, (np.ndarray, np.recarray, Table)): datatables = (datatables,) if structnames is None: structnames = ["MYSTRUCT{0:d}".format(k) for k in range(len(datatables))] if isinstance(structnames, six.string_types): structnames = (structnames,) if len(datatables) != len(structnames): raise PydlutilsException( "The data tables and their names do not match!") for k in range(len(datatables)): struct = par.dtype_to_struct(datatables[k].dtype, structname=structnames[k], enums=enums) par._symbols['struct'] += struct['struct'] par._symbols[structnames[k].upper()] = struct[structnames[k].upper()] if enums is not None and len(par._symbols['enum']) == 0: par._symbols['enum'] = struct['enum'] par[structnames[k].upper()] = datatables[k] if hdr is not None: for key in hdr: par[key] = hdr[key] par.write(filename, comments=comments) return par def is_yanny(origin, path, fileobj, *args, **kwargs): """Identifies Yanny files or objects. This function is for use with :func:`~astropy.io.registry.register_identifier`. Parameters ---------- origin : :class:`str` 'read' or 'write' path : :class:`str` Path to the file. fileobj : file object Open file object, if available. Returns ------- :class:`bool` ``True`` if the file or object is a Yanny file. """ if fileobj is not None: loc = fileobj.tell() fileobj.seek(0) try: signature = fileobj.read(7) finally: fileobj.seek(loc) return signature == b'#%yanny' elif path is not None: return path.endswith('.par') return isinstance(args[0], yanny) def read_table_yanny(filename, tablename=None): """Read a yanny file into a :class:`~astropy.table.Table`. Because yanny files can contain multiple tables, it is necessary to specify the table name to return. However, all "headers" (keyword-value pairs) will be included in the Table metadata. This function is for use with :func:`~astropy.io.registry.register_reader`. Parameters ---------- filename : :class:`str` Name of the file to read. tablename : :class:`str` The name of the table to read from the file. Returns ------- :class:`~astropy.table.Table` The table read from the file. Raises ------ :exc:`PydlutilsException` If `tablename` is not set. :exc:`KeyError` If `tablename` does not exist in the file. """ if tablename is None: raise PydlutilsException("The name of the table is required!") # # When opened by Table.read(), the filename will actually be a file-like # object opened in *binary* mode. 
# par = yanny(filename) try: t0 = par[tablename.upper()] except KeyError: raise KeyError("No table named {0} in {1}!".format(tablename, filename)) t = Table(t0) t.meta = par.new_dict_from_pairs() return t def write_table_yanny(table, filename, tablename=None, overwrite=False): """Write a :class:`~astropy.table.Table` to a yanny file. If `table` has any metadata, it will be written to the file as well. This function is for use with :func:`~astropy.io.registry.register_writer`. Parameters ---------- table : :class:`astropy.table.Table` The object to be written. filename : :class:`str` Name of the file to write to. tablename : :class:`str`, optional Name to give `table` within the file. overwrite : :class:`bool`, optional If ``True``, any existing file will be silently overwritten. """ if overwrite and os.path.exists(filename): os.remove(filename) if table.meta: hdr = table.meta else: hdr = None write_ndarray_to_yanny(filename, table, structnames=tablename, hdr=hdr, comments='Table') pydl-0.7.0/pydl/pydlutils/mangle.py0000644000076500000240000006576013434104050017730 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the mangle directory in idlutils. Mangle_ [1]_ is a software suite that supports the concept of polygons on a sphere, and is used to, for example, describe the `window function`_ of the Sloan Digital Sky Survey. This implementation is intended to support the portions of Mangle that are included in idlutils. To simplify the coding somewhat, unlike idlutils, the caps information is accessed through ``polygon.x`` and ``polygon.cm``, not ``polygon.caps.x`` or ``polygon.caps.cm``. Window function operations are already supported by :func:`~is_in_polygon` and :func:`~is_in_window`. However, calculation of polygon area (solid angle) from first principles is not yet implemented. Note that in traditional geometry "spherical polygon" means a figure bounded by *great circles*. Mangle allows polygons to be bounded by *any* circle, great or not. Use care when comparing formulas in this module to formulas in the mathematical literature. .. _Mangle: http://space.mit.edu/~molly/mangle/ .. _`window function`: https://www.sdss.org/dr14/algorithms/resolve/ References ---------- .. [1] `Swanson, M. E. C.; Tegmark, Max; Hamilton, Andrew J. S.; Hill, J. Colin, 2008 MNRAS 387, 1391 `_. """ import re import warnings import numpy as np import six from astropy.io import fits # import astropy.utils as au from . import PydlutilsException, PydlutilsUserWarning class PolygonList(list): """A :class:`list` that contains :class:`ManglePolygon` objects and possibly some metadata. Parameters ---------- header : :class:`list`, optional Set the `header` attribute. Attributes ---------- header : :class:`list` A list of strings containing metadata. """ def __init__(self, *args, **kwargs): super(PolygonList, self).__init__(*args) if 'header' in kwargs: self.header = kwargs['header'] else: self.header = list() return class ManglePolygon(object): """Simple object to represent a polygon. A polygon may be instantiated with a row (:class:`~astropy.io.fits.fitsrec.FITS_record`) from a :class:`FITS_polygon` object, another :class:`ManglePolygon` object (copy constructor), keyword arguments, or with no arguments at all, in which case it represents the whole sky. Attributes ---------- cm : :class:`~numpy.ndarray` The size of each cap in the polygon. id : :class:`int` An arbitrary ID number. 
pixel : :class:`int` Pixel this polygon is in. use_caps : :class:`int` Bitmask indicating which caps to use. weight : :class:`float` Weight factor assigned to the polygon. """ def __init__(self, *args, **kwargs): try: a0 = args[0] except IndexError: a0 = None if isinstance(a0, fits.fitsrec.FITS_record): self._ncaps = int(a0['NCAPS']) self.weight = float(a0['WEIGHT']) self.pixel = int(a0['PIXEL']) try: self.id = int(a0['IFIELD']) except KeyError: self.id = -1 self._str = float(a0['STR']) self.use_caps = int(a0['USE_CAPS']) xcaps = a0['XCAPS'] if xcaps.shape == (3, ): self._x = np.zeros((1, 3), dtype=xcaps.dtype) + xcaps else: self._x = xcaps[0:self._ncaps, :].copy() # assert self._x.shape == (self._ncaps, 3) cmcaps = a0['CMCAPS'] if cmcaps.shape == (): self.cm = np.zeros((1,), dtype=cmcaps.dtype) + cmcaps else: self.cm = cmcaps[0:self._ncaps].copy() # assert self.cm.shape == (self._ncaps, ) elif isinstance(a0, ManglePolygon): self._ncaps = a0._ncaps self.weight = a0.weight self.pixel = a0.pixel self.id = a0.id self._str = a0._str self.use_caps = a0.use_caps self._x = a0._x.copy() self.cm = a0.cm.copy() elif kwargs: if 'x' in kwargs and 'cm' in kwargs: xs = kwargs['x'].shape cm = kwargs['cm'].shape assert xs[0] == cm[0] else: raise ValueError('Input values are missing!') self._ncaps = xs[0] if 'weight' in kwargs: self.weight = float(kwargs['weight']) else: self.weight = 1.0 if 'pixel' in kwargs: self.pixel = int(kwargs['pixel']) else: self.pixel = -1 if 'id' in kwargs: self.id = int(kwargs['id']) else: self.id = -1 if 'use_caps' in kwargs: self.use_caps = int(kwargs['use_caps']) else: # Use all caps by default self.use_caps = (1 << self.ncaps) - 1 self._x = kwargs['x'].copy() self.cm = kwargs['cm'].copy() if 'str' in kwargs: self._str = float(kwargs['str']) else: self._str = None else: # # An "empty" polygon represents the whole sky. # self._ncaps = 0 self.weight = 1.0 self.pixel = -1 self.id = -1 self.use_caps = 0 self._x = None self.cm = None self._str = 4.0*np.pi return @property def ncaps(self): """Number of caps in the polygon. """ return self._ncaps @property def x(self): """The orientation of each cap in the polygon. The direction is specified by a unit 3-vector. """ return self._x @property def str(self): """Solid angle of this polygon (steradians). """ if self._str is None: self._str = self.garea() return self._str def cmminf(self): """The index of the smallest cap in the polygon, accounting for negative caps and ``use_caps``. Returns ------- :class:`int` Integer index of the smallest cap. """ if self.ncaps == 0: return None cmmin = 2.0 kmin = -1 for k in range(self.ncaps): if is_cap_used(self.use_caps, k): if self.cm[k] >= 0: cmk = self.cm[k] else: cmk = 2.0 + self.cm[k] if cmk < cmmin: cmmin = cmk kmin = k return kmin def garea(self): """Compute the area of a polygon. See [1]_ for the detailed area formula, which is summarized here: * An empty polygon with no caps is defined to be the whole sky. * A polygon with one cap has area ``2*pi*self.cm``. * A polygon with at least one cap with an area less than :math:`2\pi` has an area less than :math:`2\pi`. * If every cap has an area greater than :math:`2\pi`, split the polygon into two smaller polygons and sum the two areas. Returns ------- :class:`float` The area of the polygon. References ---------- .. [1] `Hamilton, A. J. S.; Tegmark, Max, 2004 MNRAS 349, 115 `_. 
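#
# Editor's illustrative sketch: a single-cap polygon can be built directly
# from keyword arguments; per the rules above, one cap with cm >= 0 has
# area 2*pi*cm steradians.  The cap below is hypothetical.
#
import numpy as np
from pydl.pydlutils.mangle import ManglePolygon

x = np.array([[0.0, 0.0, 1.0]])                   # cap axis: the north pole
cm = np.array([1.0 - np.cos(np.radians(10.0))])   # a 10-degree cap
poly = ManglePolygon(x=x, cm=cm)
area = poly.garea()                               # 2*pi*cm[0] for one cap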
""" smallest_cap = self.cmminf() if smallest_cap is None: return 4.0 * np.pi cmmin = self.cm[smallest_cap] used_caps = list() for k in range(self.ncaps): if is_cap_used(self.use_caps, k): used_caps.append(k) nused_caps = len(used_caps) if nused_caps == 0: return 0.0 if nused_caps >= 2 and cmmin > 1.0: npl = nused_caps + 1 else: npl = nused_caps if npl == nused_caps: if nused_caps == 1: # # Short-circuit this case. # if cmmin < 0: return 2.0*np.pi*(2.0+cmmin) else: return 2.0*np.pi*cmmin else: # # Two or more caps, at least one has area < 2.0*pi. # return self._garea_helper() else: # # More than two caps, and all have area > 2.0*pi. # This section still needs to account for use_caps. # dpoly = self.polyn(self, smallest_cap) dpoly.cm[nused_caps] = cmmin / 2.0 area1 = dpoly._garea_helper() dpoly.cm[nused_caps] = -1.0 * dpoly.cm[nused_caps] area2 = dpoly._garea_helper() return area1 + area2 def _garea_helper(self): """Reproduces the Fortran routine garea in Mangle. *Placeholder for now.* Returns ------- :class:`float` Area of polygon in steradians. """ if self.gzeroar(): return 0.0 warnings.warn(("The ManglePolygon.garea() method is incomplete" + "and is returning a dummy value."), PydlutilsUserWarning) return 1.0 def gzeroar(self): """If at least one cap has zero area, then the whole polygon has zero area. Returns ------- :class:`bool` ``True`` if the area is zero. """ if self.ncaps == 0: # # Catch the case of an "allsky" polygon. # return False return (self.cm == 0.0).any() or (self.cm <= -2.0).any() def copy(self): """Return an exact copy of the polygon. Returns ------- :class:`ManglePolygon` A new polygon object. """ return ManglePolygon(self) def add_caps(self, x, cm): """Create a new polygon that contains additional caps. The caps are appended to the existing polygon's caps. Parameters ---------- x : :class:`~numpy.ndarray` or :class:`~numpy.recarray` ``X`` values of the new caps. cm : :class:`~numpy.ndarray` or :class:`~numpy.recarray` ``CM`` values of the new caps. Returns ------- :class:`ManglePolygon` A new polygon object. """ ncaps = self.ncaps + cm.size newdata = dict() newdata['weight'] = self.weight newdata['pixel'] = self.pixel newdata['id'] = self.id newdata['use_caps'] = self.use_caps newdata['x'] = np.zeros((ncaps, 3), dtype=self.x.dtype) newdata['cm'] = np.zeros((ncaps,), dtype=self.cm.dtype) newdata['x'][0:self.ncaps, :] = self.x.copy() newdata['cm'][0:self.ncaps] = self.cm.copy() newdata['x'][self.ncaps:, :] = x.copy() newdata['cm'][self.ncaps:] = cm.copy() return ManglePolygon(**newdata) def polyn(self, other, n, complement=False): """Intersection of a polygon with the `n` th cap of another polygon. Parameters ---------- other : :class:`~pydl.pydlutils.mangle.ManglePolygon` Polygon containing a cap to intersect the first polygon with. n : :class:`int` Index of the cap in `other`. complement : :class:`bool`, optional If ``True``, set the sign of the cm value of `other` to be the complement of the original value. Returns ------- :class:`~pydl.pydlutils.mangle.ManglePolygon` A polygon containing the intersected caps. """ x = other.x[n, :] sign = 1.0 if complement: sign = -1.0 cm = sign * other.cm[n] return self.add_caps(x, cm) class FITS_polygon(fits.FITS_rec): """Handle polygons read in from a FITS file. This class provides aliases for the columns in a typical FITS polygons file. 
""" _pkey = {'X': 'XCAPS', 'x': 'XCAPS', 'CM': 'CMCAPS', 'cm': 'CMCAPS', 'ID': 'IFIELD', 'id': 'IFIELD', 'ncaps': 'NCAPS', 'weight': 'WEIGHT', 'pixel': 'PIXEL', 'str': 'STR', 'use_caps': 'USE_CAPS'} # # Right now, this class is only instantiated by calling .view() on # a FITS_rec object, so only __array_finalize__ is needed. # # def __new__(*args): # self = super(FITS_polygon, self).__new__(*args) # self._caps = None # return self def __array_finalize__(self, obj): super(FITS_polygon, self).__array_finalize__(obj) def __getitem__(self, key): if isinstance(key, six.string_types): if key in self._pkey: return super(FITS_polygon, self).__getitem__(self._pkey[key]) return super(FITS_polygon, self).__getitem__(key) def __getattr__(self, key): if key in self._pkey: return super(FITS_polygon, self).__getattribute__(self._pkey[key]) raise AttributeError("FITS_polygon has no attribute {0}.".format(key)) def angles_to_x(points, latitude=False): """Convert spherical angles to unit Cartesian vectors. Parameters ---------- points : :class:`~numpy.ndarray` A set of angles in the form phi, theta (in degrees). latitude : :class:`bool`, optional If ``True``, assume that the angles actually represent longitude, latitude, or equivalently, RA, Dec. Returns ------- :class:`~numpy.ndarray` The corresponding Cartesian vectors. """ npoints, ncol = points.shape x = np.zeros((npoints, 3), dtype=points.dtype) phi = np.radians(points[:, 0]) if latitude: theta = np.radians(90.0 - points[:, 1]) else: theta = np.radians(points[:, 1]) st = np.sin(theta) x[:, 0] = np.cos(phi) * st x[:, 1] = np.sin(phi) * st x[:, 2] = np.cos(theta) return x def cap_distance(x, cm, points): """Compute the distance from a point to a cap, and also determine whether the point is inside or outside the cap. Parameters ---------- x : :class:`~numpy.ndarray` or :class:`~numpy.recarray` ``X`` value of the cap (3-vector). cm : :class:`~numpy.ndarray` or :class:`~numpy.recarray` ``CM`` value of the cap. points : :class:`~numpy.ndarray` or :class:`~numpy.recarray` If `points` is a 3-vector, or set of 3-vectors, then assume the point is a Cartesian unit vector. If `point` is a 2-vector or set of 2-vectors, assume the point is RA, Dec. Returns ------- :class:`~numpy.ndarray` The distance(s) to the point(s) in degrees. If the distance is negative, the point is outside the cap. """ npoints, ncol = points.shape if ncol == 2: xyz = angles_to_x(points, latitude=True) elif ncol == 3: xyz = points else: raise ValueError("Inappropriate shape for point!") dotprod = np.dot(xyz, x) cdist = np.degrees(np.arccos(1.0 - np.abs(cm)) - np.arccos(dotprod)) if cm < 0: cdist *= -1.0 return cdist def circle_cap(radius, points): """Construct caps based on input points (directions on the unit sphere). Parameters ---------- radius : :class:`float` or :class:`~numpy.ndarray` Radii of the caps in degrees. This will become the ``CM`` value. points : :class:`~numpy.ndarray` or :class:`~numpy.recarray` If `points` is a 3-vector, or set of 3-vectors, then assume the point is a Cartesian unit vector. If `point` is a 2-vector or set of 2-vectors, assume the point is RA, Dec. Returns ------- :func:`tuple` A tuple containing ``X`` and ``CM`` values for the cap. 
""" npoints, ncol = points.shape if ncol == 2: x = angles_to_x(points, latitude=True) elif ncol == 3: x = points.copy() else: raise ValueError("Inappropriate shape for point!") if isinstance(radius, float): radius = np.zeros((npoints,), dtype=points.dtype) + radius elif radius.shape == (): radius = np.zeros((npoints,), dtype=points.dtype) + radius elif radius.size == npoints: pass else: raise ValueError("radius does not appear to be the correct shape.") cm = 1.0 - np.cos(np.radians(radius)) return (x, cm) def is_cap_used(use_caps, i): """Returns ``True`` if a cap is used. Parameters ---------- use_caps : :class:`int` Bit mask indicating which cap is used. i : :class:`int` Number indicating which cap we are interested in. Returns ------- :class:`bool` ``True`` if a cap is used. """ return (use_caps & 1 << i) != 0 def is_in_cap(x, cm, points): """Are the points in a (single) given cap? Parameters ---------- x : :class:`~numpy.ndarray` or :class:`~numpy.recarray` ``X`` value of the cap (3-vector). cm : :class:`~numpy.ndarray` or :class:`~numpy.recarray` ``CM`` value of the cap. points : :class:`~numpy.ndarray` or :class:`~numpy.recarray` If `points` is a 3-vector, or set of 3-vectors, then assume the point is a Cartesian unit vector. If `point` is a 2-vector or set of 2-vectors, assume the point is RA, Dec. Returns ------- :class:`~numpy.ndarray` A boolean vector giving the result for each point. """ return cap_distance(x, cm, points) >= 0.0 def is_in_polygon(polygon, points, ncaps=0): """Are the points in a given (single) polygon? Parameters ---------- polygon : :class:`~pydl.pydlutils.mangle.ManglePolygon` A polygon object. points : :class:`~numpy.ndarray` or :class:`~numpy.recarray` If `points` is a 3-vector, or set of 3-vectors, then assume the point is a Cartesian unit vector. If `point` is a 2-vector or set of 2-vectors, assume the point is RA, Dec. ncaps : :class:`int`, optional If set, use only the first `ncaps` caps in `polygon`. Returns ------- :class:`~numpy.ndarray` A boolean vector giving the result for each point. """ npoints, ncol = points.shape p = dict() pmap = {'ncaps': 'NCAPS', 'use_caps': 'USE_CAPS', 'x': 'XCAPS', 'cm': 'CMCAPS'} for key in pmap: try: p[key] = getattr(polygon, key) except AttributeError: p[key] = polygon[pmap[key]] usencaps = p['ncaps'] if ncaps > 0: usencaps = min(ncaps, p['ncaps']) in_polygon = np.ones((npoints,), dtype=np.bool) for icap in range(usencaps): if is_cap_used(p['use_caps'], icap): in_polygon &= is_in_cap(p['x'][icap, :], p['cm'][icap], points) return in_polygon def is_in_window(polygons, points, ncaps=0): """Check to see if `points` lie within a set of `polygons`. Parameters ---------- polygons : :class:`PolygonList` or :class:`FITS_polygon` A set of polygons. points : :class:`~numpy.ndarray` or :class:`~numpy.recarray` If `points` is a 3-vector, or set of 3-vectors, then assume the point is a Cartesian unit vector. If `point` is a 2-vector or set of 2-vectors, assume the point is RA, Dec. ncaps : :class:`int`, optional If set, use only the first `ncaps` caps in `polygon`. This only exists to be passed to :func:`is_in_polygon`. Returns ------- :func:`tuple` A tuple containing two :class:`~numpy.ndarray`. First, a boolean vector giving the result for each point. Second, an integer vector giving the index of the polygon that contains the point. 
""" npoints, ncol = points.shape npoly = len(polygons) in_polygon = np.zeros((npoints, ), dtype=np.int32) - 1 curr_polygon = 0 while curr_polygon < npoly: indx_not_in = (in_polygon == -1).nonzero()[0] if len(indx_not_in) > 0: indx_in_curr_polygon = is_in_polygon(polygons[curr_polygon], points[indx_not_in], ncaps=ncaps) if indx_in_curr_polygon.any(): in_polygon[indx_not_in[indx_in_curr_polygon]] = curr_polygon curr_polygon += 1 return (in_polygon >= 0, in_polygon) def read_fits_polygons(filename, convert=False): """Read a "polygon" format FITS file. This function returns a subclass of :class:`~astropy.io.fits.FITS_rec` that provides some convenient aliases for the columns of a typical FITS polygon file (which might be all upper-case). Parameters ---------- filename : :class:`str` Name of FITS file to read. convert : :class:`bool`, optional If ``True``, convert the data to a list of :class:`ManglePolygon` objects. *Caution: This could result in some data being discarded!* Returns ------- :class:`~pydl.pydlutils.mangle.FITS_polygon` or :class:`list` The data contained in HDU 1 of the FITS file. """ with fits.open(filename, uint=True) as hdulist: data = hdulist[1].data if convert: poly = PolygonList() for k in range(data.size): poly.append(ManglePolygon(data[k])) else: poly = data.view(FITS_polygon) return poly def read_mangle_polygons(filename): """Read a "polygon" format ASCII file in Mangle's own format. These files typically have extension ``.ply`` or ``.pol``. Parameters ---------- filename : :class:`str` Name of FITS file to read. Returns ------- :class:`~pydl.pydlutils.mangle.PolygonList` A list-like object containing :class:`~pydl.pydlutils.mangle.ManglePolygon` objects and any metadata. """ with open(filename, 'rU') as ply: lines = ply.read().split(ply.newlines) try: npoly = int(lines[0].split()[0]) except ValueError: raise PydlutilsException(("Invalid first line of {0}! " + "Are you sure this is a Mangle " + "polygon file?").format(filename)) p_lines = [i for i, l in enumerate(lines) if l.startswith('polygon')] header = lines[1:p_lines[0]] poly = PolygonList(header=header) r1 = re.compile(r'polygon\s+(\d+)\s+\(([^)]+)\):') mtypes = {'str': float, 'weight': float, 'pixel': int, 'caps': int} for p in p_lines: m = r1.match(lines[p]) g = m.groups() pid = int(g[0]) meta = g[1].strip().split(',') m1 = [m.strip().split()[1] for m in meta] m0 = [mtypes[m1[i]](m.strip().split()[0]) for i, m in enumerate(meta)] metad = dict(zip(m1, m0)) metad['id'] = pid metad['x'] = list() metad['cm'] = list() for cap in lines[p+1:p+1+metad['caps']]: data = [float(d) for d in re.split(r'\s+', cap.strip())] metad['x'].append(data[0:3]) metad['cm'].append(data[-1]) metad['x'] = np.array(metad['x']) assert metad['x'].shape == (metad['caps'], 3) metad['cm'] = np.array(metad['cm']) assert metad['cm'].shape == (metad['caps'],) poly.append(ManglePolygon(**metad)) return poly def set_use_caps(polygon, index_list, add=False, tol=1.0e-10, allow_doubles=False, allow_neg_doubles=False): """Set the bits in use_caps for a polygon. Parameters ---------- polygon : :class:`~pydl.pydlutils.mangle.ManglePolygon` A polygon object. index_list : array-like A list of indices of caps to set in the polygon. Should be no longer, nor contain indices greater than the number of caps (``polygon.ncaps``). add : :class:`bool`, optional If ``True``, don't initialize the use_caps value to zero, use the existing value associated with `polygon` instead. tol : :class:`float`, optional Tolerance used to determine whether two caps are identical. 
allow_doubles : :class:`bool`, optional Normally, this routine automatically sets use_caps such that no two caps with use_caps set are identical. allow_neg_doubles : :class:`bool`, optional Normally, two caps that are identical except for the sign of `cm` would be set unused. This inhibits that behaviour. Returns ------- :class:`int` Value of use_caps. """ if not add: polygon.use_caps = 0 t2 = tol**2 for i in index_list: polygon.use_caps |= (1 << index_list[i]) if not allow_doubles: # # Check for doubles # for i in range(polygon.ncaps): if is_cap_used(polygon.use_caps, i): for j in range(i+1, polygon.ncaps): if is_cap_used(polygon.use_caps, j): if np.sum((polygon.x[i, :] - polygon.x[j, :])**2) < t2: if ((np.absolute(polygon.cm[i] - polygon.cm[j]) < tol) or ((polygon.cm[i] + polygon.cm[j]) < tol and not allow_neg_doubles)): # # Don't use # polygon.use_caps -= 1 << j return polygon.use_caps def x_to_angles(points, latitude=False): """Convert unit Cartesian vectors to spherical angles. Parameters ---------- points : :class:`~numpy.ndarray` A set of x, y, z coordinates. latitude : :class:`bool`, optional If ``True``, convert the angles to longitude, latitude, or equivalently, RA, Dec. Returns ------- :class:`~numpy.ndarray` The corresponding spherical angles. """ npoints, ncol = points.shape phi = np.degrees(np.arctan2(points[:, 1], points[:, 0])) r = (points**2).sum(1) theta = np.degrees(np.arccos(points[:, 2]/r)) if latitude: theta = 90.0 - theta x = np.zeros((npoints, 2), dtype=points.dtype) x[:, 0] = phi x[:, 1] = theta return x def _single_polygon(poly): """Given certain classes of `poly`, return a :class:`ManglePolygon` object. This function is used to allow various types of input in functions that take a *single* polygon as an argument. Parameters ---------- poly : object An object containing data that could potentially be converted into a :class:`ManglePolygon`. Returns ------- ManglePolygon The resulting *single* polygon object. Raises ------ :exc:`ValueError` If the conversion is not possible. """ if isinstance(poly, ManglePolygon): return poly if isinstance(poly, PolygonList) and len(poly) == 1: return poly[0] if isinstance(poly, FITS_polygon) and len(poly) == 1: return ManglePolygon(poly[0]) if isinstance(poly, fits.fitsrec.FITS_record): return ManglePolygon(poly) raise ValueError("Can't convert input into a single polygon!") pydl-0.7.0/pydl/pydlutils/coord.py0000644000076500000240000001520713434104050017562 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the coord directory in idlutils. """ import numpy as np import astropy.units as u import astropy.coordinates as ac from ..goddard.astro import get_juldate class SDSSMuNu(ac.BaseCoordinateFrame): """SDSS Great Circle Coordinates Attributes ---------- stripe SDSS `Stripe Number`_ . node Node of the great circle with respect to the celestial equator. In SDSS, this is almost always RA = 95.0 degrees. incl Inclination of the great circle with respect to the celestial equator. phi Counter-clockwise position angle w.r.t. north for an arc in the +nu direction. Parameters ---------- mu : :class:`~astropy.coordinates.Angle` Angle corresponding to longitude measured along a stripe. nu : :class:`~astropy.coordinates.Angle` Angle corresponding to latitude measured perpendicular to a stripe. Notes ----- https://www.sdss.org/dr14/algorithms/surveycoords/ .. 
_`Stripe Number`: https://www.sdss.org/dr14/help/glossary/#stripe """ default_representation = ac.SphericalRepresentation frame_specific_representation_info = { 'spherical': [ ac.RepresentationMapping(reprname='lon', framename='mu', defaultunit=u.deg), ac.RepresentationMapping(reprname='lat', framename='nu', defaultunit=u.deg) ] } frame_specific_representation_info['unitspherical'] = ( frame_specific_representation_info['spherical']) stripe = ac.Attribute(default=0) node = ac.QuantityAttribute(default=ac.Angle(95.0, unit=u.deg), unit=u.deg) # phi = ac.QuantityFrameAttribute(default=None, unit=u.deg) @property def incl(self): return ac.Angle(stripe_to_incl(self.stripe), unit=u.deg) def current_mjd(): """Return the current MJD. """ return get_juldate() - 2400000.5 @ac.frame_transform_graph.transform(ac.FunctionTransform, SDSSMuNu, ac.ICRS) def munu_to_radec(munu, icrs_frame): """Convert from SDSS great circle coordinates to equatorial coordinates. Parameters ---------- munu : :class:`~pydl.pydlutils.coord.SDSSMuNu` SDSS great circle coordinates (mu, nu). Returns ------- :class:`~astropy.coordinates.ICRS` Equatorial coordinates (RA, Dec). """ # from pydlutils.coord import stripe_to_eta # from pydlutils.goddard.misc import cirrange # if 'stripe' in kwargs: # node = 95.0 # incl = stripe_to_incl(kwargs['stripe']) # elif 'node' in kwargs and 'incl' in kwargs: # node = kwargs['node'] # incl = kwargs['incl'] # else: # raise ValueError('Must specify either STRIPE or NODE,INCL!') # if mu.size != nu.size: # raise ValueError('Number of elements in MU and NU must agree!') sinnu = np.sin(munu.nu.to(u.radian).value) cosnu = np.cos(munu.nu.to(u.radian).value) sini = np.sin(munu.incl.to(u.radian).value) cosi = np.cos(munu.incl.to(u.radian).value) sinmu = np.sin((munu.mu - munu.node).to(u.radian).value) cosmu = np.cos((munu.mu - munu.node).to(u.radian).value) xx = cosmu * cosnu yy = sinmu * cosnu * cosi - sinnu * sini zz = sinmu * cosnu * sini + sinnu * cosi ra = ac.Angle(np.arctan2(yy, xx), unit=u.radian) + munu.node dec = ac.Angle(np.arcsin(zz), unit=u.radian) # if 'phi' in kwargs: # phi = np.rad2deg(np.arctan2(cosmu * sini, # (-sinmu * sinnu * sini + cosnu * cosi)*cosnu)) # return (ra, dec, phi) # else: # return (ra, dec) return ac.ICRS(ra=ra, dec=dec).transform_to(icrs_frame) @ac.frame_transform_graph.transform(ac.FunctionTransform, ac.ICRS, SDSSMuNu) def radec_to_munu(icrs_frame, munu): """Convert from equatorial coordinates to SDSS great circle coordinates. Parameters ---------- icrs_frame : :class:`~astropy.coordinates.ICRS` Equatorial coordinates (RA, Dec). Returns ------- :class:`~pydl.pydlutils.coord.SDSSMuNu` SDSS great circle coordinates (mu, nu). 
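#
# Editor's illustrative sketch: because SDSSMuNu is registered in the
# astropy frame transform graph, transform_to() works in both directions.
# The coordinate values below are hypothetical.
#
import astropy.units as u
from astropy.coordinates import ICRS
from pydl.pydlutils.coord import SDSSMuNu

munu = SDSSMuNu(mu=320.0*u.deg, nu=0.5*u.deg, stripe=82)
eq = munu.transform_to(ICRS())                 # great circle -> RA, Dec
back = eq.transform_to(SDSSMuNu(stripe=82))    # RA, Dec -> mu, nu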
""" # from pydlutils.coord import stripe_to_eta # from pydlutils.goddard.misc import cirrange # if 'stripe' in kwargs: # node = 95.0 # incl = stripe_to_incl(kwargs['stripe']) # elif 'node' in kwargs and 'incl' in kwargs: # node = kwargs['node'] # incl = kwargs['incl'] # else: # raise ValueError('Must specify either STRIPE or NODE,INCL!') # if ra.size != dec.size: # raise ValueError('Number of elements in RA and DEC must agree!') sinra = np.sin((icrs_frame.ra - munu.node).to(u.radian).value) cosra = np.cos((icrs_frame.ra - munu.node).to(u.radian).value) sindec = np.sin(icrs_frame.dec.to(u.radian).value) cosdec = np.cos(icrs_frame.dec.to(u.radian).value) sini = np.sin(munu.incl.to(u.radian).value) cosi = np.cos(munu.incl.to(u.radian).value) x1 = cosdec * cosra y1 = cosdec * sinra z1 = sindec x2 = x1 y2 = y1 * cosi + z1 * sini z2 = -y1 * sini + z1 * cosi mu = ac.Angle(np.arctan2(y2, x2), unit=u.radian) + munu.node nu = ac.Angle(np.arcsin(z2), unit=u.radian) # if 'phi' in kwargs: # sinnu = np.sin(np.deg2rad(nu)) # cosnu = np.cos(np.deg2rad(nu)) # sinmu = np.sin(np.deg2rad(mu-node)) # cosmu = np.cos(np.deg2rad(mu-node)) # phi = np.rad2deg(np.arctan2(cosmu * sini, # (-sinmu * sinnu * sini + cosnu * cosi)*cosnu)) # return (ra, dec, phi) # else: # return (ra, dec) return SDSSMuNu(mu=mu, nu=nu, stripe=munu.stripe) def stripe_to_eta(stripe): """Convert from SDSS great circle coordinates to equatorial coordinates. Parameters ---------- stripe : :class:`int` or :class:`numpy.ndarray` SDSS Stripe number. Returns ------- :class:`float` or :class:`numpy.ndarray` The eta value in the SDSS (lambda,eta) coordinate system. """ stripe_sep = 2.5 eta = stripe * stripe_sep - 57.5 if stripe > 46: eta -= 180.0 return eta def stripe_to_incl(stripe): """Convert from SDSS stripe number to an inclination relative to the equator. Parameters ---------- stripe : :class:`int` or :class:`numpy.ndarray` SDSS Stripe number. Returns ------- :class:`float` or :class:`numpy.ndarray` Inclination of the stripe relative to the equator (Dec = 0). """ dec_center = 32.5 eta_center = stripe_to_eta(stripe) incl = eta_center + dec_center return incl pydl-0.7.0/pydl/pydlutils/tests/0000755000076500000240000000000013434104632017245 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlutils/tests/test_bspline.py0000644000076500000240000000736613434104050022320 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import catch_warnings, raises from .. import PydlutilsUserWarning from ..bspline import cholesky_band, cholesky_solve, iterfit from ... import smooth class TestBspline(object): """Test the functions in pydl.pydlutils.bspline. 
""" def setup(self): pass def teardown(self): pass def test_iterfit(self): y0 = np.array([0.661984, 0.134913, 0.0410350, 0.940134, 0.411034, 0.484675, 0.169943, 0.325046, 0.269194, 0.552381, 0.797177, 0.971658, 0.251765, 0.531675, 0.854556, 0.411237, 0.694380, 0.499562, 0.437242, 0.362451, 0.343206, 0.524099, 0.158634, 0.728597, 0.198340, 0.571210, 0.477527, 0.962797, 0.973921, 0.413651, 0.736380, 0.516366, 0.104283, 0.675993, 0.467053, 0.230112, 0.866994, 0.469885, 0.964392, 0.541084, 0.332984, 0.581252, 0.422322, 0.872555, 0.803636, 0.520998, 0.918942, 0.241564, 0.169263, 0.686649, 0.708284, 0.707858, 0.00113957, 0.827920, 0.845985, 0.416961, 0.553842, 0.526549, 0.501051, 0.337514, 0.700873, 0.152816, 0.762935, 0.650039, 0.483321, 0.708600, 0.410033, 0.507671, 0.596956, 0.177692, 0.498112, 0.422037, 0.788333, 0.856578, 0.941245, 0.432411, 0.356469, 0.341916, 0.0331059, 0.641100, 0.690452, 0.168667, 0.915178, 0.158406, 0.701508, 0.841774, 0.434161, 0.153123, 0.420066, 0.0499331, 0.947241, 0.0768818, 0.410540, 0.843788, 0.0640255, 0.513463, 0.511104, 0.680434, 0.762480, 0.0563867]) assert y0.size == 100 y = smooth(y0, 10) assert y.size == 100 x = np.arange(y0.size, dtype='d') sset, outmask = iterfit(x, y, nord=3, maxiter=0, bkspace=10) assert sset.npoly == 1 assert sset.funcname == 'legendre' # print(sset) # yfit,mask = sset.value(x) # print(yfit) # pylab.plot(x, y, 'k-', x, yfit, 'r-') def test_cholesky_band(self): ab = np.array([[ 8., 9., 10., 11., 12., 13., 14., 0., 0., 0.], [ 1., 2., 3., 4., 5., 6., 0., 0., 0., 0.], [ 1., 2., 3., 4., 5., 0., 0., 0., 0., 0.]]) l = np.array([[2.82842712, 2.97909382, 3.07877788, 3.1382224, 3.16559183, 3.16295604, 3.12782377, 0., 0., 0.], [0.35355339, 0.62938602, 0.8371714, 1.01466666, 1.17093392, 1.31223108, 0., 0., 0., 0.], [0.35355339, 0.67134509, 0.97441261, 1.27460692, 1.57948348, 0., 0., 0., 0., 0.]]) i, ll = cholesky_band(ab) assert np.allclose(l, ll) ab[0, 0] = 1.0e-6 with catch_warnings(PydlutilsUserWarning) as w: i, ll = cholesky_band(ab, mininf=1.0e-5) assert len(w) > 0 assert str(w[0].message) == "Bad entries: [0]" assert i == 0 # ab[0, :] = np.array([ 1., 2., 3., 4., 5., 6., 7., 0., 0., 0.]) # with catch_warnings(PydlutilsUserWarning) as w: # i, ll = cholesky_band(ab) # assert len(w) > 0 def test_cholesky_solve(self): l = np.array([[2.82842712, 2.97909382, 3.07877788, 3.1382224, 3.16559183, 3.16295604, 3.12782377, 0., 0., 0.], [0.35355339, 0.62938602, 0.8371714, 1.01466666, 1.17093392, 1.31223108, 0., 0., 0., 0.], [0.35355339, 0.67134509, 0.97441261, 1.27460692, 1.57948348, 0., 0., 0., 0., 0.]]) b = np.ones((10,), dtype=l.dtype) xx = np.array([0.10848432, 0.07752049, 0.05460496, 0.04231068, 0.02116434, 0.03276732, 0.04982674, 0., 0., 0.]) x = cholesky_solve(l, b) assert np.allclose(x, xx) pydl-0.7.0/pydl/pydlutils/tests/test_cooling.py0000644000076500000240000000170612672535402022322 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import raises from ..cooling import read_ds_cooling class TestCooling(object): """Test the functions in pydl.pydlutils.cooling. 
""" def setup(self): pass def teardown(self): pass def test_read_ds_cooling(self): with raises(ValueError): logT, loglambda = read_ds_cooling('m-99.cie') logT, logL = read_ds_cooling('m-15.cie') assert np.allclose(logT[0:5], np.array([4.0, 4.05, 4.1, 4.15, 4.2])) assert np.allclose(logL[0:5], np.array([-26.0, -24.66, -23.52, -22.62, -22.11])) logT = np.array([4.025, 4.125, 4.225, 4.325]) logT2, logL = read_ds_cooling('m-15.cie', logT) assert np.allclose(logT2, logT) assert np.allclose(logL, np.array([-25.33, -23.07, -22.04, -22.06])) pydl-0.7.0/pydl/pydlutils/tests/test_misc.py0000644000076500000240000001715013064020340021605 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- from os import remove import numpy as np import tempfile from astropy.tests.helper import raises from .. import PydlutilsException from ..misc import djs_laxisgen, djs_laxisnum, hogg_iau_name, struct_print class TestMisc(object): """Test the functions in pydl.pydlutils.misc. """ def setup(self): pass def teardown(self): pass def test_djs_laxisgen(self): # # 1d # assert (np.arange(4, dtype='i4') == djs_laxisgen((4,))).all() # # 2d # l = np.array([[0, 0, 0, 0], [1, 1, 1, 1], [2, 2, 2, 2], [3, 3, 3, 3]], dtype='i4') assert (l == djs_laxisgen((4, 4))).all() assert (l.T == djs_laxisgen((4, 4), iaxis=1)).all() with raises(ValueError): foo = djs_laxisgen((4, 4), iaxis=2) # # 3d # l = np.zeros((4, 4, 4), dtype='i4') l[1, :, :] = 1 l[2, :, :] = 2 l[3, :, :] = 3 assert (l == djs_laxisgen((4, 4, 4))).all() assert (l.swapaxes(0, 1) == djs_laxisgen((4, 4, 4), iaxis=1)).all() assert (l.swapaxes(0, 2) == djs_laxisgen((4, 4, 4), iaxis=2)).all() with raises(ValueError): foo = djs_laxisgen((4, 4, 4), iaxis=3) # # More d # with raises(ValueError): foo = djs_laxisgen((4, 4, 4, 4)) def test_djs_laxisnum(self): # # 1d # assert (np.zeros((4,), dtype='i4') == djs_laxisnum((4,))).all() # # 2d # l = np.array([[0, 0, 0, 0], [1, 1, 1, 1], [2, 2, 2, 2], [3, 3, 3, 3]], dtype='i4') assert (l == djs_laxisnum((4, 4))).all() assert (l.T == djs_laxisnum((4, 4), iaxis=1)).all() with raises(ValueError): foo = djs_laxisnum((4, 4), iaxis=2) # # 3d # l = np.zeros((4, 4, 4), dtype='i4') l[1, :, :] = 1 l[2, :, :] = 2 l[3, :, :] = 3 assert (l == djs_laxisnum((4, 4, 4))).all() assert (l.swapaxes(0, 1) == djs_laxisnum((4, 4, 4), iaxis=1)).all() assert (l.swapaxes(0, 2) == djs_laxisnum((4, 4, 4), iaxis=2)).all() with raises(ValueError): foo = djs_laxisnum((4, 4, 4), iaxis=3) # # More d # with raises(ValueError): foo = djs_laxisnum((4, 4, 4, 4)) def test_hogg_iau_name(self): assert (hogg_iau_name(354.120375, -0.544777778) == 'SDSS J233628.89-003241.2') assert (hogg_iau_name(354.120375, -0.544777778, prefix='2MASS') == '2MASS J233628.89-003241.2') assert (hogg_iau_name(354.120375, -0.544777778, prefix='') == 'J233628.89-003241.2') assert (hogg_iau_name(354.120375, -0.544777778, precision=0) == 'SDSS J233628.8-003241') assert (hogg_iau_name(354.120375, -0.544777778, precision=2) == 'SDSS J233628.890-003241.20') ra = np.array([354.120375, 7.89439, 36.31915, 110.44730]) dec = np.array([-0.544777778, -0.35157, 0.47505, 39.35352]) names = hogg_iau_name(ra, dec) assert tuple(names) == ('SDSS J233628.89-003241.2', 'SDSS J003134.65-002105.6', 'SDSS J022516.59+002830.1', 'SDSS J072147.35+392112.6') def test_struct_print(self): slist = np.zeros((5,), dtype=[('a', 'c16'), ('b', np.bool)]) with raises(PydlutilsException): lines, css = struct_print(slist, silent=True) slist = np.array([(1, 2.34, 'five'), (2, 
3.456, 'seven'), (3, -4.5678, 'nine')], dtype=[('a', 'i4'), ('bb', 'f4'), ('ccc', 'S5')]) lines, css = struct_print(slist, silent=True) assert lines[0] == 'a bb ccc ' assert lines[1] == '- ------------ -----' assert lines[2] == '1 2.34 five ' assert lines[3] == '2 3.456 seven' assert lines[4] == '3 -4.5678 nine ' assert len(css) == 0 lines, css = struct_print(slist, silent=True, alias={'ccc': 'c'}) assert lines[0] == 'a bb c ' assert lines[1] == '- ------------ -----' assert lines[2] == '1 2.34 five ' assert lines[3] == '2 3.456 seven' assert lines[4] == '3 -4.5678 nine ' assert len(css) == 0 lines, css = struct_print(slist, silent=True, formatcodes={'a': '{0:02d}'}) assert lines[0] == 'a bb ccc ' assert lines[1] == '-- ------------ -----' assert lines[2] == '01 2.34 five ' assert lines[3] == '02 3.456 seven' assert lines[4] == '03 -4.5678 nine ' assert len(css) == 0 lines, css = struct_print(slist, silent=True, fdigit=3) assert lines[0] == 'a bb ccc ' assert lines[1] == '- ---------- -----' assert lines[2] == '1 2.34 five ' assert lines[3] == '2 3.46 seven' assert lines[4] == '3 -4.57 nine ' assert len(css) == 0 lines, css = struct_print(slist, silent=True, html=True) assert lines[0] == '' assert lines[1] == '' assert lines[2] == '' assert lines[3] == '' assert lines[4] == '' assert lines[5] == '
</table>
' assert css[0] == '' slist = np.array([(1, 2.34, 'five'), (2, 3.456, 'seven'), (3, -4.5678, 'nine')], dtype=[('a', 'i4'), ('bb', 'f8'), ('ccc', 'S5')]) lines, css = struct_print(slist, silent=True, ddigit=3) assert lines[0] == 'a bb ccc ' assert lines[1] == '- ---------- -----' assert lines[2] == '1 2.34 five ' assert lines[3] == '2 3.46 seven' assert lines[4] == '3 -4.57 nine ' assert len(css) == 0 with tempfile.NamedTemporaryFile(delete=False) as spf1: spf1_name = spf1.name lines, css = struct_print(slist, silent=True, filename=spf1_name) with open(spf1_name, 'rb') as f: data = f.read().decode('utf-8') assert "\n".join(lines)+"\n" == data remove(spf1_name) with tempfile.TemporaryFile() as spf2: lines, css = struct_print(slist, silent=True, filename=spf2) spf2.seek(0) data = spf2.read().decode('utf-8') assert "\n".join(lines)+"\n" == data pydl-0.7.0/pydl/pydlutils/tests/test_image.py0000644000076500000240000001052612632466352021755 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import raises from ..image import djs_maskinterp1, djs_maskinterp class TestImage(object): """Test the functions in pydl.pydlutils.image. """ def setup(self): pass def teardown(self): pass def test_djs_maskinterp1(self): y = np.array([0.0, 1.0, 2.0, 3.0, 4.0], dtype=np.float64) # Test all good yi = djs_maskinterp1(y, np.zeros(y.shape, dtype=np.int32)) assert (yi == y).all() # Test all bad yi = djs_maskinterp1(y, np.ones(y.shape, dtype=np.int32)) assert (yi == y).all() # Test one good value yt = np.zeros(y.shape, dtype=y.dtype) + y[0] yi = djs_maskinterp1(y, np.arange(len(y))) assert (yi == yt).all() # Test two bad values yi = djs_maskinterp1(y, np.array([0, 1, 0, 1, 0])) assert np.allclose(y, yi) # Because of the default behavior of np.interp(), the 'const' # keyword has no actual effect, because it performs the # same operation. 
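        # Illustrative check of that claim: the good points below are
        # x = [0, 2, 3] with y = [0.0, 2.0, 3.0], and
        # np.interp(4.0, [0.0, 2.0, 3.0], [0.0, 2.0, 3.0]) returns 3.0, the
        # last good value, which is exactly the constant extrapolation that
        # const=True would request, so both calls give the same answer.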
yt = y.copy() yt[4] = 3.0 yi = djs_maskinterp1(y, np.array([0, 1, 0, 0, 1])) assert np.allclose(yt, yi) yi = djs_maskinterp1(y, np.array([0, 1, 0, 0, 1]), const=True) assert np.allclose(yt, yi) yt = y.copy() yt[0] = 1.0 yi = djs_maskinterp1(y, np.array([1, 0, 1, 0, 0]), const=True) assert np.allclose(yt, yi) # # Test other x values # x = np.array([0.0, 0.5, 1.0, 1.5, 2.0], dtype=y.dtype) yi = djs_maskinterp1(y, np.array([0, 1, 0, 1, 0]), xval=x) assert np.allclose(y, yi) yt = y.copy() yt[4] = 3.0 yi = djs_maskinterp1(y, np.array([0, 1, 0, 0, 1]), xval=x) assert np.allclose(yt, yi) yi = djs_maskinterp1(y, np.array([0, 1, 0, 0, 1]), xval=x, const=True) assert np.allclose(yt, yi) yt = y.copy() yt[0] = 1.0 yi = djs_maskinterp1(y, np.array([1, 0, 1, 0, 0]), xval=x, const=True) assert np.allclose(yt, yi) def test_djs_maskinterp(self): y = np.array([0.0, 1.0, 2.0, 3.0, 4.0], dtype=np.float64) mask = np.array([0, 1, 0]) with raises(ValueError): yi = djs_maskinterp(y, mask) mask = np.array([0, 1, 0, 0, 0]) x = np.array([0.0, 0.5, 1.0], dtype=y.dtype) with raises(ValueError): yi = djs_maskinterp(y, mask, xval=x) # 1-D case yi = djs_maskinterp(y, mask) assert np.allclose(y, yi) # 2-D case x = np.array([0.0, 0.5, 1.0, 1.5, 2.0], dtype=y.dtype) x = np.vstack((x, x, x)) y = np.vstack((y, y, y)) mask = np.vstack((mask, mask, mask)) with raises(ValueError): yi = djs_maskinterp(y, mask) with raises(ValueError): yi = djs_maskinterp(y, mask, axis=-1) with raises(ValueError): yi = djs_maskinterp(y, mask, axis=2) yi = djs_maskinterp(y, mask, axis=0) assert np.allclose(y, yi) yi = djs_maskinterp(y, mask, axis=0, xval=x) assert np.allclose(y, yi) mask[:, 1] = 0 mask[1, :] = 1 yi = djs_maskinterp(y, mask, axis=1) assert np.allclose(y, yi) yi = djs_maskinterp(y, mask, axis=1, xval=x) assert np.allclose(y, yi) # 3-D case x = np.dstack((x, x, x, x, x, x, x)) y = np.dstack((y, y, y, y, y, y, y)) mask = np.dstack((mask, mask, mask, mask, mask, mask, mask)) mask[:, :, :] = 0 mask[:, :, 5] = 1 yi = djs_maskinterp(y, mask, axis=0) assert np.allclose(y, yi) yi = djs_maskinterp(y, mask, axis=0, xval=x) assert np.allclose(y, yi) mask[:, :, :] = 0 mask[:, 3, :] = 1 yi = djs_maskinterp(y, mask, axis=1) assert np.allclose(y, yi) yi = djs_maskinterp(y, mask, axis=1, xval=x) assert np.allclose(y, yi) mask[:, :, :] = 0 mask[1, :, :] = 1 yi = djs_maskinterp(y, mask, axis=2) assert np.allclose(y, yi) yi = djs_maskinterp(y, mask, axis=2, xval=x) assert np.allclose(y, yi) # 4-D case y = np.random.random((2, 2, 2, 2)) with raises(ValueError): yi = djs_maskinterp(y, (y > 0.5), axis=0) pydl-0.7.0/pydl/pydlutils/tests/__init__.py0000644000076500000240000000021412632466352021364 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This is the pydl/pydlutils/tests directory. """ pydl-0.7.0/pydl/pydlutils/tests/t/0000755000076500000240000000000013434104632017510 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlutils/tests/t/test.par0000644000076500000240000000475112664677162021223 0ustar weaverstaff00000000000000#%yanny # test.par # # FTCL/Yanny file for testing the perl & python yanny readers. 
# # This file is much more complicated than the usual yanny file, but should # be completely compliant with the specifications in # http://www.sdss.org/dr6/dm/flatFiles/yanny.html # # $Id: test.par 114735 2010-07-02 22:15:51Z weaver $ # mjd 54579 # a keyword/value pair alpha beta gamma delta # another pair semicolon This pair contains a semicolon; # semicolon test typedef enum { FALSE, TRUE } BOOLEAN; typedef enum { FAILURE, INCOMPLETE, SUCCESS } STATUS; typedef struct { float mag[5]; char b[5][]; char foo[25]; double c; int flags[2]; BOOLEAN new_flag; } MYSTRUCT; typedef struct { float foo<3>; # This is archaic array notation, strongly deprecated, char bar<10>; # but still technically supported. } OLD; typedef struct { STATUS state; char timestamp[]; #UTC timestamp in format 2008-06-21T00:27:33 } STATUS_UPDATE; mystruct {17.5 17.546 17.4 16.1 16.0} {the rain in "spain is" wet} \ mystruct 1.24345567 {123123 1231213} FALSE mystruct {17.5 17.446 17.4 16.1 16.0} {the snow in chile "is dry"} \ "My dog has no nose." 7.24345567 {123123 0} TRUE mystruct {17.5 17.446 17.4 16.1 16.0} {this string is empty ""} \ "" 7.24345567 {123123 0} TRUE mystruct {17.5 17.446 17.4 16.1 16.0} \ {"this array contains empty strings" "" "this array contains empty strings" "" empty} \ "" 7.24345567 {123123 0} TRUE mystruct {17.5 17.446 17.4 16.1 16.0} { this array doesn't contain quotes} \ "" 7.24345567 {123123 0} TRUE mystruct {19.3 18.2 17.1 16.0 15.9} {this array contains "#hash" characters} \ "#hashtag" 2.71828 {321321 54321} FALSE mystruct {19.3 18.2 17.1 16.0 15.9} {there is "a comment" "with a #hash" "at the end"} \ "#hashtag #hash" 2.71828 {321321 54321} FALSE # this last item should be "FALSE" status_update INCOMPLETE 2008-06-21T00:27:33 status_update SUCCESS 2008-06-21T02:27:33 status_update FAILURE 2008-06-21T03:27:33 STATUS_UPDATE SUCCESS 2008-06-22T00:27:33 STATUS_UPDATE FAILURE 2008-06-22T01:27:33 STATUS_UPDATE SUCCESS 2008-06-22T00:27:33 STATUS_UPDATE FAILURE 2008-06-22T01:27:33 # Appended by FTCL.pm at 2008-06-23 23:09:58 UTC. STATUS_UPDATE SUCCESS 2008-06-22T00:27:33 STATUS_UPDATE FAILURE 2008-06-22T01:27:33 # Appended by FTCL.pm at 2008-06-23 23:26:52 UTC. 
STATUS_UPDATE SUCCESS 2008-06-22T00:27:33 STATUS_UPDATE FAILURE 2008-06-22T01:27:33 # old {1.2 2.3 3.4} hello old {4.3 3.2 2.1} { { } } pydl-0.7.0/pydl/pydlutils/tests/t/polygon.ply0000644000076500000240000000154112672535402021734 0ustar weaverstaff000000000000004 polygons pixelization 6s snapped balkanized polygon 0 ( 1 caps, 0 weight, 6.283185307179586 str): 0.0436193873653360 0.9990482215818578 0.0000000000000000 -1 polygon 1 ( 2 caps, 0 weight, 6.108652381980153 str): 0.0436193873653360 0.9990482215818578 0.0000000000000000 1 -0.0436193873653360 0.9990482215818578 0.0000000000000000 1 polygon 2 ( 3 caps, 0 weight, 0.083459952963577 str): 0.0436193873653360 0.9990482215818578 0.0000000000000000 1 -0.0436193873653360 0.9990482215818578 0.0000000000000000 -1 0.0000000000000000 0.0000000000000000 1.0000000000000000 -1.043619387365336 polygon 3 ( 3 caps, 0 weight, 0.083459952963577 str): 0.0436193873653360 0.9990482215818578 0.0000000000000000 1 -0.0436193873653360 0.9990482215818578 0.0000000000000000 -1 0.0000000000000000 0.0000000000000000 1.0000000000000000 0.956380612634664 pydl-0.7.0/pydl/pydlutils/tests/t/boss_traceset.fits0000644000076500000240000007570012632466352023261 0ustar weaverstaff00000000000000SIMPLE = T / conforms to FITS standard BITPIX = 8 / array data type NAXIS = 0 / number of array dimensions EXTEND = T END XTENSION= 'BINTABLE' /Binary table written by MWRFITS v1.11 BITPIX = 8 /Required value NAXIS = 2 /Required value NAXIS1 = 24036 /Number of bytes per row NAXIS2 = 1 /Number of rows PCOUNT = 0 /Normally 0 (no varying arrays) GCOUNT = 1 /Required value TFIELDS = 7 /Number of columns in table COMMENT COMMENT *** End of mandatory fields *** COMMENT COMMENT COMMENT *** Column names *** COMMENT TTYPE1 = 'FUNC ' / TTYPE2 = 'XMIN ' / TTYPE3 = 'XMAX ' / TTYPE4 = 'COEFF ' / TTYPE5 = 'XJUMPLO ' / TTYPE6 = 'XJUMPHI ' / TTYPE7 = 'XJUMPVAL' / COMMENT COMMENT *** Column formats *** COMMENT TFORM1 = '8A ' / TFORM2 = 'D ' / TFORM3 = 'D ' / TFORM4 = '3000D ' / TFORM5 = 'E ' / TFORM6 = 'E ' / TFORM7 = 'E ' / COMMENT COMMENT *** Column dimensions (2 D or greater) *** COMMENT TDIM4 = '( 6, 500)' / END legendre@@(oA? =]&]U(ap?T!s0o46@@(þ?fl@xod`?T0- G)`l> @(L\6?QAM|@?T1|"uˠfA3K@(Lqi?Vֹ|-z?T2#n ; ``GV@(A#?Š@P?T2"pWt@@(@J?Z$ۺDiPI?T2b8wŎ@(=5PN?y0|?T3r L@(9?l쮢P?T3(W l@(1n?8Tا@?T4A鋀~`gG\@(.?I!eRL?0?T4{ŀXebH@('O?Ӫ7 ?T5xM3-\0@($<'?apڿ1P?T6~6@WG@(U:?M}?T6@N R\@(:?.~>Wna?T7?9vM%@@(3?># U\p?T7a~.I@(r?\'cp?T8h|N D[P#@@(T?o[=r6@?T8VX ?@@( Ci?׆0JF-0?T9I5d;2M@(#b?a41~?T:C{kj6A@'oB?='alF0?T:@좀`2f@@'SW?^YHl?T;0.]\.m6@'̟ݺ?_'b-|G`?T; aO} **3@@'م ?K*[?TZ] +$@@'if?Vb1We0?T>ٱ2*`=K@@'G?dze0?T?\n x@^@'{$?xF2?T? H_h @'xnL?BtGḭ?T@X7d=@'(g?"QDZ?T@'=a:`=\@@'-?Z8ܿܐ?TANHB&1@']EH?aXՄ.?TA]sԀ"݋f@'p?U#BvRP?TB@n >@@' 1!?JQYҿ.S ?TB&@x@'&uk?*yMIP?TC*Cm@c@'4$5?2^  ?TCtByۊ@H@'cXFW?.ħ:rlRJ?TDlZ =#@'ZV?*$ RK0?TDlNа@'?VS?TDGiOOe@|@@'tu.? Qp "J?TE\H2($@'o{1? E:.p?TEG`o(@'ksS? x_nK0?TF4T5>@Z1@@'jy? 4޿ @L@?TFxs܊@'g9h? _OY?TG o(ߞ@'a8? ;͠{?TGpq,~?@@'_)@G? 9@F`?TGGbd$_4@'[? !;m@?TH>YmkDI@'Z;Ԫ? `>+g"?THQ76E]@'W8?]? mcCHp0?TIށF6 ~@@'T5cl? B&?TIi7~FMb@@'Pxr?뿓7?TI3/sVS@'Mĭ?&P?TJ)2s)wρx@'J݊/_?:Mlxe50?TJ c2r@'Du?l*4p?TJ┹93`a@@'C+?TZj?TK?]p ʰ1@@'BHc?H돥o?TK;ƀ@@'<?ՕvS >?TKf@'8MD6?H]|'t?TLINl`Ɔ^@'5L ?*x@?TL*Z5h *؀@'5Cš?ict/vi?TLGV`ÒB@'1&;l?ޣqƿZm1P?TMMIVNC ‹`r@'.?<ȹaԿ3@?TM$:@w`i2&@'+?Fοo1?TMZ{WC@@'&W|?Δ!C?TNC ܀4d@@'%=}?Lc1ο!?TNbuu@d@'!ms?$'迓 2@?TN}@v}@@'Ir&?Ş?TO.]πw74r@'}o?eWY4 i?TOz{/a`nP@'L9?Wߎ cNkK?TO(Jx̀LS@'z?AX!t?TPUs4 R@']D?WT!>?TPWq6K̀@'?U #?TP]V`_u@' ]W?$ϙ@?TP-+(L<@'(|=?ln%y ?TQ+.l(›@'??"u$R&0?TQo 4, 4@@'{=?m>忓&0WO?TQ?? 
j@&&?_7i& p?TQaZ߀@&J?mi&?TR4@`@@&'?b6'yj?TRt+ˀvA'@&?#N'4??TRMbn}[@& ?[>>s)`vљ?TRsOhP|?@&?H;)i>Ő?TS+6;`=@@&q9?LEH*ZP?TSg3t(ɀJ@&gxn?UD*{@?TSvzO@&ބVw?;}P:,`m̡?TSڜV @&ި͑s? vſ,?TTKKN` @& ~?4]^~,)?TTKMܭ- @&E?i-V ?TT- zҠI@&׵1$? [ѿ.@?TT#XcB/@& F?qj/B_?TTc&`j)@@& M?h8迓/5?TUIc`r@ @&j?AAC0< ?TUPL.41( _ZS@&x?MA41%e\?TUx/q@&U(?=YgM࿓2H!?TU`Q54̀@&ȷ&i?r'GV2|1@?TU;pO@6@&ŭF"?ǂlm32?"h7 tap?TW_D Ý&@&|1?"I7=7oT0?TW[!@Ĩ{@&b3?"C-8V;?TX,w `ŴEG@&?".uU8O@?TX6 i @ƫ$@@&??#[AxϿ:Ccp?TXWLmn[U@&[AcZ?#bZ:ɠ?TXwd〾N# ɺD@&OI?#]51\;{?TXg @-`3@@&wć?#{=?TXQ3 e@&k#?#B; vw?TX$&:`O@@&籢N?$#p<OP?TX#X ́p@&+2?$]> R?TY8 8Ψk@@& "R?$Ќ=?)N?TY!Bȯwc@&s?%xb=Y ?TY9&=Q`t@&62?%q8q@*p?TYS'䀾@h@@&j|?%?tB>Ø1>?TYk; dŀ@&F?%U@,Ҕ.?TYN_Bc ~@&#?&!W@~D?TY򒀾ĥr@&"?&@W:@N?TY@K'(E@&jW+?&Z.@pȝa?TY3E@ٿ)@&0?&oFg?L?TY& ĈC@&A?' A@H-?TY@6" ܵ@&P-?'69 B?TYi_~`1@&3v3?'t B3P?TZ ٗ񂺜ߝr@&qD?'~FCyVLl@?TZZxF?$@&"e?'F鿓ALT?TZ*ӍnNґ3@@&vT?(o8@ڿCxD?TZ;Zjd@:S@&͍?(b8ZBxyP?TZH(Z`Y`@@&(N?(mL=*DDP?TZW6QPKgCr@&3?(¿Eo?TZbáTG% `+WE@&Mr?(rD+?TZo(=꯳@&4K?)GmEiAi?TZy'4^ `tr@&P?TZX8jN`l5@&?+ڿJÖ`?TZ,jz@+|@@&0?+tK[^?TZ6_?𽭀<@& ?+qk迓L Lp?TZ{W@&~@?+K2 T?TZҮtyw @@&)?, { ILP8?TZ`/.! Ig@@&ql$?,%WL޹K p?TZ`E ` OW@&;?,P6,M.JM?TZ1F߳@&~K{?,YՔpҿNG5?TZ͢kE;@&?]7?,!gLjF?TZZ;U+;@&{"?-Y LݙIr?TZe`?.ROW ?TZL+Imh .\7@&y˒K?.'6D7Rߝ?TZAyGEg@/@@&vo/?/PR?TZ4JBG1§@&xO?.攳4S@?TZ'6?i3L@@&iI?/60R^ 0?TZ<@4@&'w?0&񵿓U??TY>@@&$??1(4V⨿@?TYS4iz`?@&&wQ?1XsX|U8/P?TY2|`A;T@@&"䧔?1Ood8WQH`?TYyN0hSBw@&'+s?1W_W C?TY,f..1 DH@&&h?1pSzWP\?TY}Պ,0@Eї@@&'a(?1xnWW4|?TYlz*XfGO@@&),?1aH DXai0?TY[fJ(H@&%5]?1"6XR ?TYG '[J]W@&)a^?1WXt,q ?TY6)9%j@K@&,.*?17+RX#`?TY#{*[$4Y@L%@@&( ?1xlX' ?TY[as"N(Ҁ@&,F!?188Y`?TX>݀! JOD@&(D4l?1ҰX^5?TX jpQb @&)?1֨{Y['?TXo`C4RL@&)RT?1+;ZU?TXI+7`TȀ@&*8YH?217Y?TXβWy@U@@&$3M?27V-0TZp+`?TXeހ+ V&g@@&]d&?11IXwWvp?TXھU,g@&˂ h?0+;?V~8?TX&-r`SƦ@&K8?0FJ/V|?TXvZvTc3@&?0Q4m:W?TX_=z`U@&&>?05AQVة?TXIoն}V@& \g?0jW&VSyP?TX2y6&/1@WEc@&x\~y?09U?TXD2 Y&@&t^?0Mti县W:?TXe Z@&hV?0FBZdҿVl@0?TW2O[ o@@&[AL?0V#V?TWyR [l@@&?0RUx|??TW-@\Ѣ@@&?0LҷV&?TWoD]6@@&+'?0/3bW4p?TW?c܀^ w@@&2?08D ɿW2@?TWl̚ `_r @@& H.`gu @@&?/캺V.X?TU_UЀAg2g@&,} ?/z~鿓UKP?TUAmD@0g`@&mr?/4VL?TU#G!7gL@&?/@lVQg?TU)׀Jg@&1?/`1Wy$I`?TTuMgI@&!a?/{,ؿW}Sq?TT̏D.P̀g;{@&X?/bOcV-6-?TT,Sr5g{@&ӟӢ?/hSK&Wt?TToV(2f ]@@&Y?/2uV4GZa?TTuܫY|۠erD@&]?/`US?TTW]v ea!B@&E?/DPTSL ?TT9; Y`6ḓ@&%WE?.eɿVIo0?TT9/d dU @&Sf?.ځW4}P?TSamgσ cz@&x ?.ЬV[?TSIkTc @&R|?.dKHVߥp?TS o@fObs^@&D?. ]V9h?TSIs.GҠaɬY@&=?.UGUp?TSUwlZ`Ǧ@&0?.c?TRt<'𝫐W @'J/?-wH޿S;?TRT?rrUh@' _?-cHCSzJ\p?TR5Ӟ-aSTs@''1*?,^iFSkM0?TRuhRV|@@'*?,::wRՙP?TQ6vMP˦$@'+?,URc+?TQ[P%Ϣ@K̈́@'3?,NSg`?TQH,@I@'9(?,4~l忓QO0[?TQf]E`HC@@'> v3?+ "?P0?TQIZΆ*F'?@'>N?+)ڏܿR_@?TQ,5܀Ӣ5>D@@'FZdg?+BR MP?TQƐ@AmT@'Hğ2?+ bQ&_ ,`?TPPE5" ?@'J94g?+ͮ<P? ?TP@c:!=C@'Q-uj?+pLQ?ΐ?TPm[@;?I@@'S ?+y! PA@?TP8tg@@'X?*fP#*H?TP}Q/D`6@qq@'[[?*prPp?TP`N@3v@'Y4?*VؿO'XT`?TP@et1yzp@'\.?1?*wiĿQ\D@?TP%3" W.g3@'b@?*UtPG* ?TP A} +@@'b?*;2!ѿPu?TOb)3E@'r?)NC:0?TOӊp-%@@'qIł?)}KrMw`?TOܱL6#(S@'soA.?)ur@oLݵtE?TO`"` =t@'z+p?)D=0MΕ ?TO~#5( m2@' r?)wοLn|?TOcrq-00|@'|?(K+NU'@P?TOGT/4 ٟ@c@'I~m?(L3]ڀ?TO+)ٚ:CD- @'G_?(/K#P?TOطT@;~ Հ@'8J?(h FJ뀮p?TN>ҀF@ [@'H0?(0/M\͐?TN]Lf"@<@'p|L?'1g׿J.r?TNR@@'˽?'YEK)-P?TN HYa@v @'gԖ[?'zϿJgP?TN_}`>-@'q/0?'P)8K: ?TNve 6@'h?'p`Iz ?TN\VlU,%@'1?&}J$&~bp?TNC޳rp@@@'1z?& ؎J?TN*FRxNxe@' ?&ws$I|0?TN!Ow@'zU?&fTKG>W/P?TMG񴀾>^@@' 3?&>DKHڕ~S?TMmv~y@@'4`?%mqyݿFҮ0?TMd%6\@@'; "?%N !F_W@?TMQw}7@@'^ƚ?%jRFŶC?TM怾9ء\@'iH?$G*E ʋ?TMgЀZW)@'ڄ/t?$F,?TMpڀf?`~@'m ?$`G8οEA?TMYEAD@'>?$> Fgp?TM. 
7@'Ꜭ;?#\rtEk9?TNKI7@|@'^?#ϿD~?TNV#OUw)@'H?#oNbKD&`/%?TN@\V8q F@@'?#(LFKC4]?TNcĀ\@lPր@'46?"qD嗀?TN:b+fL@(;-zP?"Ƙ0Bn䮠?TNi_-`̍,@(2x?"smAO e@?TN &o" Z @( %-?"Na{AW-(P?TNyȗ&v?`TBw@@(Qa?!QGB+?TNh|^"`N@(J?!QgA ?TNX/j HE@(6U?!vZ@V?TNG~B;+@( ?!@gH`?TN8 xQ ;Ў@(".H? 7?^?TN(\;}5Y?L@(._? C>t ?TNu€،.{n@@(/̗? X!><)ip?TN o@D'x@(9^?L䜘=*?TN30e` @(=a?|g<܉?TMx{q`z^@(AF1?WA R;0?TM%o `"π@(Fݼ?#K;r@?TM׹l1 @(L&b ;Up?TKƀY^.@(d0o?{9-¿9!`?TKM `c 'g@(fN`?%69F%?TKE\go m@@(o ?ՊHFg9  ?TK-5+n\&@V(@(r1?vXyH7+I@?TKv yμ^@@(|8?-c8=rD?TK \ |W`*@(י>?ڷC&8_`ip?TKy^Z@@(3U?؏7=M@?TKn]̴=@(Q?*q\.,7n-p?TKgꮀNt8@@(B?!ю5S?TK]T8mȩ@(?rl迓5pX?TKW+`F@@(q1?⿓5- ?TK@t>R@@(u٠?85JP?TK:y: Ь`@@(},?&A.4[?TK4k |@J@@(&?נ=f45?TK0v @@(?zR4;,P?TK+k<ĽנB@( 6s?-W@X%1yp?TK%1M`J@(?1MH@?TK#=ү@)l@($-k?az2/0?TK"=i`y#@(tV]?Yfq18HU?TK,`*@(# ?=074?TKS&50@@(/?Z1/9P?TK }s[@(FB?͎0^} ?TKeq?8`s@(9!?!zϿ/4S6?TKyF1| i'@@(1=@?9}djO-O`?TK3_tA@($ ?JB-lmP?TKfUQ  Ù@(ѕDܭ?DP-?TK&+۶ӠKdр@(?]L0,]+?TK=B*;@AWx@(f@?8,$?TK"Ti6@@(=q?HO,Bz@?TK&T%5T,4@@(@?RG+ogi?TK* A*, B![@);|?mo(^*?TK:΂1Y7@)$o8?D3g'aʀ?TK? 8cz +@)*? Iᅮ&~2x?TKFh?f 1l@)/|_?\o&l7?TKLE`о@)7:?8E%iދ@?TKTk%Ln20@)? U?$f65?TK\8S"y@8@)Ge?dC@$+?TKeܤY 6@)L'-?$8?TKoʀ`%Uf@)Ph\?0 t"R?TKwg't5@)W_C?;"?TK.mj[@)^[?Z!LM?TKsCU@)g"3_?~4 }hB?TKzf$+fÊ@)lӢ?&r?TKnj%=*@)qg;8? ޴?TK"r6`0@)}b$y? 5;¿2P?TKEbj@WGЀ@)7 _? "ֿo.0?TK&ʀR~ư@).ʀ? Y)M|?TK݀9@/&@)xe? $dnP?TKd4󠋐@By@)S? !#Ƿ?TL 3yA@)V? f;?TLPg^m@):)? KnYqU?TL0e!<B玀@)'@N? hH85 $p?TLBsN'x@)^Y? Eqt@?TLW61g F@)? F3]?TLlJ;m` @)W~?KbI4۰U?TLsO03kL@)ԓN?P0|>?TLF |@)%N??Y :EF?TLmtO'WР휊@)-\?1*/?TL;-%l ~@)*?XΐŢ?TLbyÜbݢ@)p?r#dX1;f@?TL4KEfY@)\+Y?R$*P?TMa&Z~`(Z_@)Oc?7ݿȇ?TM4~0ͭ T}@*Q ?=1 Hx?TMQ.Gj@* LV?١K!?TMoibd@ͼ@*>?yf4pbp?TMXɀ/]``]#@*5:;J?f̿M ?TM 1 쐥@* ~X?˿ L]$?TMkN`q;@*&^,?Y_2 lDwk?TMF-|Q|Հ@*+?< 2x`?TN-朅@1N@*3?U꠿ hp?TN2% T$@*M8+5?6|Ŀ >y?TNX< %\@*Tɣa? fd= ?TN|Z*]{ i1@*Z,&>?S+;߿`?TN)򞀾/-D@*d')?_NOڿv_r0?TN4G@*kc&{?QCR9guH&?TN1@9f`le`@*s[? u?TO- J>`@J0@*y&?bZ ?TOAC~@(Fx@*[F?;׍T]?TOlL HPwF@*Y?[(J%q~p?TOE[Ln@*k)`? 
mLS`?TO Q$꿚@*l?U#1M~?TO<_Vem꜋@*H?Ah]Ep4u?TP1@ZצּxP@*+D?w툰B?TPLs _ U&@@*мU?BRJ7ʨ?TP:c /@*Q?迒?TPN-4htt aqP@*ޘ.?#\~zroϦ?TP޴I@lq)@@*ĩa?]?TQ?~q.7P=vp@*:M?6̾U7Z?TQF `uH@霚p@*x57?[1G?TQ{`y~v@* +?4R /?TQ<}q`PV p@*;?濒aOj\?TQj@&*~@*s?z*˿߿+K?TRE5@*wtT`?t`?TRXܙ @+l ?ysQX@?TR&5@CС -`@+ )U?%vԿ~?TR`b~6@+ Tf?mIK?TS !L.EPez0@+s,g?%3Lp# f?TSEր8=9@+BJ?DT '?TSN k@y@+"P'e?γJi.N ?TSĺ).1M @++5f?򘣍АNC?TTc`f08/@+/AC>?A + ?TTEC@p@+9{?4п}?TYHGϻ4 H@+Ty=?)Ӌlޮ?TY8@JGq_0@+׮t?rf4* a?TY@hpARZp@+?$7Ͽܨ?TZO"QC)tǰ@+֒X?W2:BIR+?TZ(|@է^'@+5?K/ÿL?T[wՠ0㮷˼p@+ߨ ?:{ ?T[`0}s@,?三' X?T[`2CKw#0@,$?q{Z 4?T\nT8@,VdR?{D۪B?T\}Y׫U@,T?wFȫ(H?T\ݏk ۰5Ⳅ@,,W%?AW8^>xN?T]@-<gx@,7In?Q;aܿӧ?T]8B`̏`Lb@,I?lna'\k?T^RdRr@@,PY?H{Q4?T^ngݼ"P@,WE*M?@˯&Y?T^K=@,_B?in濒y1?T_>N@+`DzXÐ@,f@3+?F:dۿ y?T_@6YD1@,nEpydl-0.7.0/pydl/pydlutils/tests/t/yanny_data.json0000644000076500000240000000655012632466352022551 0ustar weaverstaff00000000000000{ "tempfile1.par": { "pairs": { "keyword1": "value1", "keyword2": "value2" }, "structures": { "MYSTRUCT0": { "dtype": [ ["ra", "f8"], ["dec", "f8"], ["mag", "f4", [5]], ["flags", "i4"], ["new_flag", "|S5"] ], "size": 4, "columns": { "ra": "double", "dec": "double", "mag": "float[5]", "flags": "int", "new_flag": "BOOLEAN" }, "data": { "ra": [10.0, 20.5, 30.75, 40.55], "dec": [-5.1234, -10.74832, 67.994523, 11.437281], "mag": [ [0.0, 1.0, 2.0, 3.0, 4.0], [5.1, 6.2, 7.3, 8.4, 9.5], [22.123, 23.95, 22.6657, 21.0286, 22.9876], [13.54126, 15.37456, 14.52647, 12.648640, 12.0218] ], "flags": [4, 16, 64, 264], "new_flag": ["FALSE", "TRUE", "TRUE", "FALSE"] } }, "MY_STATUS": { "dtype": [ ["timestamp", "i8"], ["state", "S10"] ], "size": 4, "columns": { "timestamp": "long", "state": "STATUS" }, "data": { "timestamp": [1382384327,1382384527,1382384727,1382384927], "state": ["SUCCESS","SUCCESS","FAILURE","INCOMPLETE"] } } }, "enums": { "new_flag": ["BOOLEAN", ["FALSE", "TRUE"]], "state": ["STATUS", ["FAILURE", "INCOMPLETE", "SUCCESS"]] } }, "test.par": { "pairs": { "mjd": "54579", "alpha": "beta gamma delta", "semicolon": "This pair contains a semicolon;" }, "structures": { "MYSTRUCT": { "dtype": [ ["mag", "23 in r-band)" maskbits TARGET 8 GALAXY_BRIGHT_CORE "Galaxy targets who fail all the surface brightness selection limits but have r-band fiber magnitudes brighter than 19" maskbits TARGET 9 ROSAT_A "ROSAT All-Sky Survey match, also a radio source" maskbits TARGET 10 ROSAT_B "ROSAT All-Sky Survey match, have SDSS colors of AGNs or quasars" maskbits TARGET 11 ROSAT_C "ROSAT All-Sky Survey match, fall in a broad intermediate category that includes stars that are bright, moderately blue, or both" maskbits TARGET 12 ROSAT_D "ROSAT All-Sky Survey match, are otherwise bright enough for SDSS spectroscopy" maskbits TARGET 13 STAR_BHB "blue horizontal-branch stars" maskbits TARGET 14 STAR_CARBON "dwarf and giant carbon stars" maskbits TARGET 15 STAR_BROWN_DWARF "brown dwarfs (note this sample is tiled)" maskbits TARGET 16 STAR_SUB_DWARF "low-luminosity subdwarfs" maskbits TARGET 17 STAR_CATY_VAR "cataclysmic variables" maskbits TARGET 18 STAR_RED_DWARF "red dwarfs" maskbits TARGET 19 STAR_WHITE_DWARF "hot white dwarfs" maskbits TARGET 20 SERENDIP_BLUE "lying outside the stellar locus in color space" maskbits TARGET 21 SERENDIP_FIRST "coincident with FIRST sources but fainter than the equivalent in quasar target selection (also includes non-PSF sources" maskbits TARGET 22 SERENDIP_RED "lying outside the stellar locus in color space" maskbits TARGET 23 SERENDIP_DISTANT "lying outside the stellar locus in color space" maskbits TARGET 24 SERENDIP_MANUAL 
"manual serendipity flag" maskbits TARGET 25 QSO_MAG_OUTLIER "Stellar outlier; too faint or too bright to be targeted" maskbits TARGET 26 GALAXY_RED_II "Luminous Red Galaxy target (Cut II criteria)" maskbits TARGET 27 ROSAT_E "ROSAT All-Sky Survey match, but too faint or too bright for SDSS spectroscopy" maskbits TARGET 28 STAR_PN "central stars of planetary nebulae" maskbits TARGET 29 QSO_REJECT "Object in explicitly excluded region of color space, therefore not targeted at QSO" maskbits TARGET 31 SOUTHERN_SURVEY "Set in primtarget if this is a special program target" maskalias TARGET PRIMTARGET "PRIMTARGET is a synonym for TARGET." maskalias TARGET LEGACY_TARGET1 "LEGACY_TARGET1 is a synonym for TARGET." # #------------------------------------------------------------------------------ masktype TTARGET 32 "Secondary target mask bits in SDSS-I, -II (for LEGACY_TARGET2, SPECIAL_TARGET2 or SECTARGET)." maskbits TTARGET 0 LIGHT_TRAP "hole drilled for bright star, to avoid scattered light" maskbits TTARGET 1 REDDEN_STD "reddening standard star" maskbits TTARGET 2 TEST_TARGET "a test target" maskbits TTARGET 3 QA "quality assurance target" maskbits TTARGET 4 SKY "sky target" maskbits TTARGET 5 SPECTROPHOTO_STD "spectrophotometry standard (typically an F-star)" maskbits TTARGET 6 GUIDE_STAR "guide star hole" maskbits TTARGET 7 BUNDLE_HOLE "fiber bundle hole" maskbits TTARGET 8 QUALITY_HOLE "hole drilled for plate shop quality measurements" maskbits TTARGET 9 HOT_STD "hot standard star" maskbits TTARGET 31 SOUTHERN_SURVEY "a segue or southern survey target" maskalias TTARGET SECTARGET "SECTARGET is identical to TTARGET." maskalias TTARGET LEGACY_TARGET2 "LEGACY_TARGET2 is identical to TTARGET." maskalias TTARGET SPECIAL_TARGET2 "SPECIAL_TARGET2 is identical to TTARGET." # #------------------------------------------------------------------------------ masktype ZWARNING 32 "Warnings for SDSS spectra." maskbits ZWARNING 0 SKY "sky fiber" maskbits ZWARNING 1 LITTLE_COVERAGE "too little wavelength coverage (WCOVERAGE < 0.18)" maskbits ZWARNING 2 SMALL_DELTA_CHI2 "chi-squared of best fit is too close to that of second best (<0.01 in reduced chi-sqaured)" maskbits ZWARNING 3 NEGATIVE_MODEL "synthetic spectrum is negative (only set for stars and QSOs)" maskbits ZWARNING 4 MANY_OUTLIERS "fraction of points more than 5 sigma away from best model is too large (>0.05)" maskbits ZWARNING 5 Z_FITLIMIT "chi-squared minimum at edge of the redshift fitting range (Z_ERR set to -1)" maskbits ZWARNING 6 NEGATIVE_EMISSION "a QSO line exhibits negative emission, triggered only in QSO spectra, if C_IV, C_III, Mg_II, H_beta, or H_alpha has LINEAREA + 3 * LINEAREA_ERR < 0" maskbits ZWARNING 7 UNPLUGGED "the fiber was unplugged, so no spectrum obtained" maskbits ZWARNING 8 BAD_TARGET "catastrophically bad targeting data (e.g. ASTROMBAD in CALIB_STATUS)" maskbits ZWARNING 9 NODATA "No data for this fiber, e.g. 
because spectrograph was broken during this exposure (ivar=0 for all pixels)" # #------------------------------------------------------------------------------ masktype FLUXMATCH_STATUS 16 "Flags from flux-based matching to SDSS photometry" maskbits FLUXMATCH_STATUS 0 ORIGINAL_FLUXMATCH "used the original positional match (which exists)" maskbits FLUXMATCH_STATUS 1 FIBER_FLUXMATCH "flagged due to fiberflux/aperflux issue" maskbits FLUXMATCH_STATUS 2 NONMATCH_FLUXMATCH "flagged due to non-match" maskbits FLUXMATCH_STATUS 3 NOPARENT_FLUXMATCH "no overlapping parent in primary field" maskbits FLUXMATCH_STATUS 4 PARENT_FLUXMATCH "overlapping parent has no children, so used it" maskbits FLUXMATCH_STATUS 5 BRIGHTEST_FLUXMATCH "picked the brightest child" # #------------------------------------------------------------------------------ #----------- boss target selection flags masktype BOSS_TARGET1 64 "BOSS survey primary target selection flags" # galaxies maskbits BOSS_TARGET1 0 GAL_LOZ "low-z lrgs" maskbits BOSS_TARGET1 1 GAL_CMASS "dperp > 0.55, color-mag cut " maskbits BOSS_TARGET1 2 GAL_CMASS_COMM "dperp > 0.55, commissioning color-mag cut" maskbits BOSS_TARGET1 3 GAL_CMASS_SPARSE "GAL_CMASS_COMM & (!GAL_CMASS) & (i < 19.9) sparsely sampled" maskbits BOSS_TARGET1 6 SDSS_KNOWN "Matches a known SDSS spectra" maskbits BOSS_TARGET1 7 GAL_CMASS_ALL "GAL_CMASS and the entire sparsely sampled region" maskbits BOSS_TARGET1 8 GAL_IFIBER2_FAINT "ifiber2 > 21.5, extinction corrected. Used after Nov 2010" # galaxies deprecated #maskbits BOSS_TARGET1 3 GAL_GRRED "red in g-r" #maskbits BOSS_TARGET1 4 GAL_TRIANGLE "GAL_HIZ and !GAL_CMASS" # part of galaxies deprecated, but some spectra actually have this bit set. maskbits BOSS_TARGET1 5 GAL_LODPERP_DEPRECATED "(DEPRECATED) Same as hiz but between dperp00 and dperp0" # qsos maskbits BOSS_TARGET1 10 QSO_CORE "restrictive qso selection: commissioning only" maskbits BOSS_TARGET1 11 QSO_BONUS "permissive qso selection: commissioning only" maskbits BOSS_TARGET1 12 QSO_KNOWN_MIDZ "known qso between [2.2,9.99]" maskbits BOSS_TARGET1 13 QSO_KNOWN_LOHIZ "known qso outside of miz range. 
never target" maskbits BOSS_TARGET1 14 QSO_NN "Neural Net that match to sweeps/pass cuts" maskbits BOSS_TARGET1 15 QSO_UKIDSS "UKIDSS stars that match sweeps/pass flag cuts" maskbits BOSS_TARGET1 16 QSO_KDE_COADD "kde targets from the stripe82 coadd " maskbits BOSS_TARGET1 17 QSO_LIKE "likelihood method" maskbits BOSS_TARGET1 18 QSO_FIRST_BOSS "FIRST radio match" maskbits BOSS_TARGET1 19 QSO_KDE "selected by kde+chi2" maskbits BOSS_TARGET1 40 QSO_CORE_MAIN "Main survey core sample" maskbits BOSS_TARGET1 41 QSO_BONUS_MAIN "Main survey bonus sample" maskbits BOSS_TARGET1 42 QSO_CORE_ED "Extreme Deconvolution in Core" maskbits BOSS_TARGET1 43 QSO_CORE_LIKE "Likelihood that make it into core" maskbits BOSS_TARGET1 44 QSO_KNOWN_SUPPZ "known qso between [1.8,2.15]" # standards maskbits BOSS_TARGET1 20 STD_FSTAR "standard f-stars" maskbits BOSS_TARGET1 21 STD_WD "white dwarfs" maskbits BOSS_TARGET1 22 STD_QSO "qso" # template stars maskbits BOSS_TARGET1 32 TEMPLATE_GAL_PHOTO "galaxy templates " maskbits BOSS_TARGET1 33 TEMPLATE_QSO_SDSS1 "QSO templates " maskbits BOSS_TARGET1 34 TEMPLATE_STAR_PHOTO "stellar templates " maskbits BOSS_TARGET1 35 TEMPLATE_STAR_SPECTRO "stellar templates (spectroscopically known)" # #------------------------------------------------------------------------------ #------------ ancillary target selection flags masktype ANCILLARY_TARGET1 64 "BOSS survey target flags for ancillary programs" maskbits ANCILLARY_TARGET1 0 AMC "defined in blake_boss_v2.descr" maskbits ANCILLARY_TARGET1 1 FLARE1 "defined in blake_boss_v2.descr" maskbits ANCILLARY_TARGET1 2 FLARE2 "defined in blake_boss_v2.descr" maskbits ANCILLARY_TARGET1 3 HPM "defined in blake_boss_v2.descr" maskbits ANCILLARY_TARGET1 4 LOW_MET "defined in blake_boss_v2.descr" maskbits ANCILLARY_TARGET1 5 VARS "defined in blake_boss_v2.descr" maskbits ANCILLARY_TARGET1 6 BLAZGVAR "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 7 BLAZR "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 8 BLAZXR "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 9 BLAZXRSAM "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 10 BLAZXRVAR "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 11 XMMBRIGHT "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 12 XMMGRIZ "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 13 XMMHR "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 14 XMMRED "defined in brandtxmm-andersonblazar-merged.descr " maskbits ANCILLARY_TARGET1 15 FBQSBAL "defined in master-BAL-targets.descr" maskbits ANCILLARY_TARGET1 16 LBQSBAL "defined in master-BAL-targets.descr" maskbits ANCILLARY_TARGET1 17 ODDBAL "defined in master-BAL-targets.descr" maskbits ANCILLARY_TARGET1 18 OTBAL "defined in master-BAL-targets.descr" maskbits ANCILLARY_TARGET1 19 PREVBAL "defined in master-BAL-targets.descr" maskbits ANCILLARY_TARGET1 20 VARBAL "defined in master-BAL-targets.descr" maskbits ANCILLARY_TARGET1 21 BRIGHTGAL "defined in bright_gal_v3.descr" maskbits ANCILLARY_TARGET1 22 QSO_AAL "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 23 QSO_AALS "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 24 QSO_IAL "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 25 QSO_RADIO "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 26 QSO_RADIO_AAL "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 27 
QSO_RADIO_IAL "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 28 QSO_NOAALS "defined in qsoals_v2.descr " maskbits ANCILLARY_TARGET1 29 QSO_GRI "defined in sdss3_fan.descr " maskbits ANCILLARY_TARGET1 30 QSO_HIZ "defined in sdss3_fan.descr " maskbits ANCILLARY_TARGET1 31 QSO_RIZ "defined in sdss3_fan.descr " maskbits ANCILLARY_TARGET1 32 RQSS_SF "defined in rqss090630.descr" maskbits ANCILLARY_TARGET1 33 RQSS_SFC "defined in rqss090630.descr" maskbits ANCILLARY_TARGET1 34 RQSS_STM "defined in rqss090630.descr" maskbits ANCILLARY_TARGET1 35 RQSS_STMC "defined in rqss090630.descr" maskbits ANCILLARY_TARGET1 36 SN_GAL1 "defined in ancillary_supernova_hosts_v5.descr" maskbits ANCILLARY_TARGET1 37 SN_GAL2 "defined in ancillary_supernova_hosts_v5.descr" maskbits ANCILLARY_TARGET1 38 SN_GAL3 "defined in ancillary_supernova_hosts_v5.descr" maskbits ANCILLARY_TARGET1 39 SN_LOC "defined in ancillary_supernova_hosts_v5.descr" maskbits ANCILLARY_TARGET1 40 SPEC_SN "defined in ancillary_supernova_hosts_v5.descr" maskbits ANCILLARY_TARGET1 41 SPOKE "defined in BOSS_slowpokes_v2.descr" maskbits ANCILLARY_TARGET1 42 WHITEDWARF_NEW "defined in WDv5_eisenste_fixed.descr" maskbits ANCILLARY_TARGET1 43 WHITEDWARF_SDSS "defined in WDv5_eisenste_fixed.descr" maskbits ANCILLARY_TARGET1 44 BRIGHTERL "defined in sd3targets_final.descr" maskbits ANCILLARY_TARGET1 45 BRIGHTERM "defined in sd3targets_final.descr" maskbits ANCILLARY_TARGET1 46 FAINTERL "defined in sd3targets_final.descr" maskbits ANCILLARY_TARGET1 47 FAINTERM "defined in sd3targets_final.descr" maskbits ANCILLARY_TARGET1 48 RED_KG "defined in redkg.descr" maskbits ANCILLARY_TARGET1 49 RVTEST "defined in redkg.descr" maskbits ANCILLARY_TARGET1 50 BLAZGRFLAT "defined in anderson-blazar.par" maskbits ANCILLARY_TARGET1 51 BLAZGRQSO "defined in anderson-blazar.par " maskbits ANCILLARY_TARGET1 52 BLAZGX "defined in anderson-blazar.par" maskbits ANCILLARY_TARGET1 53 BLAZGXQSO "defined in anderson-blazar.par" maskbits ANCILLARY_TARGET1 54 BLAZGXR "defined in anderson-blazar.par" #maskbits ANCILLARY_TARGET1 55 BLAZXR "defined in anderson-blazar.par" maskbits ANCILLARY_TARGET1 56 BLUE_RADIO "defined in tremonti-blue-radio.fits.gz" maskbits ANCILLARY_TARGET1 57 CHANDRAV1 "defined in haggard-sf-accrete.fits" maskbits ANCILLARY_TARGET1 58 CXOBRIGHT "defined in brandt-xray.par" maskbits ANCILLARY_TARGET1 59 CXOGRIZ "defined in brandt-xray.par" maskbits ANCILLARY_TARGET1 60 CXORED "defined in brandt-xray.par" maskbits ANCILLARY_TARGET1 61 ELG "defined in kneib-cfht-elg.fits" maskbits ANCILLARY_TARGET1 62 GAL_NEAR_QSO "defined in weiner-qso-sightline.fits" maskbits ANCILLARY_TARGET1 63 MTEMP "defined in blake-transient-v3.fits" pydl-0.7.0/pydl/pydlutils/tests/t/polygon_no_id.fits0000644000076500000240000002640012732773405023252 0ustar weaverstaff00000000000000SIMPLE = T /Primary Header created by MWRFITS v1.11 BITPIX = 16 / NAXIS = 0 / EXTEND = T /Extensions may be present DATE = 'Fri Jun 10 21:24:31 2016' /Time of creation of polygon fits file IDLUTILS= 'v5_5_5 ' /Version of idlutils used END XTENSION= 'BINTABLE' /Binary table written by MWRFITS v1.11 BITPIX = 8 /Required value NAXIS = 2 /Required value NAXIS1 = 156 /Number of bytes per row NAXIS2 = 1 /Number of rows PCOUNT = 0 /Normally 0 (no varying arrays) GCOUNT = 1 /Required value TFIELDS = 7 /Number of columns in table COMMENT COMMENT *** End of mandatory fields *** COMMENT COMMENT COMMENT *** Column names *** COMMENT TTYPE1 = 'XCAPS ' / TTYPE2 = 'CMCAPS ' / TTYPE3 = 'NCAPS ' / TTYPE4 = 'WEIGHT ' / 
TTYPE5 = 'PIXEL ' / TTYPE6 = 'STR ' / TTYPE7 = 'USE_CAPS' / COMMENT COMMENT *** Column formats *** COMMENT TFORM1 = '12D ' / TFORM2 = '4D ' / TFORM3 = 'J ' / TFORM4 = 'D ' / TFORM5 = 'J ' / TFORM6 = 'D ' / TFORM7 = 'J ' / COMMENT COMMENT *** Column dimensions (2 D or greater) *** COMMENT TDIM1 = '( 3, 4) ' / COMMENT COMMENT *** Unsigned integer column scalings * COMMENT TSCAL7 = 1 / TZERO7 = 2147483648 / END ?? +?<&3\Oָ€?Ӵ<&3\?q6(???spydl-0.7.0/pydl/pydlutils/tests/t/sdss_traceset.fits0000644000076500000240000005500012632466352023256 0ustar weaverstaff00000000000000SIMPLE = T / conforms to FITS standard BITPIX = 8 / array data type NAXIS = 0 / number of array dimensions EXTEND = T END XTENSION= 'BINTABLE' /Binary table written by MWRFITS v1.4a BITPIX = 8 /Required value NAXIS = 2 /Required value NAXIS1 = 12824 /Number of bytes per row NAXIS2 = 1 /Number of rows PCOUNT = 0 /Normally 0 (no varying arrays) GCOUNT = 1 /Required value TFIELDS = 4 /Number of columns in table COMMENT COMMENT *** End of mandatory fields *** COMMENT COMMENT COMMENT *** End of mandatory fields *** COMMENT COMMENT COMMENT *** Column names *** COMMENT COMMENT COMMENT *** Column names *** COMMENT TTYPE1 = 'FUNC ' / TTYPE2 = 'XMIN ' / TTYPE3 = 'XMAX ' / TTYPE4 = 'COEFF ' / COMMENT COMMENT *** Column formats *** COMMENT COMMENT COMMENT *** Column formats *** COMMENT TFORM1 = '8A ' / TFORM2 = 'D ' / TFORM3 = 'D ' / TFORM4 = '1600D ' / COMMENT COMMENT *** Column dimensions (2 D or greater) *** COMMENT COMMENT COMMENT *** Column dimensions (2 D or greater) *** COMMENT TDIM4 = '( 5, 320)' / END legendre@@ [M&!pР?86[ ?e@ T, kpo'o&?8 p?@ M%|j濺?Vǡ\@ 0匿t!px}X?9=* ?%Μ@ #"zڶp"i?9 D?o@ d0pU\R8?9Md?@ SZzp_*0?9? 4@ R忺dNEp?9"_?&N@ Vjp6?9 @?-1@ |]kI#ƿpp?9&&D?'@ t%uĿgpAd ?9+}?1bFk@ m[M0 pc(i?90x?:gٜ@ db9dۿpd?95AH?D;x@ `c=AC[p>8?9;x?MH㌼@ St7břZp+?9@]jLj?V`(@ Pƒw̿p0#?9Er?_Y@ JB=Eo3p@ﭐ?9Jp3?h~@ {Mq YJr?9?ʁ$@ L.46q X?91~?"N@ ܝL߿m4}q Q ?9;QXL?t T@ ~ݼQ9SnqL2@?9fn?-W@ ͛D%qE?9#D?@ T̿n'/q>k?9?T@ MPÒ q*VY?9~ܚ? z@ }fθzf(qO`?9y=O?@ YHʝBqNd?9Mx?±@ IiY|=qk?9&id?sel@ p%.k޹qpH?9 X?!@ aeC1/@qq(?9ğ?'4@ ֖ӿѕпqTBj?9Y+,?- X@ z'~跿qلÂ?9.H?4au@ Z@2qZ/?9C?9Dz"@ (Tq/H?9LZ??N@ ng&He~3queH?9v?DC"\@ xKJqSI?9??Ipgu|@ wR]^J\Oaq,t_|?9ס ?NlY@ sޣ{,ƿqCDq?9?SPi@ m}uևo̿q*?9A?X@ e"tra1qF(?9ጯ?\ҋp@ ckF]b&q|Z=/(?9佇L?acޱ@ ^ZbK߿M"q QB?9P4?f D@ [ أHq!?9?jY@ W=Mѿq!)P?9%8?n@ P ٖ.q"X4?9%&?r?7@ I4 Jq#vޝ(?9"QX?vN@ @kwN?zL$@ >g Og0a^0q$1@8?9.?~ L@ =y%ۃӿq%?9?I @ 6Jڑk=Կq&}֤?959?^~@ 0ܰq'4tK?:@9?v4@ /FdO.+q($<0?:0?kܭH@ -n俺sPq(BШ?:L?\@ )<䐄q)?: ??t@ (J@ʿނMmq*BNh?: q?L @ .^hzq*?:?u7@ !|oc3[q+Խ?:lm?5@ &߶0q,T?:?}'t@ ߃%) q,L?:68?TCl@ 1@ࢌq- 8?:FX?}@?:~:?@ Jbq.>(?:&߈?NÐ@ (Ŀ{q/)?: C0? @ !!Rq0@ 騿Nq4l.ɀ?:0&ɤ? ,@ {ٿP\q50?:2Fq?oY\@ I{q5v˗?:4@1Ԃ?u@ ` c꿺Bտq6J|ohq8h#l?:E@ ٨ wٿwۿq8l?:=?p>C@ zcADq9k&X?:?$?͕W`@ >Uȿq9hh?:@?Σ\@ tNAq:i!>?:A[0N?ϥe@ ٘oֿq%hq:?h?:C&?Ї:@ вb̅;q;]{p?:Dq~?cXx@ <5h;q;d?:Er?";@@ ˇ5].5qQ0?:K sn?>@ PVhĿhq>nY?:Ks??v@ Y"mؿq?5lQ?:MAY?՟z@ W'_QRvTq?e?:N|?{@ 땒]Oq??:NY?+@ "nu:pc q@*!L?:OL?t@ pXPqq@v4/h?:Pdz9?}<@ 4)@Gq@&?:Q\?1)@ ")&eXǿqAD?:Q2?վ@ qevqAiנ?:RQ*?Չ@ 3࿺Ο-qA\U?:Rޒ0?F+/4@ hrqBK?`?:S`R??n@ G~qBS(?:S?Ԍd@ Ic2qBk7?:TI"1?:@ @~`'uqC@?`?:Tm?ӕ@ ab9?qCҐ?:U!?JQ@ T-m4aKqCغ?:UO5?cm@ W qqD!Ȩ?:U?ѳ0@ E6 fqDh;(H?:Uǝ^?$@ PP]vqDX?:U?+H@ 俿yWqD?:V?Ho@ M`P]ޓqE0lp?:V/޴?g@ 2R8Y`qEQ?:V=?)<@ [!忺!b1PqE ?:V>\?s@ ±#qFB ?:V2rs?ڡ@ (6qF9ۆ?:VW.?ɶ䘤@ =!hqqFoBL?:UGk?ȁ_@ F)>AqF=P?:Ud?=\@ ^ϿvqFՇ-?:UT<?@ qG+Qx?:U[)b?Ė?@ b|z ZqG3 ?:U?.@ %y,*qG_wc?:T?!e@ ۛ񽿺ytoqG?:T\T?5f@ p+e]mqGqj3?:SJ?qX@ L꿺NPPqG@?:S{;P? 
k@ 2E俺KڿqG?:R,?a@ )pyoqH.?:Rt) ?@ BP2(qH>B?:Q@]?y}p@ e׻οqH\X?:Q<*@?'0@ 4/qHxH?:P$2?M?\@ HʇBqH"&p?:O4?kC3@ S_i^\qHRJ ?:O /7?nЬ@ ;FS/.ҿqHW:y?:N'?@  >sqHrY?:MT3?Q@ uӊ燿qHҳ?:LrD?w@ cɡHqH's@?:Kr? @ ӽnOEqI+6X?:J?m@ / ?35(@ RmݐqI -?:@B[p?@ H0VqI?:>M?Af<@ :=}qIg# ?:=8fP?q@ )̿=7UqI b7?:@@ *1h^qE+ffh?: ,?=@ 2;^uYqC0?9nk?*ai`@ MPֿ抷ZqC3K_K?9j?' l@ CO[ sѿqB6\ ?9 p'?#<@ Lr%VG 3'qBSƸ?9ILq? YEl@ VjeTqB<lh?9K?Cb@ W꿺wJqAi8?9?7@ X?|:V ҏqA 㝸@ a qA2P?9?P`@ ]U0fq@ud?9L?t@ dAW[ &q@xjt?9? L@ jO:|q@g\?94?d@ kPGq?S?98H?8f@ hB&OP\q?MH?9?đ@ fXv.@eq>!?9ԹH? vT@ tO"qEm2q>S?9m]?ll@ wV:%4N¿q=;@?9@ 1yߕ&qh5nԿq4T?`?9w!l?ޔ@ I%zEa+q4/3((?9?֤@ :OC[ q3ߘ?9)}?Q@  @q3j?9?x1|@ 3Nq2j?9}2m1?\H@ 4jL \q1h?9yh?k@ ;ڿ`Xq1,?9u?ح@ y}Puq0{`?9qx`?U@ bd ըYfxq/?9mDw?8t@@ :ῺgcwOq/A?9i}H?S@ |N⿺q.k^x?9d`-?4P$@ &ؗqvpBq-U?9`?r| @ 0y7bq-CrVx?9\D4o?j@ 3迺XI*q,G2?9X l?^8|@ :n۲FAq+ ?9S?w(@ E86cq+6г?9OB?Ѩ @ J%pJўNq*tX* ?9J?}@ Td_4q)8m?9FVǡY?~SR@ XTLnG|q)r87?9A Ɛ?{@ `̸Ϯi|q(W{(?9=FZ?wp@ v5jpd@q'U,9(?97?si@ -6ؿ9q&; h?92Q?p^|p@ GͿOq%ϛ?9-rq?mCV@ *l̜̿+"ؿq%'f~X?9) ?j-=(@ ȿ30q$GUp?9$Fs?gg@ wU_9q#{7X?9nZ?c4@ ys[ ʺq"P"z?9Β?`E0@ Fsq!k"?9?]f{D@ 8iK,ڿq! |?9]?ZP(@ CV́@q HMA`?9 ?W@ ȲXywQqp?9=m?T@ ䷿K}qЧ?9b/G?R@ ֎)ƫ頿q~)?8=?O$/8@ 1[W忺Ȉ+q%Mb?8ܙ?L<<@ L3\Ol`qݠ?86. ?I{@ wƿĩ q0H?8Ժl?FT@ M+KqQiX?8 ?Cܢq`@  濺=Lܿqp)?8㛃zH?A-@ GRl|jqT(8?8J =?>W"L@ T CgbqwE?8)m?;5@ 4cf/Lſq>dt(?8ұP?8yt8@ >O|v4q?8i$M?5\/@ CŸEiHhq/H?8.ّ?3Xn|@ GVV~jpԿqȥE?8Rb?0̓@ R=e(UſqW??8\y?.CF<@ Y%Ὼ37q+8?8we?+`@ d29BxXqd?8IK?)D@ j_j񰼓qm?8a?&O@ o 'RÿqP?89&?${@ |`^玈kq L?8e+'?""^p@ M`6k'Bq 0?8*r?2=$@ Asq {Ҁ?87??@ I^-i#q ?8?Nۢ@ ?q T?8?3Y,@ w 򿺶+>sqi]?8 mG?d@ JROVOq$?8Yj?)@ $8T#zqe ?8y|Z?ؙ@ ڦ2=mqJJh?8sf?p(@ I&AqJY?8mz?cp@ *8جq{t?8gi%? @ 㤓ƿ#q$0?8_ M? brr@ q4wqLJp?8YΉw?@ R4[$-p?8S?ʮX@ %_&ԿU8:pײP?8M?@ '.@F#(p?8GmG?U@ -ؿ;$psy0?8AZL?Y@  ~Tpm{?8;V{?P;@ ..?eտpME?85?@ 5Geҟ1Sp#L2@?8.N?c@ ?,HǿpFߐ?8('}?7(@ K+좿B%pKM?8"V52?E)g@ VmšQ֢pna8?88 b?ECx@ d}v[gpkH`?8\u?u0@ hr(p:{P?8͜?kX@ p"mp͐?8\4?zgt? 1M?%?1MH?ʧ?f)?7$;?1/ ?iY<U?Oָ€ӴX {g??;;?>^`Ŕ?,GU?9˨>zg?Oe ?榁r>zTͅ?1MH?ʧ?f)?7$;?1/ ?iY<U???>?Oe ?榁r>zTͅ?Zс?Pz'>y-t? 1M?%?1MH?ʧ?f)?7$;?1/ ?iY<U??lh>??>ǥ?Zс?Pz'>y-t? 1M?%?7$;?1/ ?iY<U??lh>ݿ?>"vL`?>#?P.:>{$Nt? 1M?%?1MH?ʧ?f)?7$;?1/ ?iY<U?Oָ€Ӵ\??;;?>a3?>#?P.:>{$N? Q?b->{J?1MH?ʧ?f)?7$;?1/ ?iY<U???>ve? Q?b->{J׾t? 1M?%?1MH?ʧ?f)?7$;?1/ ?iY<U??ﮎ??>#LnF? mF_9_˴@<?1MH?ʧ?f)?b@?K#?vy^?;?>Zۨ)?&M5 꿣7Pd?b@?K#?vy^?1MH?ʧ?f)? mF_9_˴@<;???> K€?4tOtG poW?b@?K#?vy^?1MH?ʧ?f)?&M5 꿣7Pd;???>/Rd?6DqaF(IjջB?b@?K#?vy^?1MH?ʧ?f)?4tOtG poW;???> ^Ӏ?+]_W/ ?1MH?ʧ?f)?7$;?1/ ?iY<U?6DqaF(IjջB?b@?K#?vy^??;?>ʮ_?6a̿*4L?1MH?ʧ?f)?7$;?1/ ?iY<U?+]_W/ ???>xUzG?r+ԂVMU7NP|?1MH?ʧ?f)?7$;?1/ ?iY<U?6a̿*4L???>tJ?싾)m-?+.Ke_ ?1MH?ʧ?f)?7$;?1/ ?iY<U?r+ԂVMU7NP|???>ҐMBѼ?싀t]f5 (Ro#?1MH?ʧ?f)?7$;?1/ ?iY<U?싾)m-?+.Ke_ ???>1&?5H[ 񌶥1m?1MH?ʧ?f)?7$;?1/ ?iY<U?싀t]f5 (Ro#???>Ǐtn?ݰC{Q]p2%?1MH?ʧ?f)?7$;?1/ ?iY<U?5H[ 񌶥1m???>{p ~?ySS>gyA?ꃐ/?PJ?v~{0?1MH?ʧ?f)?7$;?1/ ?iY<U?ݰC{Q]p2%?KeFq???>I,f?ySS>gyA?ꃐ/?PJ?v~{0?7$;?1/ ?iY<U?1Nſ3? ??KeFq?>.npydl-0.7.0/pydl/pydlutils/tests/t/test_table.par0000644000076500000240000000021412664755673022364 0ustar weaverstaff00000000000000#%yanny # Table name first table typedef struct { long a; double b; char c[1]; } TEST; TEST 1 2.0 x TEST 4 5.0 y TEST 5 8.2 z pydl-0.7.0/pydl/pydlutils/tests/t/median_data.txt0000644000076500000240000000511612632466352022513 0ustar weaverstaff00000000000000# Data for testing the djs_median function. 
49.68311576, 74.83671757, 49.68311576, 42.12573137, 32.11576752, 27.22092569, 15.08905903, 15.08905903, 27.22092569, 20.07809411, 41.55978953, 41.55978953, 35.01184343, 35.01184343, 38.7967727, 38.7967727, 38.7967727, 38.7967727, 38.7967727, 28.85780425, 28.85780425, 30.23801035, 30.23801035, 27.0687078, 30.23801035, 64.77058052, 29.42407045, 38.51749618, 62.48793217, 38.51749618, 29.42407045, 38.51749618, 18.46919414, 18.46919414, 42.72395431, 54.11584807, 54.11584807, 75.67593452, 75.67593452, 72.4380356, 61.68761955, 61.68761955, 45.36796557, 45.36796557, 61.68761955, 76.57621138, 76.57621138, 76.57621138, 23.28589488, 23.28589488, 13.82755909, 12.10607597, 13.82755909, 27.51891089, 44.21266068, 44.21266068, 44.21266068, 47.18083025, 47.18083025, 47.18083025, 47.18083025, 47.18083025, 35.50809933, 35.50809933, 25.52222293, 25.52222293, 67.8568479, 88.54983822, 67.8568479, 93.40053148, 93.40053148, 64.12514945, 47.82074715, 47.82074715, 47.82074715, 34.82234113, 34.82234113, 52.91092248, 78.51075522, 92.16442338, 92.16442338, 92.16442338, 72.07989558, 72.07989558, 68.72579501, 72.07989558, 72.07989558, 70.43134908, 34.55273356, 62.09010468, 62.09010468, 70.43134908, 68.89705132, 68.89705132, 68.89705132, 66.30426084, 55.92748086, 55.92748086, 55.92748086, 65.11387444 # 49.68311576, 49.68311576, 49.68311576, 42.12573137, 32.11576752, 27.22092569, 15.08905903, 15.08905903, 27.22092569, 20.07809411, 41.55978953, 41.55978953, 35.01184343, 35.01184343, 38.7967727, 38.7967727, 38.7967727, 38.7967727, 38.7967727, 28.85780425, 28.85780425, 30.23801035, 30.23801035, 27.0687078, 30.23801035, 64.77058052, 29.42407045, 38.51749618, 62.48793217, 38.51749618, 29.42407045, 38.51749618, 18.46919414, 18.46919414, 42.72395431, 54.11584807, 54.11584807, 75.67593452, 75.67593452, 72.4380356, 61.68761955, 61.68761955, 45.36796557, 45.36796557, 61.68761955, 76.57621138, 76.57621138, 76.57621138, 23.28589488, 23.28589488, 13.82755909, 12.10607597, 13.82755909, 27.51891089, 44.21266068, 44.21266068, 44.21266068, 47.18083025, 47.18083025, 47.18083025, 47.18083025, 47.18083025, 35.50809933, 35.50809933, 25.52222293, 25.52222293, 67.8568479, 88.54983822, 67.8568479, 93.40053148, 93.40053148, 64.12514945, 47.82074715, 47.82074715, 47.82074715, 34.82234113, 34.82234113, 52.91092248, 78.51075522, 92.16442338, 92.16442338, 92.16442338, 72.07989558, 72.07989558, 68.72579501, 72.07989558, 72.07989558, 70.43134908, 34.55273356, 62.09010468, 62.09010468, 70.43134908, 68.89705132, 68.89705132, 68.89705132, 55.92748086, 65.11387444, 55.92748086, 55.92748086, 55.92748086 pydl-0.7.0/pydl/pydlutils/tests/t/polygon_one_cap.fits0000644000076500000240000002640012733017424023557 0ustar weaverstaff00000000000000SIMPLE = T /Primary Header created by MWRFITS v1.11 BITPIX = 16 / NAXIS = 0 / EXTEND = T /Extensions may be present DATE = 'Sat Jun 11 12:40:21 2016' /Time of creation of polygon fits file IDLUTILS= 'v5_5_5 ' /Version of idlutils used END XTENSION= 'BINTABLE' /Binary table written by MWRFITS v1.11 BITPIX = 8 /Required value NAXIS = 2 /Required value NAXIS1 = 60 /Number of bytes per row NAXIS2 = 1 /Number of rows PCOUNT = 0 /Normally 0 (no varying arrays) GCOUNT = 1 /Required value TFIELDS = 7 /Number of columns in table COMMENT COMMENT *** End of mandatory fields *** COMMENT COMMENT COMMENT *** Column names *** COMMENT TTYPE1 = 'XCAPS ' / TTYPE2 = 'CMCAPS ' / TTYPE3 = 'NCAPS ' / TTYPE4 = 'WEIGHT ' / TTYPE5 = 'PIXEL ' / TTYPE6 = 'STR ' / TTYPE7 = 'USE_CAPS' / COMMENT COMMENT *** Column formats *** COMMENT TFORM1 = '3D ' / 
TFORM2 = 'D ' / TFORM3 = 'J ' / TFORM4 = 'D ' / TFORM5 = 'J ' / TFORM6 = 'D ' / TFORM7 = 'J ' / COMMENT COMMENT *** Unsigned integer column scalings * COMMENT TSCAL7 = 1 / TZERO7 = 2147483648 / END ?@ ?:~ s?Oָ€?#`??O[U5bpydl-0.7.0/pydl/pydlutils/tests/test_spheregroup.py0000644000076500000240000000767413064055440023240 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import raises, catch_warnings from astropy.utils.data import get_pkg_data_filename from ..spheregroup import spheregroup, spherematch from .. import PydlutilsException, PydlutilsUserWarning class TestSpheregroup(object): """Test the functions in pydl.pydlutils.spheregroup. """ def setup(self): pass def teardown(self): pass def test_spheregroup(self): test_data_file = get_pkg_data_filename('t/spheregroup_data.txt') test_data = np.loadtxt(test_data_file, dtype='d', delimiter=',') # np.random.seed(137) # Ngroup = 3 # N = 50 # spread = 0.05 linklength = 3.0 # degrees # x0 = np.concatenate( ( np.random.normal(loc=1,scale=spread,size=(N,)), # np.random.normal(loc=-1,scale=spread,size=(N,)), # np.random.normal(loc=1,scale=spread,size=(N,)), # )).reshape((N*Ngroup,)) # y0 = np.concatenate( ( np.random.normal(loc=1,scale=spread,size=(N,)), # np.random.normal(loc=-1,scale=spread,size=(N,)), # np.random.normal(loc=-1,scale=spread,size=(N,)), # )).reshape((N*Ngroup,)) # z0 = np.concatenate( ( np.random.normal(loc=1,scale=spread,size=(N,)), # np.random.normal(loc=-1,scale=spread,size=(N,)), # np.random.normal(loc=0,scale=spread,size=(N,)), # )).reshape((N*Ngroup,)) # foo = np.arange(N*Ngroup) # np.random.shuffle(foo) # x = x0[foo] # y = y0[foo] # z = z0[foo] # r = np.sqrt(x**2 + y**2 + z**2) # theta = np.degrees(np.arccos(z/r)) # phi = np.degrees(np.arctan2(y,x)) # ra = np.where(phi < 0, phi + 360.0,phi) # dec = 90.0 - theta # group = spheregroup(ra,dec,linklength) # # Reproduce IDL results # ra = test_data[0, :] dec = test_data[1, :] expected_ingroup = test_data[2, :].astype(np.int64) expected_multgroup = test_data[3, :].astype(np.int64) expected_firstgroup = test_data[4, :].astype(np.int64) expected_nextgroup = test_data[5, :].astype(np.int64) group = spheregroup(ra, dec, linklength) assert (group[0] == expected_ingroup).all() assert (group[1] == expected_multgroup).all() assert (group[2] == expected_firstgroup).all() assert (group[3] == expected_nextgroup).all() # # Exceptions # with raises(PydlutilsException): group = spheregroup(np.array([137.0]), np.array([55.0]), linklength) # # warnings # with catch_warnings(PydlutilsUserWarning) as w: group = spheregroup(ra, dec, linklength, chunksize=linklength) # w = recwarn.pop(PydlutilsUserWarning) assert "chunksize changed to" in str(w[0].message) def test_spherematch(self): i1_should_be = np.array([17, 0, 2, 16, 12, 13, 1, 5, 15, 7, 19, 8, 11, 10, 14, 18, 3, 9, 6, 4]) i2_should_be = np.array([2, 0, 17, 3, 16, 15, 14, 5, 6, 10, 8, 19, 18, 4, 9, 7, 11, 12, 1, 13]) np.random.seed(137) searchrad = 3.0/3600.0 n = 20 ra1 = 360.0*np.random.random((n,)) dec1 = 90.0 - np.rad2deg(np.arccos(2.0*np.random.random((n,)) - 1.0)) ra2 = ra1 + np.random.normal(0, 1.0/3600.0) dec2 = dec1 + np.random.normal(0, 1.0/3600.0) foo = np.arange(n) np.random.shuffle(foo) i1, i2, d12 = spherematch(ra1, dec1, ra2[foo], dec2[foo], searchrad, maxmatch=0) assert (i1 == i1_should_be).all() assert (i2 == i2_should_be).all() 
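
# Usage note (illustrative sketch based on the calls exercised above):
# spherematch() returns parallel index arrays plus the separation of each
# match, so ra1[i1[k]], dec1[i1[k]] lies within `searchrad` of
# ra2[i2[k]], dec2[i2[k]]; maxmatch=0 requests all matches rather than
# only the single closest one.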
pydl-0.7.0/pydl/pydlutils/tests/test_trace.py0000644000076500000240000002147713064020340021757 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.io import fits from astropy.tests.helper import raises from astropy.utils.data import get_pkg_data_filename from ..trace import (fchebyshev, fchebyshev_split, fpoly, func_fit, TraceSet, traceset2xy, xy2traceset) from .. import PydlutilsException class TestTrace(object): """Test the functions in pydl.pydlutils.trace. """ def setup(self): # extracted from spFrame-b1-00057618.fits self.sdss = fits.open(get_pkg_data_filename('t/sdss_traceset.fits')) # extracted from spFrame-r1-00180406.fits self.boss = fits.open(get_pkg_data_filename('t/boss_traceset.fits')) return def teardown(self): self.sdss.close() self.boss.close() return def test_fchebyshev(self): x = np.array([-1, -0.5, 0, 0.5, 1], dtype='d') # # Test order # with raises(ValueError): f = fchebyshev(x, 0) # # m = 1 # f = fchebyshev(x, 1) assert (f == np.ones((1, x.size), dtype='d')).all() # # m = 2 # f = fchebyshev(x, 2) foo = np.ones((2, x.size), dtype='d') foo[1, :] = x assert np.allclose(f, foo) # # m = 3 # f = fchebyshev(x, 3) foo = np.ones((3, x.size), dtype='d') foo[1, :] = x foo[2, :] = (2.0*x**2 - 1.0) assert np.allclose(f, foo) # # m = 4 # f = fchebyshev(x, 4) foo = np.ones((4, x.size), dtype='d') foo[1, :] = x foo[2, :] = (2.0*x**2 - 1.0) foo[3, :] = (4.0*x**3 - 3.0*x) assert np.allclose(f, foo) # # random float # f = fchebyshev(2.88, 3) assert np.allclose(f, np.array([[1.00], [2.88], [15.5888]])) def test_fchebyshev_split(self): x = np.array([-1, -0.5, 0, 0.5, 1], dtype='d') # # Test order # with raises(ValueError): f = fchebyshev_split(x, 0) with raises(ValueError): f = fchebyshev_split(x, 1) # # m = 2 # f = fchebyshev_split(x, 2) foo = np.ones((2, x.size), dtype='d') foo[0, :] = (x >= 0).astype(x.dtype) assert np.allclose(f, foo) # # m = 3 # f = fchebyshev_split(x, 3) foo = np.ones((3, x.size), dtype='d') foo[0, :] = (x >= 0).astype(x.dtype) foo[2, :] = x assert np.allclose(f, foo) # # m = 4 # f = fchebyshev_split(x, 4) foo = np.ones((4, x.size), dtype='d') foo[0, :] = (x >= 0).astype(x.dtype) foo[2, :] = x foo[3, :] = (2.0*x**2 - 1.0) assert np.allclose(f, foo) # # m = 5 # f = fchebyshev_split(x, 5) foo = np.ones((5, x.size), dtype='d') foo[0, :] = (x >= 0).astype(x.dtype) foo[2, :] = x foo[3, :] = (2.0*x**2 - 1.0) foo[4, :] = (4.0*x**3 - 3.0*x) assert np.allclose(f, foo) # # random float # f = fchebyshev_split(2.88, 3) assert np.allclose(f, np.array([[1.00], [1.00], [2.88]])) def test_fpoly(self): x = np.array([-1, -0.5, 0, 0.5, 1], dtype='d') # # Test order # with raises(ValueError): f = fpoly(x, 0) # # m = 1 # f = fpoly(x, 1) assert (f == np.ones((1, x.size), dtype='d')).all() # # m = 2 # f = fpoly(x, 2) foo = np.ones((2, x.size), dtype='d') foo[1, :] = x assert np.allclose(f, foo) # # m = 3 # f = fpoly(x, 3) foo = np.ones((3, x.size), dtype='d') foo[1, :] = x foo[2, :] = x**2 assert np.allclose(f, foo) # # m = 4 # f = fpoly(x, 4) foo = np.ones((4, x.size), dtype='d') foo[1, :] = x foo[2, :] = x**2 foo[3, :] = x**3 assert np.allclose(f, foo) # # random float # f = fpoly(2.88, 3) assert np.allclose(f, np.array([[1.00], [2.88], [8.29440]])) def test_func_fit(self): np.random.seed(137) x = np.linspace(-5, 5, 50) y = x**2 + 2*x + 1 + 0.05*np.random.randn(50) # # Bad inputs # with raises(ValueError): foo = func_fit(x, np.array([1, 2, 3]), 3) with raises(ValueError): foo = func_fit(x, 
y, 3, invvar=np.array([0, 0, 0])) with raises(ValueError): foo = func_fit(x, y, 3, inputfunc=np.array([1.0, 1.0, 1.0])) with raises(KeyError): foo = func_fit(x, y, 3, function_name='npoly') # # No good points # invvar = np.zeros(x.shape, dtype=x.dtype) res, yfit = func_fit(x, y, 3, invvar=invvar) assert (res == np.zeros((3,), dtype=x.dtype)).all() assert (yfit == 0*x).all() # # One good point # invvar[2] = 1.0 res, yfit = func_fit(x, y, 3, invvar=invvar) assert (invvar > 0).nonzero()[0][0] == 2 assert res[0] == y[2] assert (yfit == y[2]).all() # # Various points # invvar = 1.0/(np.random.random(x.shape)**2) assert (invvar < 2).any() invvar[invvar < 2] = 0 res, yfit = func_fit(x, y, 3, invvar=invvar, function_name='poly') # assert np.allclose(res,np.array([0.99665423, 1.9945388, 1.00172303])) assert np.allclose(res, np.array([0.99996197, 1.99340315, 1.00148004])) # # Fixed points # res, yfit = func_fit(x, y, 3, invvar=invvar, function_name='poly', ia=np.array([False, True, True]), inputans=np.array([1.0, 0, 0])) # assert np.allclose(res,np.array([1., 1.99454782, 1.00149949])) assert np.allclose(res, np.array([1., 1.99340359, 1.00147743])) res, yfit = func_fit(x, y, 3, invvar=invvar, function_name='poly', ia=np.array([False, True, False]), inputans=np.array([1.0, 0, 1.0])) # assert np.allclose(res,np.array([1., 1.99403239, 1.])) assert np.allclose(res, np.array([1., 1.99735654, 1.])) # # inputfunc # res, yfit = func_fit(x, y, 3, invvar=invvar, function_name='poly', inputfunc=np.ones(x.shape, dtype=x.dtype)) # assert np.allclose(res,np.array([0.99665423, 1.9945388, 1.00172303])) assert np.allclose(res, np.array([0.99996197, 1.99340315, 1.00148004])) # # Generate inputans # y = x**2 + 2*x + 0.05*np.random.randn(50) res, yfit = func_fit(x, y, 3, invvar=invvar, function_name='poly', ia=np.array([False, True, True])) assert np.allclose(res, np.array([0., 1.99994188, 0.99915111])) def test_traceset_sdss(self): tset = TraceSet(self.sdss[1].data) assert tset.func == 'legendre' assert tset.coeff.shape == (320, 5) assert not tset.has_jump assert tset.xRange == tset.xmax - tset.xmin assert tset.nx == int(tset.xmax - tset.xmin + 1) assert tset.xmid == 0.5 * (tset.xmin + tset.xmax) x, y = traceset2xy(tset) tset2 = xy2traceset(x, y, ncoeff=tset.ncoeff) assert tset2.xmin == tset.xmin assert tset2.xmax == tset.xmax assert tset2.func == tset.func assert np.allclose(tset2.coeff, tset.coeff) def test_traceset_boss(self): tset = TraceSet(self.boss[1].data) assert tset.func == 'legendre' assert tset.coeff.shape == (500, 6) assert tset.has_jump assert tset.xRange == tset.xmax - tset.xmin assert tset.nx == int(tset.xmax - tset.xmin + 1) assert tset.xmid == 0.5 * (tset.xmin + tset.xmax) x, y = traceset2xy(tset) tset2 = xy2traceset(x, y, ncoeff=tset.ncoeff, xjumplo=tset.xjumplo, xjumphi=tset.xjumphi, xjumpval=tset.xjumpval) assert tset2.xmin == tset.xmin assert tset2.xmax == tset.xmax assert tset2.func == tset.func assert np.allclose(tset2.coeff, tset.coeff) def test_traceset_keywords(self): tset = TraceSet(self.boss[1].data) x, y = traceset2xy(tset) iv = 2.0 * np.ones(x.shape, dtype=x.dtype) im = np.ones(x.shape, dtype=np.bool) im[1] = False tset2 = TraceSet(x, y, func='chebyshev', invvar=iv, maxiter=20, inmask=im) assert tset2.func == 'chebyshev' def test_traceset_bad(self): with raises(PydlutilsException): tset = TraceSet(1, 2, 3) pydl-0.7.0/pydl/pydlutils/tests/test_yanny.py0000644000076500000240000003254113434104050022013 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see 
LICENSE.rst # -*- coding: utf-8 -*- import warnings import json from os import chmod, remove from os.path import exists, join from shutil import copy, rmtree from tempfile import mkdtemp from time import sleep from collections import OrderedDict import six import numpy as np from astropy.tests.helper import catch_warnings, raises from astropy.utils.data import get_pkg_data_filename from astropy.table import Table from astropy.io.registry import (register_identifier, register_reader, register_writer) from .. import PydlutilsException, PydlutilsUserWarning from ..yanny import (write_ndarray_to_yanny, yanny, is_yanny, read_table_yanny, write_table_yanny) register_identifier('yanny', Table, is_yanny) register_reader('yanny', Table, read_table_yanny) register_writer('yanny', Table, write_table_yanny) class YannyTestCase(object): """Based on astropy.io.fits.tests.FitsTestCase. """ save_temp = False def setup(self): self.temp_dir = mkdtemp(prefix='yanny-test-') # Ignore deprecation warnings--this only affects Python 2.5 and 2.6, # since deprecation warnings are ignored by defualt on 2.7 warnings.simplefilter('ignore') warnings.simplefilter('always', UserWarning) # raise ValueError("I am setting up a subclass of YannyTestCase!") with open(get_pkg_data_filename("t/yanny_data.json")) as js: self.test_data = json.load(js) def teardown(self): warnings.resetwarnings() if not self.save_temp: if hasattr(self, 'temp_dir') and exists(self.temp_dir): tries = 3 while tries: try: rmtree(self.temp_dir) break except OSError: # Probably couldn't delete the file because for # whatever reason a handle to it is still open/hasn't # been garbage-collected. sleep(0.5) tries -= 1 # raise ValueError("I am tearing down up a subclass of YannyTestCase!") def copy_file(self, filename): """Copies a backup of a test data file to the temp dir and sets its mode to writeable. """ copy(self.data(filename), self.temp(filename)) chmod(self.temp(filename), stat.S_IREAD | stat.S_IWRITE) def data(self, filename): """Returns the path to a test data file. """ return get_pkg_data_filename('t/'+filename) def temp(self, filename): """Returns the full path to a file in the test temp dir. """ return join(self.temp_dir, filename) def json2dtype(self, data): """Convert JSON-encoded dtype data into a real dtype object. """ stuff = list() for k in data: if len(k) == 3: stuff.append(tuple([str(k[0]), str(k[1]), tuple(k[2])])) else: stuff.append(tuple([str(k[0]), str(k[1])])) return np.dtype(stuff) class TestYanny(YannyTestCase): """Test class for yanny files. """ save_temp = False def test_is_yanny(self): """Test the yanny identifier. """ par = yanny(self.data('test.par')) assert is_yanny('read', None, None, par) assert is_yanny('read', 'test.par', None) with open(self.data('test.par'), 'rb') as fileobj: assert is_yanny('read', None, fileobj) def test_read_table_yanny(self): """Test reading to an astropy Table. """ filename = self.data('test_table.par') with raises(PydlutilsException): t = Table.read(filename) with raises(KeyError): t = Table.read(filename, tablename='foo') t = Table.read(filename, tablename='test') assert isinstance(t.meta, OrderedDict) assert t.meta['name'] == 'first table' assert (t['a'] == np.array([1, 4, 5])).all() assert (t['c'] == np.array([b'x', b'y', b'z'])).all() def test_write_table_yanny(self): """Test writing an astropy Table to yanny. 
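
        A small Table is written through the ``yanny`` writer registered at
        the top of this module and the result is compared against the
        reference file ``test_table.par``.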
""" filename = self.temp('table.par') a = [1, 4, 5] b = [2.0, 5.0, 8.2] c = [b'x', b'y', b'z'] t = Table([a, b, c], names=('a', 'b', 'c'), meta={'name': 'first table'}, dtype=('i8', 'f8', 'S1')) t.write(filename, tablename='test') par1 = yanny(filename) par2 = yanny(self.data('test_table.par')) assert par1 == par2 remove(filename) def test_write_single_ndarray_to_yanny(self): """Test the write_ndarray_to_yanny function. """ test_data = self.test_data['tempfile1.par'] table = test_data['structures']['MYSTRUCT0'] dt = self.json2dtype(table['dtype']) mystruct = np.zeros((table['size'],), dtype=dt) for col in table['dtype']: mystruct[col[0]] = np.array(table['data'][col[0]], dtype=col[1]) enums = {'new_flag': test_data['enums']['new_flag']} par = write_ndarray_to_yanny(self.temp('tempfile1.par'), mystruct, structnames='magnitudes', enums=enums) assert (par._symbols['enum'][0] == 'typedef enum {\n FALSE,\n TRUE\n} BOOLEAN;') assert par._symbols['struct'][0] == 'typedef struct {\n double ra;\n double dec;\n float mag[5];\n int flags;\n BOOLEAN new_flag;\n} MAGNITUDES;' for k, f in enumerate(('FALSE', 'TRUE', 'TRUE', 'FALSE')): assert par['MAGNITUDES']['new_flag'][k].decode() == f def test_write_multiple_ndarray_to_yanny(self): """Test the write_ndarray_to_yanny function. """ test_data = self.test_data['tempfile1.par'] table = test_data['structures']['MYSTRUCT0'] dt = self.json2dtype(table['dtype']) mystruct = np.zeros((table['size'],), dtype=dt) for col in table['dtype']: mystruct[col[0]] = np.array(table['data'][col[0]], dtype=col[1]) enums = test_data['enums'] table = test_data['structures']['MY_STATUS'] dt = self.json2dtype(table['dtype']) status = np.zeros((table['size'],), dtype=dt) for col in table['dtype']: status[col[0]] = np.array(table['data'][col[0]], dtype=col[1]) enums = test_data['enums'] par = write_ndarray_to_yanny(self.temp('tempfile2.par'), (mystruct, status), structnames=('magnitudes', 'my_status'), enums=enums, comments=['This is a test', 'This is another test']) assert ('typedef enum {\n FALSE,\n TRUE\n} BOOLEAN;' in par._symbols['enum']) assert 'typedef enum {\n FAILURE,\n INCOMPLETE,\n SUCCESS\n} STATUS;' in par._symbols['enum'] assert 'typedef struct {\n double ra;\n double dec;\n float mag[5];\n int flags;\n BOOLEAN new_flag;\n} MAGNITUDES;' in par._symbols['struct'] assert 'typedef struct {\n long timestamp;\n STATUS state;\n} MY_STATUS;' in par._symbols['struct'] for k, f in enumerate(('FALSE', 'TRUE', 'TRUE', 'FALSE')): assert par['MAGNITUDES']['new_flag'][k].decode() == f for k, f in enumerate(["SUCCESS", "SUCCESS", "FAILURE", "INCOMPLETE"]): assert par['MY_STATUS']['state'][k].decode() == f def test_write_ndarray_to_yanny_exceptions(self): """Make sure certain execptions are raised by the write_ndarray_to_yanny function. """ test_data = self.test_data['tempfile1.par'] table = test_data['structures']['MYSTRUCT0'] dt = self.json2dtype(table['dtype']) mystruct = np.zeros((table['size'],), dtype=dt) for col in table['dtype']: mystruct[col[0]] = np.array(table['data'][col[0]], dtype=col[1]) enums = {'new_flag': test_data['enums']['new_flag']} with raises(PydlutilsException): par = write_ndarray_to_yanny(self.data('test.par'), mystruct, structnames='magnitudes', enums=enums) with raises(PydlutilsException): par = write_ndarray_to_yanny(self.temp('tempfile3.par'), mystruct, structnames=('magnitudes', 'my_status'), enums=enums) def test_write_ndarray_to_yanny_misc(self): """Test other minor features of write_ndarray_to_yanny. 
""" test_data = self.test_data["tempfile1.par"] table = test_data['structures']['MYSTRUCT0'] dt = self.json2dtype(table['dtype']) mystruct = np.zeros((table['size'],), dtype=dt) for col in table['dtype']: mystruct[col[0]] = np.array(table['data'][col[0]], dtype=col[1]) enums = {'new_flag': test_data['enums']['new_flag']} hdr = test_data['pairs'] par = write_ndarray_to_yanny(self.temp('tempfile1.par'), mystruct, hdr=hdr, enums=enums) assert 'MYSTRUCT0' in par assert par['keyword1'] == hdr['keyword1'] assert par['keyword2'] == hdr['keyword2'] def test_yanny(self): """Used to test the yanny class. """ # # Describe what should be in the object # pair_data = self.test_data["test.par"]["pairs"] struct_data = self.test_data["test.par"]["structures"] symbols = [ """typedef struct { float mag[5]; char b[5][]; char foo[25]; double c; int flags[2]; BOOLEAN new_flag; } MYSTRUCT;""", """typedef struct { float foo<3>; # This is archaic array notation, strongly deprecated, char bar<10>; # but still technically supported. } OLD;""", """typedef struct { STATUS state; char timestamp[]; #UTC timestamp in format 2008-06-21T00:27:33 } STATUS_UPDATE;""", ] enum = [ """typedef enum { FALSE, TRUE } BOOLEAN;""", """typedef enum { FAILURE, INCOMPLETE, SUCCESS } STATUS;""", ] with open(self.data('test.par')) as f: file_data = f.read() # # Open the object # par = yanny(self.data('test.par')) # # Test some static methods, etc. # assert par.get_token('abcd') == ('abcd', '') assert str(par) == file_data assert repr(par) == file_data par2 = yanny(self.data('test_table.par')) assert par != par2 # # Test types # assert par.type('MYSTRUCT', 'c') == 'double' assert par.type('FOOBAR', 'd') is None assert par.type('MYSTRUCT', 'foobar') is None assert not par2.isenum('TEST', 'a') assert par.array_length('MYSTRUCT', 'c') == 1 assert par.char_length('MYSTRUCT', 'c') is None # # Test the pairs # assert set(par.pairs()) == set(pair_data) # # Test the pair values # for p in par.pairs(): assert par[p] == pair_data[p] # # Test the structure of the object # assert (set(par) == set(list(pair_data) + list(struct_data))) assert (set(par._symbols) == set(list(struct_data) + ['struct', 'enum'])) assert set(par._symbols['struct']) == set(symbols) assert set(par._symbols['enum']) == set(enum) assert set(par.tables()) == set(struct_data) for t in par.tables(): assert (par.dtype(t) == self.json2dtype(struct_data[t]['dtype'])) assert par.size(t) == struct_data[t]['size'] assert (set(par.columns(t)) == set(struct_data[t]['columns'])) for c in par.columns(t): assert (par.type(t, c) == struct_data[t]['columns'][c]) # print(par[t][c]) assert par.isenum('MYSTRUCT', 'new_flag') assert par._enum_cache['BOOLEAN'] == ['FALSE', 'TRUE'] assert (par._enum_cache['STATUS'] == ['FAILURE', 'INCOMPLETE', 'SUCCESS']) # # Test values # assert np.allclose(par['MYSTRUCT'].mag[0], np.array([17.5, 17.546, 17.4, 16.1, 16.0])) assert np.allclose(par['MYSTRUCT'].mag[5], np.array([19.3, 18.2, 17.1, 16.0, 15.9])) assert par['MYSTRUCT'].foo[1] == six.b("My dog has no nose.") assert np.allclose(par['MYSTRUCT'].c[2], 7.24345567) assert (par['MYSTRUCT']['flags'][2] == np.array([123123, 0])).all() # # Test expected write failures. # # This should fail, since test.par already exists. 
with raises(PydlutilsException): par.write() with catch_warnings(PydlutilsUserWarning) as w: par.append({}) datatable = {'status_update': {'state': ['SUCCESS', 'SUCCESS'], 'timestamp': ['2008-06-22 01:27:33', '2008-06-22 01:27:36']}, 'new_keyword': 'new_value'} par.filename = self.temp('test_append.par') # This should also fail, because test_append.par does not exist. with raises(PydlutilsException): par.append(datatable) return pydl-0.7.0/pydl/pydlutils/tests/test_mangle.py0000644000076500000240000002730613064020340022121 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import raises from astropy.utils.data import get_pkg_data_filename from .. import PydlutilsException from .. import mangle as mng class TestMangle(object): """Test the functions in pydl.pydlutils.mangle. """ def setup(self): self.poly_fits = get_pkg_data_filename('t/polygon.fits') self.no_id_fits = get_pkg_data_filename('t/polygon_no_id.fits') self.one_cap_fits = get_pkg_data_filename('t/polygon_one_cap.fits') self.poly_ply = get_pkg_data_filename('t/polygon.ply') self.bad_ply = get_pkg_data_filename('t/median_data.txt') def teardown(self): pass def test_ManglePolygon(self): # # Zero caps # poly = mng.ManglePolygon() assert np.allclose(poly.str, 4.0*np.pi) assert poly.cmminf() is None assert not poly.gzeroar() assert np.allclose(poly.garea(), 4.0*np.pi) # # One cap. # x = np.array([[0.0, 0.0, 1.0]]) cm = np.array([0.5]) poly = mng.ManglePolygon(x=x, cm=cm) assert np.allclose(poly.garea(), np.pi) # # Bad inputs # with raises(ValueError): poly = mng.ManglePolygon(weight=1.0) # # Multiple caps # x = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]) cm = np.array([1.0, 1.0, 1.0]) poly = mng.ManglePolygon(x=x, cm=cm, str=np.pi/2.0) assert poly.ncaps == 3 assert poly.weight == 1.0 assert poly.use_caps == (1 << 3) - 1 poly = mng.ManglePolygon(x=x, cm=cm, weight=0.5) assert poly.weight == 0.5 poly = mng.ManglePolygon(x=x, cm=cm, pixel=20) assert poly.pixel == 20 poly = mng.ManglePolygon(x=x, cm=cm, use_caps=3, str=np.pi/2.0) assert poly.use_caps == 3 poly2 = poly.copy() assert poly2.use_caps == poly.use_caps assert (poly2.cm == poly.cm).all() assert np.allclose(poly2.str, np.pi/2.0) x = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]]) cm = np.array([1.0, 1.0]) poly = mng.ManglePolygon(x=x, cm=cm) poly2 = poly.add_caps(np.array([[0.0, 1.0, 0.0], ]), np.array([1.0, ])) assert poly2.ncaps == 3 assert poly2.use_caps == poly.use_caps assert poly2.str == 1.0 # dummy value! 
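        #
        # As a further illustration of the single-cap case near the top of
        # this test (a hedged aside, not required by ManglePolygon itself):
        # a lone cap of polar-cap parameter cm covers a solid angle of
        # 2*pi*cm steradians (cm playing the role of 1 - cos(theta)), so
        # cm = 0.5 corresponds to an area of pi.
        #
        single_cap = mng.ManglePolygon(x=np.array([[0.0, 0.0, 1.0]]),
                                       cm=np.array([0.5]))
        assert np.allclose(single_cap.garea(), 2.0 * np.pi * 0.5)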
poly3 = poly.polyn(poly2, 2) assert poly3.ncaps == 3 assert poly3.use_caps == poly.use_caps assert np.allclose(poly3.x[2, :], np.array([0.0, 1.0, 0.0])) poly3 = poly.polyn(poly2, 2, complement=True) assert poly3.ncaps == 3 assert poly3.use_caps == poly.use_caps assert np.allclose(poly3.cm[2], -1.0) def test_angles_to_x(self): x = mng.angles_to_x(np.array([[0.0, 0.0], [90.0, 90.0], [0.0, 90.0]])) assert np.allclose(x, np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])) a = mng.angles_to_x(np.array([[0.0, 90.0], [90.0, 0.0], [0.0, 0.0]]), latitude=True) assert np.allclose(x, np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])) def test_cap_distance(self): x = np.array([0.0, 0.0, 1.0]) cm = 1.0 with raises(ValueError): d = mng.cap_distance(x, cm, np.array([[1.0, 2.0, 3.0, 4.0], ])) d = mng.cap_distance(x, cm, np.array([[0.0, 45.0], ])) assert np.allclose(d, np.array([45.0])) y = mng.angles_to_x(np.array([[0.0, 45.0], ]), latitude=True) d = mng.cap_distance(x, cm, y) assert np.allclose(d, np.array([45.0])) d = mng.cap_distance(x, cm, np.array([[0.0, -45.0], ])) assert np.allclose(d, np.array([-45.0])) d = mng.cap_distance(x, -1.0, np.array([[0.0, -45.0], ])) assert np.allclose(d, np.array([45.0])) def test_circle_cap(self): with raises(ValueError): x, cm = mng.circle_cap(90.0, np.array([[1.0, 2.0, 3.0, 4.0], ])) xin = np.array([[0.0, 0.0, 1.0], ]) x, cm = mng.circle_cap(90.0, xin) assert np.allclose(x, xin) assert np.allclose(cm, 1.0) radec = np.array([[0.0, 90.0], ]) x, cm = mng.circle_cap(90.0, radec) assert np.allclose(x, xin) assert np.allclose(cm, 1.0) x, cm = mng.circle_cap(np.float32(90.0), radec) assert np.allclose(x, xin) assert np.allclose(cm, 1.0) x, cm = mng.circle_cap(np.array([90.0, ]), radec) assert np.allclose(x, xin) assert np.allclose(cm, np.array([1.0, ])) with raises(ValueError): x, cm = mng.circle_cap(np.array([90.0, 90.0]), radec) def test_is_cap_used(self): assert mng.is_cap_used(1 << 2, 2) assert not mng.is_cap_used(1 << 2, 1) def test_is_in_cap(self): x = np.array([0.0, 0.0, 1.0]) cm = 1.0 d = mng.is_in_cap(x, cm, np.array([[0.0, 45.0], [0.0, -45.0]])) assert (d == np.array([True, False])).all() def test_is_in_polygon(self): x = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]) cm = np.array([1.0, 1.0, 1.0]) p = mng.ManglePolygon(x=x, cm=cm) d = mng.is_in_polygon(p, np.array([[45.0, 45.0], [135.0, -45.0]])) assert (d == np.array([True, False])).all() d = mng.is_in_polygon(p, np.array([[45.0, 45.0], [-45.0, 45.0]]), ncaps=2) assert (d == np.array([True, False])).all() poly = mng.read_fits_polygons(self.no_id_fits) d = mng.is_in_polygon(poly[0], np.array([[0.0, 10.0], [4.0, 4.5], [90, 4.0], [135, 3.0], [180, 0.5], [270, 0.25]])) assert (d == np.array([False, True, False, False, False, False])).all() def test_is_in_window(self): np.random.seed(271828) RA = 7.0*np.random.random(1000) + 268.0 Dec = 90.0 - np.degrees(np.arccos(0.08*np.random.random(1000))) points = np.vstack((RA, Dec)).T poly = mng.read_fits_polygons(self.poly_fits) i = mng.is_in_window(poly, points) assert i[0].sum() == 3 def test_read_fits_polygons(self): poly = mng.read_fits_polygons(self.poly_fits) use_caps = np.array([31, 15, 31, 7, 31, 15, 15, 7, 15, 15, 15, 31, 15, 15, 15, 15, 15, 15, 31, 15], dtype=np.uint32) # # Attribute access doesn't work on unsigned columns. 
# assert (poly['USE_CAPS'] == use_caps).all() assert (poly['use_caps'] == use_caps).all() with raises(AttributeError): foo = poly.no_such_attribute cm0 = np.array([-1.0, -0.99369437, 1.0, -1.0, 0.00961538]) assert np.allclose(poly.cm[0][0:poly.ncaps[0]], cm0) assert poly[0]['NCAPS'] == 5 poly = mng.read_fits_polygons(self.poly_fits, convert=True) assert poly[0].use_caps == 31 assert np.allclose(poly[0].cm, cm0) assert poly[0].cmminf() == 4 # # A FITS file might not contain IFIELD. # poly = mng.read_fits_polygons(self.no_id_fits) assert len(poly) == 1 # # A FITS file might contain exactly one polygon with exactly one cap. # poly = mng.read_fits_polygons(self.one_cap_fits, convert=True) assert poly[0].ncaps == 1 def test_read_mangle_polygons(self): with raises(PydlutilsException): poly = mng.read_mangle_polygons(self.bad_ply) poly = mng.read_mangle_polygons(self.poly_ply) assert len(poly.header) == 3 assert poly.header[0] == 'pixelization 6s' assert len(poly) == 4 assert np.allclose(poly[0].x[0, :], np.array([0.0436193873653360, 0.9990482215818578, 0.0])) assert poly[3].ncaps == 3 def test_set_use_caps(self): poly = mng.read_fits_polygons(self.poly_fits, convert=True) old_use_caps = poly[0].use_caps index_list = list(range(poly[0].ncaps)) use_caps = mng.set_use_caps(poly[0], index_list, allow_doubles=True) assert use_caps == poly[0].use_caps use_caps = mng.set_use_caps(poly[0], index_list) assert use_caps == poly[0].use_caps x = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [1.0-1.0e-8, 1.0e-8, 0.0]]) cm = np.array([1.0, 1.0, 1.0, 1.0-1.0e-8]) p = mng.ManglePolygon(x=x, cm=cm) index_list = list(range(p.ncaps)) assert p.use_caps == 2**4 - 1 use_caps = mng.set_use_caps(p, index_list, tol=1.0e-7) assert use_caps == 2**3 - 1 x = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [1.0-1.0e-8, 1.0e-8, 0.0]]) cm = np.array([1.0, 1.0, 1.0, -1.0+1.0e-8]) p = mng.ManglePolygon(x=x, cm=cm) index_list = list(range(p.ncaps)) assert p.use_caps == 2**4 - 1 use_caps = mng.set_use_caps(p, index_list, tol=1.0e-7) assert use_caps == 2**3 - 1 use_caps = mng.set_use_caps(p, index_list, tol=1.0e-7, allow_neg_doubles=True) assert use_caps == 2**4 - 1 def test_x_to_angles(self): a = mng.x_to_angles(np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])) assert np.allclose(a, np.array([[0.0, 0.0], [90.0, 90.0], [0.0, 90.0]])) a = mng.x_to_angles(np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]), latitude=True) assert np.allclose(a, np.array([[0.0, 90.0], [90.0, 0.0], [0.0, 0.0]])) def test_single_polygon(self): x = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [1.0-1.0e-8, 1.0e-8, 0.0]]) cm = np.array([1.0, 1.0, 1.0, -1.0+1.0e-8]) p = mng.ManglePolygon(x=x, cm=cm) p2 = mng._single_polygon(p) assert id(p2) == id(p) pl = mng.PolygonList() pl.append(p) p2 = mng._single_polygon(pl) assert id(p2) == id(p) with raises(ValueError): pl.append(p2) p3 = mng._single_polygon(pl) p = mng.read_fits_polygons(self.no_id_fits) p2 = mng._single_polygon(p) assert p2.ncaps == 4 p2 = mng._single_polygon(p[0]) assert p2.use_caps == 15 def fits_polygon_file(): """Create a small test version of a FITS polygon file. 
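
    This helper is meant to be run as a script, e.g.
    ``python test_mangle.py full_polygon_file.fits``: it copies the primary
    header (updating DATE and IDLUTILS if present) and writes the first 20
    rows of the polygon table to ``polygon.fits`` in the current directory.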
""" from datetime import date from sys import argv from astropy.io import fits from pydl import __version__ as pydlversion with fits.open(argv[1], uint=True) as hdulist: header0 = hdulist[0].header data = hdulist[1].data if 'DATE' in header0: header0['DATE'] = date.today().strftime('%Y-%m-%d') if 'IDLUTILS' in header0: header0['IDLUTILS'] = 'pydl-'+pydlversion hdu0 = fits.PrimaryHDU(header=header0) hdu1 = fits.BinTableHDU(data[0:20]) hdulist2 = fits.HDUList([hdu0, hdu1]) hdulist2.writeto('polygon.fits') return 0 if __name__ == '__main__': from sys import exit exit(fits_polygon_file()) pydl-0.7.0/pydl/pydlutils/tests/test_coord.py0000644000076500000240000000232512632466352021777 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy import units as u from astropy.coordinates import ICRS # from astropy.tests.helper import raises from ..coord import SDSSMuNu, current_mjd, stripe_to_eta, stripe_to_incl class TestCoord(object): """Test the functions in pydl.pydlutils.coord. """ def setup(self): pass def teardown(self): pass def test_current_mjd(self): assert current_mjd() > 50000.0 def test_munu_to_radec(self): munu = SDSSMuNu(mu=0.0*u.deg, nu=0.0*u.deg, stripe=10) radec = munu.transform_to(ICRS) assert radec.ra.value == 0.0 assert radec.dec.value == 0.0 def test_radec_to_munu(self): radec = ICRS(ra=0.0*u.deg, dec=0.0*u.deg) munu = radec.transform_to(SDSSMuNu(stripe=10)) assert munu.mu.value == 0.0 assert munu.nu.value == 0.0 assert munu.incl.value == 0.0 def test_stripe_to_eta(self): eta = stripe_to_eta(82) assert eta == -32.5 def test_stripe_to_incl(self): incl = stripe_to_incl(82) assert incl == 0.0 incl = stripe_to_incl(10) assert incl == 0.0 pydl-0.7.0/pydl/pydlutils/tests/test_rgbcolor.py0000644000076500000240000000513513064020340022463 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import catch_warnings, raises from .. import PydlutilsUserWarning from ..rgbcolor import (nw_arcsinh, nw_cut_to_box, nw_float_to_byte, nw_scale_rgb) class TestRGBColor(object): """Test the functions in pydl.pydlutils.rgbcolor. 
""" def setup(self): pass def teardown(self): pass def test_nw_arcsinh(self): colors = np.random.random((10, 10)) with raises(ValueError): fitted_colors = nw_arcsinh(colors) colors = np.random.random((10, 10, 5)) with raises(ValueError): fitted_colors = nw_arcsinh(colors) colors = np.random.random((10, 10, 3)) fitted_colors = nw_arcsinh(colors, nonlinearity=0) assert (fitted_colors == colors).all() colors = np.ones((2, 2, 3)) fac = np.arcsinh(9.0)/9.0 fitted_colors = nw_arcsinh(colors) assert np.allclose(fitted_colors, fac) def test_nw_cut_to_box(self): colors = np.random.random((10, 10)) with raises(ValueError): boxed_colors = nw_cut_to_box(colors) colors = np.random.random((10, 10, 5)) with raises(ValueError): boxed_colors = nw_cut_to_box(colors) colors = np.random.random((10, 10, 3)) with raises(ValueError): boxed_colors = nw_cut_to_box(colors, origin=(1.0, 1.0)) boxed_colors = nw_cut_to_box(colors) assert np.allclose(boxed_colors, colors) def test_nw_float_to_byte(self): colors = np.zeros((10, 10, 3), dtype=np.float) byte_colors = nw_float_to_byte(colors) assert (byte_colors == 0).all() colors = np.ones((10, 10, 3), dtype=np.float) byte_colors = nw_float_to_byte(colors) assert (byte_colors == 255).all() with catch_warnings(PydlutilsUserWarning) as w: byte_colors = nw_float_to_byte(colors, bits=16) assert len(w) > 0 def test_nw_scale_rgb(self): colors = np.random.random((10, 10)) with raises(ValueError): scaled_colors = nw_scale_rgb(colors) colors = np.random.random((10, 10, 5)) with raises(ValueError): scaled_colors = nw_scale_rgb(colors) colors = np.random.random((10, 10, 3)) with raises(ValueError): scaled_colors = nw_scale_rgb(colors, scales=(1.0, 1.0)) colors = np.ones((2, 2, 3)) scaled_colors = nw_scale_rgb(colors, scales=(2.0, 2.0, 2.0)) assert np.allclose(scaled_colors, 2.0) pydl-0.7.0/pydl/pydlutils/tests/test_math.py0000644000076500000240000000646013064020340021605 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import raises from astropy.utils.data import get_pkg_data_filename from ..math import computechi2, djs_median, find_contiguous class TestMath(object): """Test the functions in pydl.pydlutils.math. """ def setup(self): pass def teardown(self): pass def test_computechi2(self): x = np.arange(20) y = np.array([6.6198438, 1.3491303, 0.41035045, 9.4013375, 4.1103360, 4.3522868, 4.6338078, 4.7400367, 5.1860726, 5.1082748, 5.1084596, 5.2990997, 5.5987537, 5.7007504, 5.7855296, 4.1123709, 6.9437957, 4.9956179, 4.3724215, 3.6245063]) sqivar = np.array([1.32531, 0.886090, 1.08384, 1.04489, 1.46807, 1.30800, 0.507725, 1.12840, 0.955025, 1.35925, 1.10126, 1.45690, 0.575700, 0.949710, 1.23368, 0.536489, 0.772543, 0.957729, 0.883976, 1.11559]) templates = np.vstack((np.ones((20,), dtype='d'), x)).transpose() chi2 = computechi2(y, sqivar, templates) # # 20 data points, 2 parameters = 18 degrees of freedom. # assert chi2.dof == 18 # # The covariance matrix should be symmetric # assert (chi2.covar.T == chi2.covar).all() # # The variance vector should be the same as the diagonal of the # covariance matrix. 
# assert (chi2.var == np.diag(chi2.covar)).all() def test_djs_median(self): test_data_file = get_pkg_data_filename('t/median_data.txt') test_data = np.loadtxt(test_data_file, dtype='d', delimiter=',') np.random.seed(424242) data = 100.0*np.random.random(100) data2 = 100.0*np.random.random((10, 10)) data3 = 100.0*np.random.random((10, 10, 10)) data_width_5 = test_data[0, :] data_width_5_reflect = test_data[1, :] # # Degenerate cases that fall back on numpy.median(). # assert np.allclose(np.median(data), djs_median(data)) assert np.allclose(np.median(data2, axis=0), djs_median(data2, dimension=0)) assert np.allclose(np.median(data2, axis=1), djs_median(data2, dimension=1)) # # Test widths. # assert np.allclose(data, djs_median(data, width=1)) assert np.allclose(data_width_5, djs_median(data, width=5)) # assert np.allclose(data_width_5_reflect, # djs_median(data, width=5, boundary='reflect')) # # Exceptions # with raises(ValueError): foo = djs_median(data, width=5, boundary='nearest') with raises(ValueError): foo = djs_median(data, width=5, boundary='wrap') with raises(ValueError): foo = djs_median(data, width=5, boundary='foobar') with raises(ValueError): foo = djs_median(data2, width=5, dimension=1) with raises(ValueError): foo = djs_median(data3, width=5) with raises(ValueError): foo = djs_median(data3, width=5, boundary='reflect') def test_find_contiguous(self): assert (find_contiguous(np.array([0, 1, 1, 1, 0, 1, 1, 0, 1])) == [1, 2, 3]) pydl-0.7.0/pydl/pydlutils/tests/test_sdss.py0000644000076500000240000003110313272372213021632 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np import pydl.pydlutils.sdss from astropy.tests.helper import remote_data, raises from astropy.utils.data import get_pkg_data_filename from ..sdss import (default_skyversion, sdss_flagexist, sdss_flagname, sdss_flagval, set_maskbits, sdss_astrombad, sdss_objid, sdss_specobjid, unwrap_specobjid) class TestSDSS(object): """Test the functions in pydl.pydlutils.sdss. """ def setup(self): pydl.pydlutils.sdss.maskbits = set_maskbits( maskbits_file=get_pkg_data_filename('t/testMaskbits.par')) self.opbadfields = np.array([ (77, 'astrom', 30, 73, 'Large astrometric offset at field 39... 72'), (85, 'astrom', 8, 28, 'Large astrometric offset at field 11... 27'), (85, 'rotator', 242, 253, 'Large rotator offset at field 251 252'), (209, 'astrom', 8, 116, 'Tel. offsets before r-band field 115 -DJS'), (209, 'astrom', 137, 175, 'Tel. offsets after r-band field 145 -DJS'), (250, 'astrom', 456, 468, 'Large astrometric offset -Manual'), (251, 'astrom', 147, 159, 'Large astrometric offset at field 156 158')], dtype=[('run', '`_. """ import numpy as np from warnings import warn from . import PydlutilsUserWarning def nw_arcsinh(colors, nonlinearity=3.0): """Scales `colors` by a degree of nonlinearity specified by user. The input image must have zero background (*i.e.*, it must already be background-subtracted). Parameters ---------- colors : :class:`~numpy.ndarray` 3D Array containing RGB image. The dimensions should be (X, Y, 3). nonlinearity : :class:`float` Amount of nonlinear scaling. If set to zero, no scaling will be performed (this is equivalent to linear scaling). Returns ------- :class:`~numpy.ndarray` The scaled image. Raises ------ :exc:`ValueError` If `colors` has the wrong shape. 
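
    Examples
    --------
    A minimal example mirroring the unit tests: a constant, unit-valued RGB
    image is scaled by ``arcsinh(9)/9`` at the default nonlinearity of 3.

    >>> import numpy as np
    >>> from pydl.pydlutils.rgbcolor import nw_arcsinh
    >>> colors = np.ones((2, 2, 3))
    >>> bool(np.allclose(nw_arcsinh(colors), np.arcsinh(9.0)/9.0))
    True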
""" if nonlinearity == 0: return colors if colors.ndim != 3: raise ValueError("A 3D image is required!") nx, ny, nc = colors.shape if nc != 3: raise ValueError("The 3D image must have 3 image planes.") radius = colors.sum(2) w = radius == 0 radius[w] = radius[w] + 1.0 fac = np.arcsinh(radius*nonlinearity)/nonlinearity/radius fitted_colors = np.zeros(colors.shape, dtype=colors.dtype) for k in range(3): fitted_colors[:, :, k] = colors[:, :, k]*fac return fitted_colors def nw_cut_to_box(colors, origin=(0.0, 0.0, 0.0)): """Limits the pixel values of the image to a 'box', so that the colors do not saturate to white but to a specific color. Parameters ---------- colors : :class:`~numpy.ndarray` 3D Array containing RGB image. The dimensions should be (X, Y, 3). origin : :func:`tuple` or :class:`~numpy.ndarray` An array with 3 elements. The "distance" from this origin is considered saturated. Returns ------- :class:`~numpy.ndarray` The "boxed" image. Raises ------ :exc:`ValueError` If `colors` or `origin` has the wrong shape. """ if len(origin) != 3: raise ValueError("The origin array must contain 3 elements!") if colors.ndim != 3: raise ValueError("A 3D image is required!") nx, ny, nc = colors.shape if nc != 3: raise ValueError("The 3D image must have 3 image planes.") pos_dist = 1.0 - np.array(origin, dtype=colors.dtype) factors = np.zeros(colors.shape, dtype=colors.dtype) for k in range(3): factors[:, :, k] = colors[:, :, k]/pos_dist[k] factor = factors.max(2) factor = np.where(factor > 1.0, factor, 1.0) boxed_colors = np.zeros(colors.shape, dtype=colors.dtype) for k in range(3): boxed_colors[:, :, k] = origin[k] + colors[:, :, k]/factor return boxed_colors def nw_float_to_byte(image, bits=8): """Converts an array of floats in [0.0, 1.0] to bytes in [0, 255]. Parameters ---------- image : :class:`~numpy.ndarray` Image to convert. bits : :class:`int`, optional Number of bits in final image. Returns ------- :class:`~numpy.ndarray` Converted image. """ if bits > 8: warn("bits > 8 not supported; setting bits = 8.", PydlutilsUserWarning) bits = 8 fmax = 1 << bits bmax = fmax - 1 f1 = np.floor(image * fmax) f2 = np.where(f1 > 0, f1, 0) f3 = np.where(f2 < bmax, f2, bmax) return f3.astype(np.uint8) def nw_scale_rgb(colors, scales=(1.0, 1.0, 1.0)): """Multiply RGB image by color-dependent scale factor. Parameters ---------- colors : :class:`~numpy.ndarray` 3D Array containing RGB image. The dimensions should be (X, Y, 3). scales : :func:`tuple` or :class:`~numpy.ndarray`, optional An array with 3 elements. Returns ------- :class:`~numpy.ndarray` The scaled image. Raises ------ :exc:`ValueError` If `colors` or `scales` has the wrong shape. """ if len(scales) != 3: raise ValueError("The scale factor must contain 3 elements!") if colors.ndim != 3: raise ValueError("A 3D image is required!") nx, ny, nc = colors.shape if nc != 3: raise ValueError("The 3D image must have 3 image planes.") scaled_colors = np.zeros(colors.shape, dtype=colors.dtype) for k in range(3): scaled_colors[:, :, k] = colors[:, :, k]*scales[k] return scaled_colors pydl-0.7.0/pydl/pydlutils/math.py0000644000076500000240000004664113434104050017413 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the math directory in idlutils. 
""" import numpy as np from numpy.linalg import svd import astropy.utils as au from .misc import djs_laxisnum from ..median import median class computechi2(object): """Solve the linear set of equations :math:`A x = b` using SVD. The attributes of this class are all read-only properties, implemented with :class:`~astropy.utils.decorators.lazyproperty`. Parameters ---------- bvec : :class:`numpy.ndarray` The :math:`b` vector in :math:`A x = b`. This vector has length :math:`N`. sqivar : :class:`numpy.ndarray` The reciprocal of the errors in `bvec`. The name comes from the square root of the inverse variance, which is what this is. amatrix : :class:`numpy.ndarray` The matrix :math:`A` in :math:`A x = b`. The shape of this matrix is (:math:`N`, :math:`M`). """ def __init__(self, bvec, sqivar, amatrix): """Initialize the object and perform initial computations. """ # # Save the inputs # # self.bvec = bvec self.sqivar = sqivar self.amatrix = amatrix if len(amatrix.shape) > 1: self.nstar = amatrix.shape[1] else: self.nstar = 1 self.bvec = bvec * sqivar self.mmatrix = self.amatrix * np.tile(sqivar, self.nstar).reshape( self.nstar, bvec.size).transpose() mm = np.dot(self.mmatrix.T, self.mmatrix) self.uu, self.ww, self.vv = svd(mm, full_matrices=False) self.mmi = np.dot((self.vv.T / np.tile(self.ww, self.nstar).reshape( self.nstar, self.nstar)), self.uu.T) return @au.lazyproperty def acoeff(self): """(:class:`~numpy.ndarray`) The fit parameters, :math:`x`, in :math:`A x = b`. This vector has length :math:`M`. """ return np.dot(self.mmi, np.dot(self.mmatrix.T, self.bvec)) @au.lazyproperty def chi2(self): """(:class:`float `) The :math:`\chi^2` value of the fit. """ return np.sum((np.dot(self.mmatrix, self.acoeff) - self.bvec)**2) @au.lazyproperty def yfit(self): """(:class:`~numpy.ndarray`) The evaluated best-fit at each point. This vector has length :math:`N`. """ return np.dot(self.amatrix, self.acoeff) @au.lazyproperty def dof(self): """(:class:`int `) The degrees of freedom of the fit. This is the number of values of `bvec` that have `sqivar` > 0 minus the number of fit paramaters, which is equal to :math:`M`. """ return (self.sqivar > 0).sum() - self.nstar @au.lazyproperty def covar(self): """(:class:`~numpy.ndarray`) The covariance matrix. The shape of this matrix is (:math:`M`, :math:`M`). """ wwt = self.ww.copy() wwt[self.ww > 0] = 1.0/self.ww[self.ww > 0] covar = np.zeros((self.nstar, self.nstar), dtype=self.ww.dtype) for i in range(self.nstar): for j in range(i + 1): covar[i, j] = np.sum(wwt * self.vv[:, i] * self.vv[:, j]) covar[j, i] = covar[i, j] return covar @au.lazyproperty def var(self): """(:class:`~numpy.ndarray`) The variances of the fit. This is identical to the diagonal of the covariance matrix. This vector has length :math:`M`. """ return np.diag(self.covar) def djs_median(array, dimension=None, width=None, boundary='none'): """Compute the median of an array. Use a filtering box or collapse the image along one dimension. Parameters ---------- array : :class:`numpy.ndarray` input array dimension : :class:`int`, optional Compute the median over this dimension. It is an error to specify both `dimension` and `width`. width : :class:`int`, optional Width of the median window. In general, this should be an odd integer. It is an error to specify both `dimension` and `width`. boundary : { 'none', 'reflect', 'nearest', 'wrap' }, optional Boundary condition to impose. 'none' means no filtering is done within `width`/2 of the boundary. 
'reflect' means reflect pixel values around the boundary. 'nearest' means use the values of the nearest boundary pixel. 'wrap' means wrap pixel values around the boundary. 'nearest' and 'wrap' are not implemented. Returns ------- :class:`numpy.ndarray` The output. If neither `dimension` nor `width` are set, this is a scalar value, just the output of ``numpy.median()``. If `dimension` is set, then the result simply ``numpy.median(array,dimension)``. If `width` is set, the result has the same shape as the input array. """ if dimension is None and width is None: return np.median(array) elif width is None: return np.median(array, axis=dimension) elif dimension is None: if width == 1: return array if boundary == 'none': if array.ndim == 1: return median(array, width) elif array.ndim == 2: return median(array, width) else: raise ValueError('Unsupported number of dimensions with ' + 'this boundary condition.') elif boundary == 'reflect': padsize = int(np.ceil(width/2.0)) if array.ndim == 1: bigarr = np.zeros(array.shape[0]+2*padsize, dtype=array.dtype) bigarr[padsize:padsize+array.shape[0]] = array bigarr[0:padsize] = array[0:padsize][::-1] bigarr[padsize+array.shape[0]:padsize*2+array.shape[0]] = ( array[array.shape[0]-padsize:array.shape[0]][::-1]) f = median(bigarr, width) medarray = f[padsize:padsize+array.shape[0]] return medarray elif array.ndim == 2: bigarr = np.zeros((array.shape[0]+2*padsize, array.shape[1]+2*padsize), dtype=array.dtype) bigarr[padsize:padsize+array.shape[0], padsize:padsize+array.shape[1]] = array # Copy into top + bottom bigarr[0:padsize, padsize:array.shape[1]+padsize] = array[0:padsize, :][::-1, :] bigarr[array.shape[0]+padsize:bigarr.shape[0], padsize:array.shape[1]+padsize] = array[array.shape[0]-padsize:array.shape[0], :][::-1, :] # Copy into left + right bigarr[padsize:array.shape[0]+padsize, 0:padsize] = array[:, 0:padsize][:, ::-1] bigarr[padsize:array.shape[0]+padsize, array.shape[1]+padsize:bigarr.shape[1]] = array[:, array.shape[1]-padsize:array.shape[1]][:, ::-1] # Copy into top left bigarr[0:padsize, 0:padsize] = array[0:padsize, 0:padsize][::-1, ::-1] # Copy into top right bigarr[0:padsize, bigarr.shape[1]-padsize:bigarr.shape[1]] = array[0:padsize, array.shape[1]-padsize:array.shape[1]][::-1, ::-1] # Copy into bottom left bigarr[bigarr.shape[0]-padsize:bigarr.shape[0], 0:padsize] = array[array.shape[0]-padsize:array.shape[0], 0:padsize][::-1, ::-1] # Copy into bottom right bigarr[bigarr.shape[0]-padsize:bigarr.shape[0], bigarr.shape[1]-padsize:bigarr.shape[1]] = array[array.shape[0]-padsize:array.shape[0], array.shape[1]-padsize:array.shape[1]][::-1, ::-1] f = median(bigarr, min(width, array.size)) medarray = f[padsize:array.shape[0]+padsize, padsize:array.shape[1]+padsize] return medarray else: raise ValueError('Unsupported number of dimensions with ' + 'this boundary condition.') elif boundary == 'nearest': raise ValueError('This boundary condition not implemented') elif boundary == 'wrap': raise ValueError('This boundary condition not implemented') else: raise ValueError('Unknown boundary condition.') else: raise ValueError('Invalid to specify both dimension & width.') def djs_reject(data, model, outmask=None, inmask=None, sigma=None, invvar=None, lower=None, upper=None, maxdev=None, maxrej=None, groupdim=None, groupsize=None, groupbadpix=False, grow=0, sticky=False): """Routine to reject points when doing an iterative fit to data. 
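
    This routine is normally used inside an iterative fitting loop: fit a
    model to the currently unmasked points, update the mask with this
    function, and stop once the returned ``qdone`` value is ``True``.  A
    hedged sketch of such a loop (``fit_model`` and ``sigma`` stand in for
    the user's own fitting step and error estimate) might look like::

        outmask, qdone = None, False
        while not qdone:
            model = fit_model(data, outmask)
            outmask, qdone = djs_reject(data, model, outmask=outmask,
                                        sigma=sigma, lower=3.0, upper=3.0,
                                        sticky=True)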
Parameters ---------- data : :class:`numpy.ndarray` The data model : :class:`numpy.ndarray` The model, must have the same number of dimensions as `data`. outmask : :class:`numpy.ndarray`, optional Output mask, generated by a previous call to `djs_reject`. If not supplied, this mask will be initialized to a mask that masks nothing. Although this parameter is technically optional, it will almost always be set. inmask : :class:`numpy.ndarray`, optional Input mask. Bad points are marked with a value that evaluates to ``False``. Must have the same number of dimensions as `data`. sigma : :class:`numpy.ndarray`, optional Standard deviation of the data, used to reject points based on the values of `upper` and `lower`. invvar : :class:`numpy.ndarray`, optional Inverse variance of the data, used to reject points based on the values of `upper` and `lower`. If both `sigma` and `invvar` are set, `invvar` will be ignored. lower : :class:`int` or :class:`float`, optional If set, reject points with data < model - lower * sigma. upper : :class:`int` or :class:`float`, optional If set, reject points with data > model + upper * sigma. maxdev : :class:`int` or :class:`float`, optional If set, reject points with abs(data-model) > maxdev. It is permitted to set all three of `lower`, `upper` and `maxdev`. maxrej : :class:`int` or :class:`numpy.ndarray`, optional Maximum number of points to reject in this iteration. If `groupsize` or `groupdim` are set to arrays, this should be an array as well. groupdim To be documented. groupsize To be documented. groupbadpix : :class:`bool`, optional If set to ``True``, consecutive sets of bad pixels are considered groups, overriding the values of `groupsize`. grow : :class:`int`, optional If set to a non-zero integer, N, the N nearest neighbors of rejected pixels will also be rejected. sticky : :class:`bool`, optional If set to ``True``, pixels rejected in one iteration remain rejected in subsequent iterations, even if the model changes. Returns ------- :func:`tuple` A tuple containing a mask where rejected data values are ``False`` and a boolean value set to ``True`` if `djs_reject` believes there is no further rejection to be done. Raises ------ :exc:`ValueError` If dimensions of various inputs do not match. """ # # Create outmask setting = 1 for good data. # if outmask is None: outmask = np.ones(data.shape, dtype='bool') else: if data.shape != outmask.shape: raise ValueError('Dimensions of data and outmask do not agree.') # # Check other inputs. # if model is None: if inmask is not None: outmask = inmask return (outmask, False) else: if data.shape != model.shape: raise ValueError('Dimensions of data and model do not agree.') if inmask is not None: if data.shape != inmask.shape: raise ValueError('Dimensions of data and inmask do not agree.') if maxrej is not None: if groupdim is not None: if len(maxrej) != len(groupdim): raise ValueError('maxrej and groupdim must have the same number of elements.') else: groupdim = [] if groupsize is not None: if len(maxrej) != len(groupsize): raise ValueError('maxrej and groupsize must have the same number of elements.') else: groupsize = len(data) if sigma is None and invvar is None: if inmask is not None: igood = (inmask & outmask).nonzero()[0] else: igood = outmask.nonzero()[0] if len(igood > 1): sigma = np.std(data[igood] - model[igood]) else: sigma = 0 diff = data - model # # The working array is badness, which is set to zero for good points # (or points already rejected), and positive values for bad points. 
# The values determine just how bad a point is, either corresponding # to the number of sigma above or below the fit, or to the number # of multiples of maxdev away from the fit. # badness = np.zeros(outmask.shape, dtype=data.dtype) # # Decide how bad a point is according to lower. # if lower is not None: if sigma is not None: qbad = diff < (-lower * sigma) badness += ((-diff/(sigma + (sigma == 0))) > 0) * qbad else: qbad = (diff * np.sqrt(invvar)) < -lower badness += ((-diff * np.sqrt(invvar)) > 0) * qbad # # Decide how bad a point is according to upper. # if upper is not None: if sigma is not None: qbad = diff > (upper * sigma) badness += ((diff/(sigma + (sigma == 0))) > 0) * qbad else: qbad = (diff * np.sqrt(invvar)) > upper badness += ((diff * np.sqrt(invvar)) > 0) * qbad # # Decide how bad a point is according to maxdev. # if maxdev is not None: qbad = np.absolute(diff) > maxdev badness += np.absolute(diff) / maxdev * qbad # # Do not consider rejecting points that are already rejected by inmask. # Do not consider rejecting points that are already rejected by outmask, # if sticky is set. # if inmask is not None: badness *= inmask if sticky: badness *= outmask # # Reject a maximum of maxrej (additional) points in all the data, or # in each group as specified by groupsize, and optionally along each # dimension specified by groupdim. # if maxrej is not None: # # Loop over each dimension of groupdim or loop once if not set. # for iloop in range(max(len(groupdim), 1)): # # Assign an index number in this dimension to each data point. # if len(groupdim) > 0: yndim = len(ydata.shape) if groupdim[iloop] > yndim: raise ValueError('groupdim is larger than the number of dimensions for ydata.') dimnum = djs_laxisnum(ydata.shape, iaxis=groupdim[iloop]-1) else: dimnum = np.asarray([0]) # # Loop over each vector specified by groupdim. For example, if # this is a 2-D array with groupdim=1, then loop over each # column of the data. If groupdim=2, then loop over each row. # If groupdim is not set, then use the whole image. # for ivec in range(max(dimnum)): # # At this point it is not possible that dimnum is not set. # indx = (dimnum == ivec).nonzero()[0] # # Within this group of points, break it down into groups # of points specified by groupsize, if set. # nin = len(indx) if groupbadpix: goodtemp = badness == 0 groups_lower = (-1*np.diff(np.insert(goodtemp, 0, 1)) == 1).nonzero()[0] groups_upper = (np.diff(np.append(goodtemp, 1)) == 1).nonzero()[0] ngroups = len(groups_lower) else: # # The IDL version of this test makes no sense because # groupsize will always be set. # if 'groupsize' in kwargs: ngroups = nin/groupsize + 1 groups_lower = np.arange(ngroups, dtype='i4')*groupsize foo = (np.arange(ngroups, dtype='i4')+1)*groupsize groups_upper = np.where(foo < nin, foo, nin) - 1 else: ngroups = 1 groups_lower = [0, ] groups_upper = [nin - 1, ] for igroup in range(ngroups): i1 = groups_lower[igroup] i2 = groups_upper[igroup] nii = i2 - i1 + 1 # # Need the test that i1 != -1 below to prevent a crash # condition, but why is it that we ever get groups # without any points? Because this is badly-written, # that's why. # if nii > 0 and i1 != -1: jj = indx[i1:i2+1] # # Test if too many points rejected in this group. # if np.sum(badness[jj] != 0) > maxrej[iloop]: isort = badness[jj].argsort() # # Make the following points good again. # badness[jj[isort[0:nii-maxrej[iloop]]]] = 0 i1 += groupsize[iloop] # # Now modify outmask, rejecting points specified by inmask=0, outmask=0 # if sticky is set, or badness > 0. 
# # print(badness) newmask = badness == 0 # print(newmask) if grow > 0: rejects = newmask == 0 if rejects.any(): irejects = rejects.nonzero()[0] for k in range(1, grow): newmask[(irejects - k) > 0] = 0 newmask[(irejects + k) < (data.shape[0]-1)] = 0 if inmask is not None: newmask = newmask & inmask if sticky: newmask = newmask & outmask # # Set qdone if the input outmask is identical to the output outmask; # convert np.bool to Python built-in bool. # qdone = bool(np.all(newmask == outmask)) outmask = newmask return (outmask, qdone) def find_contiguous(x): """Find the longest sequence of contiguous non-zero array elements. Parameters ---------- x : :class:`numpy.ndarray` A 1d array. A dtype of bool is preferred although any dtype where the operation ``if x[k]:`` is well-defined should work. Returns ------- :class:`list` A list of indices of the longest contiguous non-zero sequence. Examples -------- >>> import numpy as np >>> from pydl.pydlutils.math import find_contiguous >>> find_contiguous(np.array([0,1,1,1,0,1,1,0,1])) [1, 2, 3] """ contig = list() for k in range(x.size): if x[k]: if len(contig) == 0: contig.append([k]) else: if k == contig[-1][-1]+1: contig[-1].append(k) else: contig.append([k]) lengths = [len(c) for c in contig] longest = contig[lengths.index(max(lengths))] return longest pydl-0.7.0/pydl/pydlutils/sdss.py0000644000076500000240000006555113434104050017437 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the sdss directory in idlutils. """ import os import re import numpy as np from six import string_types from astropy.io import fits from astropy.utils.data import download_file from . import PydlutilsException from .spheregroup import spherematch from .yanny import yanny from .. import uniq # # Cache for the maskbits file. # maskbits = None # # Cache the sweep index # sweep_cache = {'star': None, 'gal': None, 'sky': None} # # Cache sdss_astrombad data # opbadfields = None def default_skyversion(): """Returns skyversion number to use for photoop imaging. Returns ------- :class:`int` The default skyversion number. Notes ----- The skyversion number is hard-coded to 2. Examples -------- >>> from pydl.pydlutils.sdss import default_skyversion >>> default_skyversion() 2 """ return 2 def sdss_astrombad(run, camcol, field, photolog_version='dr10'): """For a list of RUN, CAMCOL, FIELD, return whether each field has bad astrometry. Parameters ---------- run, camcol, field : :class:`int` or array of :class:`int` Run, camcol and field. If arrays are passed, all must have the same length. photolog_version : :class:`str`, optional Use this version of photolog to obtain the obBadfields.par file, if :envvar:`PHOTOLOG_DIR` is not set. Returns ------- :class:`numpy.ndarray` of :class:`bool` Array of bool. ``True`` indicates the field is bad. Raises ------ :exc:`ValueError` If the sizes of the arrays don't match or if the array values are out of bounds. Notes ----- Reads data from ``$PHOTOLOG_DIR/opfiles/opBadFields.par``. If there is a problem with one camcol, we assume a problem with all camcols. """ global opbadfields # # Check inputs # if isinstance(run, int): # # Assume all inputs are integers & promote to arrays. # run = np.array([run], dtype=np.int64) camcol = np.array([camcol], dtype=np.int64) field = np.array([field], dtype=np.int64) else: # # Check that all inputs have the same shape. 
# if run.shape != camcol.shape: raise ValueError("camcol.shape does not match run.shape!") if run.shape != field.shape: raise ValueError("field.shape does not match run.shape!") # # Check ranges of parameters # if ((run < 0) | (run >= 2**16)).any(): raise ValueError("run values are out-of-bounds!") if ((camcol < 1) | (camcol > 6)).any(): raise ValueError("camcol values are out-of-bounds!") if ((field < 0) | (field >= 2**12)).any(): raise ValueError("camcol values are out-of-bounds!") # # Read the file # if opbadfields is None: # pragma: no cover if os.getenv('PHOTOLOG_DIR') is None: if (photolog_version == 'trunk' or photolog_version.startswith('branches/')): iversion = photolog_version else: iversion = 'tags/'+photolog_version baseurl = ('https://svn.sdss.org/public/data/sdss/photolog/' + '{0}/opfiles/opBadfields.par').format(iversion) filename = download_file(baseurl, cache=True) else: filename = os.path.join(os.getenv('PHOTOLOG_DIR'), 'opfiles', 'opBadfields.par') astrombadfile = yanny(filename) w = ((astrombadfile['BADFIELDS']['problem'] == 'astrom'.encode()) | (astrombadfile['BADFIELDS']['problem'] == 'rotator'.encode())) opbadfields = astrombadfile['BADFIELDS'][w] # # opbadfields already has astrom problems selected at this point # bad = np.zeros(run.shape, dtype=bool) for row in opbadfields: w = ((run == row['run']) & (field >= row['firstfield']) & (field < row['lastfield'])) if w.any(): bad[w] = True return bad def sdss_flagexist(flagname, bitname, flagexist=False, whichexist=False): """Check for the existence of flags. Parameters ---------- flagname : :class:`str` The name of a bitmask group. Not case-sensitive. bitname : :class:`str` or :class:`list` The name(s) of the specific bitmask(s) within the `flagname` group. flagexist : :class:`bool`, optional If flagexist is True, return a tuple with the second component indicating whether the binary flag named `flagname` exists, even if `bitname` is wrong. whichexist : :class:`bool`, optional If whichexist is True, return a list containing existence test results for each individual flag. Returns ------- :class:`bool` or :func:`tuple` A boolean value or a tuple of bool. """ global maskbits if maskbits is None: # pragma: no cover maskbits = set_maskbits() # # Make sure label is a list # if isinstance(bitname, string_types): bitnames = [bitname.upper()] else: bitnames = [b.upper() for b in bitname] f = False l = False which = [False] * len(bitnames) if flagname.upper() in maskbits: f = True which = [n in maskbits[flagname.upper()] for n in bitnames] l = sum(which) == len(which) if flagexist and whichexist: return (l, f, which) elif flagexist: return (l, f) elif whichexist: return (l, which) else: return l def sdss_flagname(flagname, flagvalue, concat=False): """Return a list of flag names corresponding to the values. Parameters ---------- flagname : :class:`str` The name of a bitmask group. Not case-sensitive. flagvalue : :class:`long` The value to be converted into bitmask names. concat : :class:`bool`, optional If set to ``True``, the list of names is converted to a space-separated string. Returns ------- :class:`str` or :class:`list` The names of the bitmasks encoded in `flagvalue`. 
Raises ------ :exc:`KeyError` If `flagname` is invalid Examples -------- >>> from pydl.pydlutils.sdss import sdss_flagname >>> sdss_flagname('ANCILLARY_TARGET1',2310346608843161600) # doctest: +REMOTE_DATA ['BRIGHTGAL', 'BLAZGX', 'ELG'] """ global maskbits if maskbits is None: # pragma: no cover maskbits = set_maskbits() flagu = flagname.upper() flagvaluint = np.uint64(flagvalue) one = np.uint64(1) bits = [bit for bit in range(64) if (flagvaluint & (one << np.uint64(bit))) != 0] retval = list() for bit in bits: try: f = [x for x in maskbits[flagu].items() if x[1] == bit] except KeyError: raise KeyError("Unknown flag group {0}!".format(flagu)) if f: retval.append(f[0][0]) if concat: retval = ' '.join(retval) return retval def sdss_flagval(flagname, bitname): """Convert bitmask names into values. Converts human-readable bitmask names into numerical values. The inputs are not case-sensitive; all inputs are converted to upper case internally. Parameters ---------- flagname : :class:`str` The name of a bitmask group. bitname : :class:`str` or :class:`list` The name(s) of the specific bitmask(s) within the `flagname` group. Returns ------- :class:`numpy.uint64` The value of the bitmask name(s). Raises ------ :exc:`KeyError` If `flagname` or `bitname` are invalid names. Examples -------- >>> from pydl.pydlutils.sdss import sdss_flagval >>> sdss_flagval('ANCILLARY_TARGET1',['BLAZGX','ELG','BRIGHTGAL']) # doctest: +REMOTE_DATA 2310346608843161600 """ global maskbits if maskbits is None: # pragma: no cover maskbits = set_maskbits() # # Make sure inlabel is a list # if isinstance(bitname, string_types): bitnames = [bitname.upper()] else: bitnames = [b.upper() for b in bitname] flagu = flagname.upper() flagvalue = np.uint64(0) for bit in bitnames: if flagu in maskbits: if bit in maskbits[flagu]: flagvalue += np.uint64(2)**np.uint64(maskbits[flagu][bit]) else: raise KeyError("Unknown bit label {0} for flag group {1}!".format(bit, flagu)) else: raise KeyError("Unknown flag group {0}!".format(flagu)) return flagvalue def sdss_objid(run, camcol, field, objnum, rerun=301, skyversion=None, firstfield=None): """Convert SDSS photometric identifiers into CAS-style ObjID. Bits are assigned in ObjID thus: ===== ========== =============================================== Bits Name Comment ===== ========== =============================================== 63 empty unassigned 59-62 skyVersion resolved sky version (0-15) 48-58 rerun number of pipeline rerun 32-47 run run number 29-31 camcol camera column (1-6) 28 firstField is this the first field in segment? Usually 0. 16-27 field field number within run 0-15 object object number within field ===== ========== =============================================== Parameters ---------- run, camcol, field, objnum : :class:`int` or array of int Run, camcol, field and object number within field. If arrays are passed, all must have the same length. rerun, skyversion, firstfield : :class:`int` or array of int, optional `rerun`, `skyversion` and `firstfield` usually don't change at all, especially for ObjIDs in DR8 and later. If supplied, make sure the size matches all the other values. Returns ------- :class:`numpy.ndarray` of :class:`numpy.int64` The ObjIDs of the objects. Raises ------ :exc:`ValueError` If the sizes of the arrays don't match or if the array values are out of bounds. Notes ----- * The ``firstField`` flag is never set in ObjIDs from DR8 and later. * On 32-bit systems, makes sure to explicitly declare all inputs as 64-bit integers. 
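    * As a worked example of the bit layout above, the ObjID shown in the
      Examples section decomposes as ``(2 << 59) | (301 << 48) |
      (3704 << 32) | (3 << 29) | (0 << 28) | (91 << 16) | 146 ==
      1237661382772195474`` (skyVersion 2, rerun 301, run 3704, camcol 3,
      firstField 0, field 91, object 146).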
Examples -------- >>> from pydl.pydlutils.sdss import sdss_objid >>> print(sdss_objid(3704,3,91,146)) [1237661382772195474] """ if skyversion is None: skyversion = default_skyversion() if firstfield is None: firstfield = 0 if isinstance(run, int): run = np.array([run], dtype=np.int64) if isinstance(camcol, int): camcol = np.array([camcol], dtype=np.int64) if isinstance(field, int): field = np.array([field], dtype=np.int64) if isinstance(objnum, int): objnum = np.array([objnum], dtype=np.int64) if isinstance(rerun, int): if rerun == 301: rerun = np.zeros(run.shape, dtype=np.int64) + 301 else: rerun = np.array([rerun], dtype=np.int64) if isinstance(skyversion, int): if skyversion == default_skyversion(): skyversion = np.zeros(run.shape, dtype=np.int64) + default_skyversion() else: skyversion = np.array([skyversion], dtype=np.int64) if isinstance(firstfield, int): if firstfield == 0: firstfield = np.zeros(run.shape, dtype=np.int64) else: firstfield = np.array([firstfield], dtype=np.int64) # # Check that all inputs have the same shape. # if run.shape != camcol.shape: raise ValueError("camcol.shape does not match run.shape!") if run.shape != field.shape: raise ValueError("field.shape does not match run.shape!") if run.shape != objnum.shape: raise ValueError("objnum.shape does not match run.shape!") if run.shape != rerun.shape: raise ValueError("rerun.shape does not match run.shape!") if run.shape != skyversion.shape: raise ValueError("skyversion.shape does not match run.shape!") if run.shape != firstfield.shape: raise ValueError("firstfield.shape does not match run.shape!") # # Check ranges of parameters # if ((firstfield < 0) | (firstfield > 1)).any(): raise ValueError("firstfield values are out-of-bounds!") if ((skyversion < 0) | (skyversion >= 16)).any(): raise ValueError("skyversion values are out-of-bounds!") if ((rerun < 0) | (rerun >= 2**11)).any(): raise ValueError("rerun values are out-of-bounds!") if ((run < 0) | (run >= 2**16)).any(): raise ValueError("run values are out-of-bounds!") if ((camcol < 1) | (camcol > 6)).any(): raise ValueError("camcol values are out-of-bounds!") if ((field < 0) | (field >= 2**12)).any(): raise ValueError("field values are out-of-bounds!") if ((objnum < 0) | (objnum >= 2**16)).any(): raise ValueError("id values are out-of-bounds!") # # Compute the objid # objid = ((skyversion << 59) | (rerun << 48) | (run << 32) | (camcol << 29) | (firstfield << 28) | (field << 16) | (objnum)) return objid def sdss_specobjid(plate, fiber, mjd, run2d, line=None, index=None): """Convert SDSS spectrum identifiers into CAS-style specObjID. Bits are assigned in specObjID thus: ===== ========== ============================================================= Bits Name Comment ===== ========== ============================================================= 50-63 Plate ID 14 bits 38-49 Fiber ID 12 bits 24-37 MJD Date plate was observed minus 50000 (14 bits) 10-23 run2d Spectroscopic reduction version 0-9 line/index 0 for use in SpecObj files see below for other uses (10 bits) ===== ========== ============================================================= Parameters ---------- plate, fiber, mjd : :class:`int` or array of int Plate, fiber ID, and MJD for a spectrum. If arrays are passed, all must have the same length. The MJD value must be greater than 50000. run2d : :class:`int`, :class:`str` or array of int or str The run2d value must be an integer or a string of the form 'vN_M_P'. If an array is passed, it must have the same length as the other inputs listed above.
If the string form is used, the values are restricted to :math:`5 \le N \le 6`, :math:`0 \le M \le 99`, :math:`0 \le P \le 99`. line : :class:`int`, optional A line index, only used for defining specObjID for SpecLine files. `line` and `index` cannot both be non-zero. index : :class:`int`, optional An index measure, only used for defining specObjID for SpecLineIndex files. `line` and `index` cannot both be non-zero. Returns ------- :class:`numpy.ndarray` of :class:`numpy.uint64` The specObjIDs of the objects. Raises ------ :exc:`ValueError` If the sizes of the arrays don't match or if the array values are out of bounds. Notes ----- * On 32-bit systems, make sure to explicitly declare all inputs as 64-bit integers. * This function defines the SDSS-III/IV version of specObjID, used for SDSS DR8 and subsequent data releases. It is not compatible with SDSS DR7 or earlier. * If the string form of `run2d` is used, the bits are assigned by the formula :math:`(N - 5) \\times 10000 + M \\times 100 + P`. Examples -------- >>> from pydl.pydlutils.sdss import sdss_specobjid >>> print(sdss_specobjid(4055,408,55359,'v5_7_0')) [4565636362342690816] """ if line is not None and index is not None: raise ValueError("line and index inputs cannot both be non-zero!") if isinstance(plate, int): plate = np.array([plate], dtype=np.uint64) if isinstance(fiber, int): fiber = np.array([fiber], dtype=np.uint64) if isinstance(mjd, int): mjd = np.array([mjd], dtype=np.uint64) - 50000 if isinstance(run2d, str): try: run2d = np.array([int(run2d)], dtype=np.uint64) except ValueError: # Try a "vN_M_P" string. m = re.match(r'v(\d+)_(\d+)_(\d+)', run2d) if m is None: raise ValueError("Could not extract integer run2d value!") else: N, M, P = m.groups() run2d = np.array([(int(N) - 5)*10000 + int(M) * 100 + int(P)], dtype=np.uint64) elif isinstance(run2d, int): run2d = np.array([run2d], dtype=np.uint64) if line is None: line = np.zeros(plate.shape, dtype=plate.dtype) else: if isinstance(line, int): line = np.array([line], dtype=np.uint64) if index is None: index = np.zeros(plate.shape, dtype=plate.dtype) else: if isinstance(index, int): index = np.array([index], dtype=np.uint64) # # Check that all inputs have the same shape. # if plate.shape != fiber.shape: raise ValueError("fiber.shape does not match plate.shape!") if plate.shape != mjd.shape: raise ValueError("mjd.shape does not match plate.shape!") if plate.shape != run2d.shape: raise ValueError("run2d.shape does not match plate.shape!") if plate.shape != line.shape: raise ValueError("line.shape does not match plate.shape!") if plate.shape != index.shape: raise ValueError("index.shape does not match plate.shape!") # # Check ranges of parameters # if ((plate < 0) | (plate >= 2**14)).any(): raise ValueError("plate values are out-of-bounds!") if ((fiber < 0) | (fiber >= 2**12)).any(): raise ValueError("fiber values are out-of-bounds!") if ((mjd < 0) | (mjd >= 2**14)).any(): raise ValueError("MJD values are out-of-bounds!") if ((run2d < 0) | (run2d >= 2**14)).any(): raise ValueError("run2d values are out-of-bounds!") if ((line < 0) | (line >= 2**10)).any(): raise ValueError("line values are out-of-bounds!") if ((index < 0) | (index >= 2**10)).any(): raise ValueError("index values are out-of-bounds!") # # Compute the specObjID # specObjID = ((plate << 50) | (fiber << 38) | (mjd << 24) | (run2d << 10) | (line | index)) return specObjID def sdss_sweep_circle(ra, dec, radius, stype='star', allobj=False): """Read the SDSS datasweep files and return objects around a location.
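The search position is matched against the datasweep index (padded by 0.36 degrees to allow for the extent of each field), the corresponding calibObj files are read run by run, and only objects within `radius` are kept; unless `allobj` is set, the results are further restricted to SURVEY_PRIMARY objects.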
Parameters ---------- ra, dec : :class:`float` The sky location to search, J2000 degrees. radius : :class:`float` The radius around `ra`, `dec` to search. stype : :class:`str`, optional The type of object to search, 'star', 'gal' or 'sky'. The default is 'star'. allobj : :class:`bool`, optional If set to ``True``, return all objects found, not just SURVEY_PRIMARY. Returns ------- :class:`numpy.ndarray` The data extracted from the sweep files. Raises ------ :exc:`PydlutilsException` If :envvar:`PHOTO_SWEEP` is not set. Notes ----- Assumes that the sweep files exist in :envvar:`PHOTO_SWEEP` and that index files have been created. """ global sweep_cache # # Check values # if stype not in ('star', 'gal', 'sky'): raise ValueError('Invalid type {0}!'.format(stype)) sweepdir = os.getenv('PHOTO_SWEEP') if sweepdir is None: raise PydlutilsException('PHOTO_SWEEP is not set!') # # Read the index # if sweep_cache[stype] is None: indexfile = os.path.join(sweepdir, 'datasweep-index-{0}.fits'.format(stype)) with fits.open(indexfile) as f: sweep_cache[stype] = f[1].data index = sweep_cache[stype] # # Match # ira = np.array([ra]) idec = np.array([dec]) m1, m2, d12 = spherematch(index['RA'], index['DEC'], ira, idec, radius+0.36, maxmatch=0) if len(m1) == 0: return None if not allobj: w = index['NPRIMARY'][m1] > 0 if w.any(): m1 = m1[w] else: return None # # Maximum number of objects # if allobj: n = index['IEND'][m1] - index['ISTART'][m1] + 1 ntot = (where(n > 0, n, np.zeros(n.shape, dtype=n.dtype))).sum() else: ntot = index['NPRIMARY'][m1].sum() # # Find unique run + camcol # rc = index['RUN'][m1]*6 + index['CAMCOL'][m1] - 1 isort = rc.argsort() iuniq = uniq(rc[isort]) istart = 0 objs = None nobjs = 0 for i in range(len(iuniq)): iend = iuniq[i] icurr = isort[istart:iend] # # Determine which file and range of rows # run = index['RUN'][m1[icurr[0]]] camcol = index['CAMCOL'][m1[icurr[0]]] rerun = index['RERUN'][m1[icurr[0]]] fields = index[m1[icurr]] ist = fields['ISTART'].min() ind = fields['IEND'].max() if ind >= ist: # # Read in the rows of that file # swfile = os.path.join(os.getenv('PHOTO_SWEEP'), rerun, 'calibObj-{0:06d}-{1:1d}-{2}.fits.gz'.format( int(run), int(camcol), stype)) with fits.open(swfile) as f: tmp_objs = f[1].data[ist:ind] if tmp_objs.size > 0: # # Keep only objects within the desired radius # tm1, tm2, d12 = spherematch(tmp_objs['RA'], tmp_objs['DEC'], ira, idec, radius, maxmatch=0) if len(tm1) > 0: tmp_objs = tmp_objs[tm1] # # Keep only SURVEY_PRIMARY objects by default # if not allobj: w = ((tmp_objs['RESOLVE_STATUS'] & sdss_flagval('RESOLVE_STATUS', 'SURVEY_PRIMARY')) > 0) if w.any(): tmp_objs = tmp_objs[w] else: tmp_objs = None if tmp_objs is not None: if objs is None: objs = np.zeros(ntot, dtype=tmp_objs.dtype) objs[nobjs:nobjs+tmp_objs.size] = tmp_objs nobjs += tmp_objs.size istart = iend+1 if nobjs > 0: return objs[0:nobjs] else: return None def set_maskbits(idlutils_version='v5_5_24', maskbits_file=None): """Populate the maskbits cache. Parameters ---------- idlutils_version : :class:`str`, optional Fetch the sdssMaskbits.par file corresponding to this idlutils version. maskbits_file : :class:`str`, optional Use an explicit file instead of downloading the official version. This should only be used for tests. Returns ------- :class:`dict` A dictionary of bitmasks suitable for caching. Raises ------ :exc:`URLError` If the data file could not be retrieved. 
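Notes ----- The returned dictionary maps each flag group name to a dictionary of bit label to bit number, so that ``maskbits[group][label]`` is the bit offset used by :func:`sdss_flagval` and related functions.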
""" from astropy.utils.data import download_file from .yanny import yanny if maskbits_file is None: # pragma: no cover if (idlutils_version == 'trunk' or idlutils_version.startswith('branches/')): iversion = idlutils_version else: iversion = 'tags/'+idlutils_version baseurl = ('https://svn.sdss.org/public/repo/sdss/idlutils/' + '{0}/data/sdss/sdssMaskbits.par').format(iversion) filename = download_file(baseurl, cache=True, show_progress=False) else: filename = maskbits_file maskfile = yanny(filename, raw=True) # # Parse the file & cache the results in maskbits # maskbits = dict() for k in range(maskfile.size('MASKBITS')): if maskfile['MASKBITS']['flag'][k] in maskbits: maskbits[maskfile['MASKBITS']['flag'][k]][maskfile['MASKBITS']['label'][k]] = maskfile['MASKBITS']['bit'][k] else: maskbits[maskfile['MASKBITS']['flag'][k]] = {maskfile['MASKBITS']['label'][k]: maskfile['MASKBITS']['bit'][k]} if 'MASKALIAS' in maskfile: for k in range(maskfile.size('MASKALIAS')): maskbits[maskfile['MASKALIAS']['alias'][k]] = maskbits[maskfile['MASKALIAS']['flag'][k]].copy() return maskbits def unwrap_specobjid(specObjID, run2d_integer=False, specLineIndex=False): """Unwrap CAS-style specObjID into plate, fiber, mjd, run2d. See :func:`~pydl.pydlutils.sdss.sdss_specobjid` for details on how the bits within a specObjID are assigned. Parameters ---------- specObjID : :class:`numpy.ndarray` An array containing 64-bit integers or strings. If strings are passed, they will be converted to integers internally. run2d_integer : :class:`bool`, optional If ``True``, do *not* attempt to convert the encoded run2d values to a string of the form 'vN_M_P'. specLineIndex : :class:`bool`, optional If ``True`` interpret any low-order bits as being an 'index' rather than a 'line'. Returns ------- :class:`numpy.recarray` A record array with the same length as `specObjID`, with the columns 'plate', 'fiber', 'mjd', 'run2d', 'line'. Examples -------- >>> from numpy import array, uint64 >>> from pydl.pydlutils.sdss import unwrap_specobjid >>> unwrap_specobjid(array([4565636362342690816], dtype=uint64)) rec.array([(4055, 408, 55359, 'v5_7_0', 0)], dtype=[('plate', '> 50, 2**14 - 1) unwrap.fiber = np.bitwise_and(tempobjid >> 38, 2**12 - 1) unwrap.mjd = np.bitwise_and(tempobjid >> 24, 2**14 - 1) + 50000 run2d = np.bitwise_and(tempobjid >> 10, 2**14 - 1) if run2d_integer: unwrap.run2d = run2d else: N = ((run2d // 10000) + 5).tolist() M = ((run2d % 10000) // 100).tolist() P = (run2d % 100).tolist() unwrap.run2d = ['v{0:d}_{1:d}_{2:d}'.format(n, m, p) for n, m, p in zip(N, M, P)] unwrap[line] = np.bitwise_and(tempobjid, 2**10 - 1) return unwrap pydl-0.7.0/pydl/pydlutils/spheregroup.py0000644000076500000240000006305013434104050021016 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the spheregroup directory in idlutils. """ from warnings import warn import numpy as np from six import string_types from . import PydlutilsException, PydlutilsUserWarning from ..goddard.astro import gcirc class chunks(object): """chunks class Functions for creating and manipulating spherical chunks are implemented as methods on this class. """ def __init__(self, ra, dec, minSize): """Init creates an object whose attributes are similar those created by the setchunks() function in the spheregroup library. 
""" # # Save the value of minSize # self.minSize = minSize # # Find maximum and minimum dec (in degrees) # decMin = dec.min() decMax = dec.max() decRange = decMax - decMin # # Find the declination boundaries; make them an integer multiple of # minSize, with extra room (one cell) on the edges. # self.nDec = 3 + int(np.floor(decRange/minSize)) decRange = minSize*float(self.nDec) decMin = decMin - 0.5*(decRange - decMax + decMin) decMax = decMin + decRange if decMin < -90.0 + 3.0*minSize: decMin = -90.0 if decMax > 90.0 - 3.0*minSize: decMax = 90.0 self.decBounds = decMin + ((decMax - decMin) * np.arange(self.nDec + 1, dtype='d'))/float(self.nDec) # # Find ra offset which minimizes the range in ra (this should take care # of the case that ra crosses zero in some parts # if abs(self.decBounds[self.nDec]) > abs(self.decBounds[0]): cosDecMin = np.cos(np.deg2rad(self.decBounds[self.nDec])) else: cosDecMin = np.cos(np.deg2rad(self.decBounds[0])) if cosDecMin <= 0.0: raise PydlutilsException("cosDecMin={0:f} not positive in setchunks().".format(cosDecMin)) self.raRange, self.raOffset = self.rarange(ra, minSize/cosDecMin) self.raMin, self.raMax = self.getraminmax(ra, self.raOffset) # # Isn't this redundant? # self.raRange = self.raMax - self.raMin # # For each declination slice, find the number of ra divisions # necessary and set them # self.raBounds = list() self.nRa = list() for i in range(self.nDec): # # Get maximum declination and its cosine # if abs(self.decBounds[i]) > abs(self.decBounds[i+1]): cosDecMin = np.cos(np.deg2rad(self.decBounds[i])) else: cosDecMin = np.cos(np.deg2rad(self.decBounds[i+1])) if cosDecMin <= 0.0: raise PydlutilsException("cosDecMin={0:f} not positive in setchunks().".format(cosDecMin)) # # Get raBounds array for this declination array, leave an extra # cell on each end # self.nRa.append(3 + int(np.floor(cosDecMin*self.raRange/minSize))) raRangeTmp = minSize*float(self.nRa[i])/cosDecMin raMinTmp = self.raMin - 0.5*(raRangeTmp-self.raMax+self.raMin) raMaxTmp = raMinTmp + raRangeTmp # # If we cannot avoid the 0/360 point, embrace it # if (raRangeTmp >= 360.0 or raMinTmp <= minSize/cosDecMin or raMaxTmp >= 360.0 - minSize/cosDecMin or abs(self.decBounds[i]) == 90.0): raMinTmp = 0.0 raMaxTmp = 360.0 raRangeTmp = 360.0 if self.decBounds[i] == -90.0 or self.decBounds[i+1] == 90.0: self.nRa[i] = 1 self.raBounds.append(raMinTmp + (raMaxTmp - raMinTmp) * np.arange(self.nRa[i] + 1, dtype='d') / float(self.nRa[i])) # # Create an empty set of lists to hold the output of self.assign() # self.chunkList = [[list() for j in range(self.nRa[i])] for i in range(self.nDec)] # # nChunkMax will be the length of the largest list in chunkList # it is computed by chunks.assign() # self.nChunkMax = 0 return def rarange(self, ra, minSize): """Finds the offset which yields the smallest raRange & returns both. Notes ----- .. warning:: This is not (yet) well-defined for the case of only one point. """ NRA = 6 raRangeMin = 361.0 raOffset = 0.0 EPS = 1.0e-5 for j in range(NRA): raMin, raMax = self.getraminmax(ra, 360.0*float(j)/float(NRA)) raRange = raMax-raMin if (2.0*(raRange-raRangeMin)/(raRange+raRangeMin) < -EPS and raMin > minSize and raMax < 360.0 - minSize): raRangeMin = raRange raOffset = 360.0*float(j)/float(NRA) return (raRangeMin, raOffset) def getraminmax(self, ra, raOffset): """Utility function used by rarange. """ currRa = np.fmod(ra + raOffset, 360.0) return (currRa.min(), currRa.max()) def cosDecMin(self, i): """Frequently used utility function. 
""" if abs(self.decBounds[i]) > abs(self.decBounds[i+1]): return np.cos(np.deg2rad(self.decBounds[i])) else: return np.cos(np.deg2rad(self.decBounds[i+1])) def assign(self, ra, dec, marginSize): """Take the objects and the chunks (already defined in the constructor) and assign the objects to the appropriate chunks, with some leeway given by the parameter marginSize. Basically, at the end, each chunk should be associated with a list of the objects that belong to it. """ if marginSize >= self.minSize: raise PydlutilsException("marginSize>=minSize ({0:f}={1:f}) in chunks.assign().".format(marginSize, self.minSize)) chunkDone = [[False for j in range(self.nRa[i])] for i in range(self.nDec)] for i in range(ra.size): currRa = np.fmod(ra[i] + self.raOffset, 360.0) try: raChunkMin, raChunkMax, decChunkMin, decChunkMax = self.getbounds(currRa, dec[i], marginSize) except PydlutilsException: continue # # Reset chunkDone. This is silly, but is necessary to # reproduce the logic. # for decChunk in range(decChunkMin, decChunkMax+1): for raChunk in range(raChunkMin[decChunk-decChunkMin]-1, raChunkMax[decChunk-decChunkMin]+2): if raChunk < 0: currRaChunk = (raChunk+self.nRa[decChunk]) % self.nRa[decChunk] elif raChunk > self.nRa[decChunk]-1: currRaChunk = (raChunk-self.nRa[decChunk]) % self.nRa[decChunk] else: currRaChunk = raChunk if currRaChunk >= 0 and currRaChunk <= self.nRa[decChunk]-1: chunkDone[decChunk][currRaChunk] = False for decChunk in range(decChunkMin, decChunkMax+1): for raChunk in range(raChunkMin[decChunk-decChunkMin], raChunkMax[decChunk-decChunkMin]+1): if raChunk < 0: currRaChunk = (raChunk+self.nRa[decChunk]) % self.nRa[decChunk] elif raChunk > self.nRa[decChunk]-1: currRaChunk = (raChunk-self.nRa[decChunk]) % self.nRa[decChunk] else: currRaChunk = raChunk if currRaChunk >= 0 and currRaChunk <= self.nRa[decChunk]-1: if not chunkDone[decChunk][currRaChunk]: self.chunkList[decChunk][currRaChunk].append(i) # # Update nChunkMax # if len(self.chunkList[decChunk][currRaChunk]) > self.nChunkMax: self.nChunkMax = len(self.chunkList[decChunk][currRaChunk]) chunkDone[decChunk][currRaChunk] = True return def getbounds(self, ra, dec, marginSize): """Find the set of chunks a point (with margin) belongs to. 
""" # # Find the declination slice without regard to marginSize # decChunkMin = int(np.floor((dec - self.decBounds[0]) * float(self.nDec) / (self.decBounds[self.nDec]-self.decBounds[0]))) decChunkMax = decChunkMin if decChunkMin < 0 or decChunkMin > self.nDec - 1: raise PydlutilsException("decChunkMin out of range in chunks.getbounds().") # # Set minimum and maximum bounds of dec # while dec - self.decBounds[decChunkMin] < marginSize and decChunkMin > 0: decChunkMin -= 1 while self.decBounds[decChunkMax+1] - dec < marginSize and decChunkMax < self.nDec - 1: decChunkMax += 1 # # Find ra chunk bounds for each dec chunk # raChunkMin = np.zeros(decChunkMax-decChunkMin+1, dtype='i4') raChunkMax = np.zeros(decChunkMax-decChunkMin+1, dtype='i4') for i in range(decChunkMin, decChunkMax+1): cosDecMin = self.cosDecMin(i) raChunkMin[i-decChunkMin] = int(np.floor((ra - self.raBounds[i][0]) * float(self.nRa[i]) / (self.raBounds[i][self.nRa[i]] - self.raBounds[i][0]))) raChunkMax[i-decChunkMin] = raChunkMin[i-decChunkMin] if raChunkMin[i-decChunkMin] < 0 or raChunkMin[i-decChunkMin] > self.nRa[i]-1: raise PydlutilsException("raChunkMin out of range in chunks.getbounds().") # # Set minimum and maximum bounds of ra # raCheck = raChunkMin[i-decChunkMin] keepGoing = True while keepGoing and raCheck > -1: if raCheck >= 0 and raCheck < self.nRa[i]: keepGoing = (ra - self.raBounds[i][raCheck])*cosDecMin < marginSize else: keepGoing = False if keepGoing: raCheck -= 1 raChunkMin[i-decChunkMin] = raCheck raCheck = raChunkMax[i-decChunkMin] keepGoing = True while keepGoing and raCheck < self.nRa[i]: if raCheck >= 0 and raCheck < self.nRa[i]: keepGoing = (self.raBounds[i][raCheck+1]-ra)*cosDecMin < marginSize else: keepGoing = False if keepGoing: raCheck += 1 raChunkMax[i-decChunkMin] = raCheck return (raChunkMin, raChunkMax, decChunkMin, decChunkMax) def get(self, ra, dec): """Find the chunk to which a given point belongs. """ # # Find dec chunk # decChunk = int(np.floor((dec - self.decBounds[0]) * float(self.nDec) / (self.decBounds[self.nDec]-self.decBounds[0]))) # # Find ra chunk # if decChunk < self.nDec and decChunk >= 0: raChunk = int(np.floor((ra - self.raBounds[decChunk][0]) * float(self.nRa[decChunk]) / (self.raBounds[decChunk][self.nRa[decChunk]] - self.raBounds[decChunk][0]))) if raChunk < 0 or raChunk > self.nRa[decChunk]-1: raise PydlutilsException("raChunk out of range in chunks.get()") else: raChunk = -1 return (raChunk, decChunk) def friendsoffriends(self, ra, dec, linkSep): """Friends-of-friends using chunked data. """ nPoints = ra.size inGroup = np.zeros(nPoints, dtype='i4') - 1 # # mapGroups contains an equivalency mapping of groups. mapGroup[i]=j # means i and j are actually the same group. j<=i always, by design. 
# The largest number of groups you can get # (assuming linkSep < marginSize < minSize) is 9 times the number of # targets # mapGroups = np.zeros(9*nPoints, dtype='i4') - 1 nMapGroups = 0 for i in range(self.nDec): for j in range(self.nRa[i]): if len(self.chunkList[i][j]) > 0: chunkGroup = self.chunkfriendsoffriends(ra, dec, self.chunkList[i][j], linkSep) for k in range(chunkGroup.nGroups): minEarly = 9*nPoints l = chunkGroup.firstGroup[k] while l != -1: if inGroup[self.chunkList[i][j][l]] != -1: checkEarly = inGroup[self.chunkList[i][j][l]] while mapGroups[checkEarly] != checkEarly: checkEarly = mapGroups[checkEarly] minEarly = min(minEarly, checkEarly) else: inGroup[self.chunkList[i][j][l]] = nMapGroups l = chunkGroup.nextGroup[l] if minEarly == 9*nPoints: mapGroups[nMapGroups] = nMapGroups else: mapGroups[nMapGroups] = minEarly l = chunkGroup.firstGroup[k] while l != -1: checkEarly = inGroup[self.chunkList[i][j][l]] while mapGroups[checkEarly] != checkEarly: tmpEarly = mapGroups[checkEarly] mapGroups[checkEarly] = minEarly checkEarly = tmpEarly mapGroups[checkEarly] = minEarly l = chunkGroup.nextGroup[l] nMapGroups += 1 # # Now all groups which are mapped to themselves are the real groups # Make sure the mappings are set up to go all the way down. # nGroups = 0 for i in range(nMapGroups): if mapGroups[i] != -1: if mapGroups[i] == i: mapGroups[i] = nGroups nGroups += 1 else: mapGroups[i] = mapGroups[mapGroups[i]] else: raise PydlutilsException("MapGroups[{0:d}]={1:d} in chunks.friendsoffriends().".format(i, mapGroups[i])) for i in range(nPoints): inGroup[i] = mapGroups[inGroup[i]] firstGroup = np.zeros(nPoints, dtype='i4') - 1 nextGroup = np.zeros(nPoints, dtype='i4') - 1 multGroup = np.zeros(nPoints, dtype='i4') for i in range(nPoints-1, -1, -1): nextGroup[i] = firstGroup[inGroup[i]] firstGroup[inGroup[i]] = i for i in range(nGroups): j = firstGroup[i] while j != -1: multGroup[i] += 1 j = nextGroup[j] return (inGroup, multGroup, firstGroup, nextGroup, nGroups) def chunkfriendsoffriends(self, ra, dec, chunkList, linkSep): """Does friends-of-friends on the ra, dec that are defined by chunkList. """ # # Convert ra, dec into something that can be digested by the # groups object. # x = np.deg2rad(np.vstack((ra[chunkList], dec[chunkList]))) radLinkSep = np.deg2rad(linkSep) group = groups(x, radLinkSep, 'sphereradec') return group class groups(object): """Group a set of objects (a list of coordinates in some space) based on a friends-of-friends algorithm. """ @staticmethod def euclid(x1, x2): """Pythagorean theorem in Euclidean space with arbitrary number of dimensions. """ return np.sqrt(((x1-x2)**2).sum()) @staticmethod def sphereradec(x1, x2): """Separation of two points on a 2D-sphere, assuming they are in longitude-latitude or right ascension-declination form. Assumes everything is already in radians. """ return gcirc(x1[0], x1[1], x2[0], x2[1], units=0) def __init__(self, coordinates, distance, separation='euclid'): """Init creates an object and performs the friends-of-friends algorithm. The coordinates can have arbitrary dimensions, with each row representing one of the dimensions. Each column defines an object. If separation is not defined it defaults to Euclidean space.
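After construction, the results are available as the attributes ``nGroups``, ``nTargets``, ``inGroup``, ``multGroup``, ``firstGroup`` and ``nextGroup``.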
""" # # Find a separation function # if callable(separation): self.separation = separation elif isinstance(separation, string_types): if separation == 'euclid': self.separation = self.euclid elif separation == 'sphereradec': self.separation = self.sphereradec else: raise PydlutilsException("Unknown separation function: {0}.".format(separation)) else: raise PydlutilsException("Improper type for separation!") # # Save information about the coordinates. # nGroups = 0 nTargets = coordinates.shape[1] multGroup = np.zeros(nTargets, dtype='i4') firstGroup = np.zeros(nTargets, dtype='i4') - 1 nextGroup = np.zeros(nTargets, dtype='i4') - 1 inGroup = np.arange(nTargets, dtype='i4') # # Find all the other targets associated with each target # for i in range(nTargets): nTmp = 0 minGroup = nGroups for j in range(nTargets): sep = self.separation(coordinates[:, i], coordinates[:, j]) if sep <= distance: multGroup[nTmp] = j minGroup = min(minGroup, inGroup[j]) nTmp += 1 # # Use this minimum for all # for j in range(nTmp): if inGroup[multGroup[j]] < nTargets: k = firstGroup[inGroup[multGroup[j]]] while k != -1: inGroup[k] = minGroup k = nextGroup[k] inGroup[multGroup[j]] = minGroup # # If it is a new group (no earlier groups), increment nGroups # if minGroup == nGroups: nGroups += 1 for j in range(i+1): firstGroup[j] = -1 for j in range(i, -1, -1): nextGroup[j] = firstGroup[inGroup[j]] firstGroup[inGroup[j]] = j # # Renumber to get rid of the numbers which were skipped # renumbered = np.zeros(nTargets, dtype='bool') nTmp = nGroups nGroups = 0 for i in range(nTargets): if not renumbered[i]: j = firstGroup[inGroup[i]] while j != -1: inGroup[j] = nGroups renumbered[j] = True j = nextGroup[j] nGroups += 1 # # Reset the values of firstGroup and inGroup # firstGroup[:] = -1 for i in range(nTargets-1, -1, -1): nextGroup[i] = firstGroup[inGroup[i]] firstGroup[inGroup[i]] = i # # Get the multiplicity # for i in range(nGroups): multGroup[i] = 0 j = firstGroup[i] while j != -1: multGroup[i] += 1 j = nextGroup[j] # # Set attributes # self.nGroups = nGroups self.nTargets = nTargets self.inGroup = inGroup self.multGroup = multGroup self.firstGroup = firstGroup self.nextGroup = nextGroup return def spheregroup(ra, dec, linklength, chunksize=None): """Perform friends-of-friends grouping given ra/dec coordinates. Parameters ---------- ra, dec : :class:`numpy.ndarray` Arrays of coordinates to group in decimal degrees. linklength : :class:`float` Linking length for the groups in decimal degrees. chunksize : :class:`float`, optional Break up the sphere into chunks of this size in decimal degrees. Returns ------- :func:`tuple` A tuple containing the group number of each object, the multiplicity of each group, the first member of each group, and the next member of the group for each object. Raises ------ :exc:`PydlutilsException` If the array of coordinates only contains one point. Notes ----- It is important that `chunksize` >= 4 * `linklength`. This is enforced. .. warning:: Behavior at the poles is not well tested. 
""" npoints = ra.size if npoints == 1: raise PydlutilsException("Cannot group only one point!") # # Define the chunksize # if chunksize is not None: if chunksize < 4.0*linklength: chunksize = 4.0*linklength warn("chunksize changed to {0:.2f}.".format(chunksize), PydlutilsUserWarning) else: chunksize = max(4.0*linklength, 0.1) # # Initialize chunks # chunk = chunks(ra, dec, chunksize) chunk.assign(ra, dec, linklength) # # Run friends-of-friends # ingroup, multgroup, firstgroup, nextgroup, ngroups = chunk.friendsoffriends(ra, dec, linklength) # # Renumber the groups in order of appearance # renumbered = np.zeros(npoints, dtype='bool') iclump = 0 for i in range(npoints): if not renumbered[i]: j = firstgroup[ingroup[i]] while j != -1: ingroup[j] = iclump renumbered[j] = True j = nextgroup[j] iclump += 1 # # Reset the index lists # firstgroup[:] = -1 for i in range(npoints-1, -1, -1): nextgroup[i] = firstgroup[ingroup[i]] firstgroup[ingroup[i]] = i # # Reset the multiplicities # multgroup[:] = 0 for i in range(ngroups): j = firstgroup[i] while j != -1: multgroup[i] += 1 j = nextgroup[j] return (ingroup, multgroup, firstgroup, nextgroup) def spherematch(ra1, dec1, ra2, dec2, matchlength, chunksize=None, maxmatch=1): """Match points on a sphere. Parameters ---------- ra1, dec1, ra2, dec2 : :class:`numpy.ndarray` The sets of coordinates to match. Assumed to be in decimal degrees matchlength : :class:`float` Two points closer than this separation are matched. Assumed to be in decimal degrees. chunksize : :class:`float`, optional Value to pass to chunk assignment. maxmatch : :class:`int`, optional Allow up to `maxmatch` matches per coordinate. Default 1. If set to zero, All possible matches will be returned. Returns ------- :func:`tuple` A tuple containing the indices into the first set of points, the indices into the second set of points and the match distance in decimal degrees. Notes ----- If you have sets of coordinates that differ in size, call this function with the larger list first. This exploits the inherent asymmetry in the underlying code to reduce memory use. .. warning:: Behavior at the poles is not well tested. 
""" # # Set default values # if chunksize is None: chunksize = max(4.0*matchlength, 0.1) # # Check input size # if ra1.size == 1: raise PydlutilsException("Change the order of the sets of coordinates!") # # Initialize chunks # chunk = chunks(ra1, dec1, chunksize) chunk.assign(ra2, dec2, matchlength) # # Create return arrays # match1 = list() match2 = list() distance12 = list() for i in range(ra1.size): currra = np.fmod(ra1[i]+chunk.raOffset, 360.0) rachunk, decchunk = chunk.get(currra, dec1[i]) jmax = len(chunk.chunkList[decchunk][rachunk]) if jmax > 0: for j in range(jmax): k = chunk.chunkList[decchunk][rachunk][j] sep = gcirc(ra1[i], dec1[i], ra2[k], dec2[k], units=2)/3600.0 if sep < matchlength: match1.append(i) match2.append(k) distance12.append(sep) # # Sort distances # omatch1 = np.array(match1) omatch2 = np.array(match2) odistance12 = np.array(distance12) s = odistance12.argsort() # # Retain only desired matches # if maxmatch > 0: gotten1 = np.zeros(ra1.size, dtype='i4') gotten2 = np.zeros(ra2.size, dtype='i4') nmatch = 0 for i in range(omatch1.size): if (gotten1[omatch1[s[i]]] < maxmatch and gotten2[omatch2[s[i]]] < maxmatch): gotten1[omatch1[s[i]]] += 1 gotten2[omatch2[s[i]]] += 1 nmatch += 1 match1 = np.zeros(nmatch, dtype='i4') match2 = np.zeros(nmatch, dtype='i4') distance12 = np.zeros(nmatch, dtype='d') gotten1[:] = 0 gotten2[:] = 0 nmatch = 0 for i in range(omatch1.size): if (gotten1[omatch1[s[i]]] < maxmatch and gotten2[omatch2[s[i]]] < maxmatch): gotten1[omatch1[s[i]]] += 1 gotten2[omatch2[s[i]]] += 1 match1[nmatch] = omatch1[s[i]] match2[nmatch] = omatch2[s[i]] distance12[nmatch] = odistance12[s[i]] nmatch += 1 else: match1 = omatch1[s] match2 = omatch2[s] distance12 = odistance12[s] return (match1, match2, distance12) pydl-0.7.0/pydl/pydlutils/data/0000755000076500000240000000000013434104632017014 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlutils/data/filters/0000755000076500000240000000000013434104632020464 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlutils/data/filters/sdss_jun2001_g_atm.dat0000644000076500000240000001535412544033417024473 0ustar weaverstaff00000000000000# The following file is from Jim Gunn, from June 2001. It should be # self-explanatory; for most purposes, you will want to use the second # column. Consider this file preliminary. # # These filter curves have been used to calculate the effective # wavelengths and the qtdl/l (see Chapter 8 of the Black Book) of the # filters; the values are: # # u 3551 0.0171 # g 4686 0.0893 # r 6166 0.0886 # i 7480 0.0591 # z 8932 0.0099 # # Table Caption For Response Functions # # The first column is the wavelength in \AAngstroms. The second column # (respt) is the quantum efficiency on the sky looking through 1.3 # airmasses at APO for a point source. The third column (resbig) is the # QE under these conditions for very large sources (size greater than # about 80 pixels) for which the infrared scattering is negligible. The # only filters for which the infrared scattering has any effect are r and # i; the scattering in the bluer chips is negligible, and the z chips are # not thinned and the phenomenon does not exist. The fourth column # (resnoa) is the response of the third column with {\it no} atmosphere, # and the fifth column is the assumed atmospheric transparency at {\it # one} airmass at APO. The tables were constructed using monochromator # illumination of the camera with a bandpass of about 100 \AA, sampled for # the u filter at 50 \AA intervals and for the others at 100 \AA # intervals. 
These measurements were compared with measured responses of # the component filters and detectors and three additional points were # interpolated using these data, two at the extreme toes and one # additional (in g, r, and i) at the point of the beginning of the sharp # cutoff of the shortpass interference filter. These points are necessary # in order to make spline interpolation of the response data well-behaved. # These spline-interpolated response data were then multiplied by measured # aluminum reflectivities and scaled atmospheric transmission to produce # the tables below. The overall normalization is somewhat uncertain, # but this uncertainty does not affect the shapes. Note, however, that # there has been no attempt to remove the finite resolution of the # monochromator measurements. These tables are the {\it averages} of the # responses for all six of the camera chips with a given filter. The # responses are in general very similar except in the z band, where the # nonuniformity of the infrared rolloff, presumably associated with # varying thickness of the epitaxial layer or perhaps the gate structures # in these thick devices, introduces variations in the effective wavelengths # of the filters of order 100 \AA. We are currently working on better # response functions and will present them when they become available, but # these will suffice for most applications. In all cases the first point # is a measured point, so the grid of wavelengths at which measurements # exist is a subset of the wavelength lists here. # # SDSS Camera g Response Function 89 Points # # lam respt resbig resnoa xatm 3630 0.0000 0.0000 0.0000 0.6476 3655 0.0003 0.0003 0.0005 0.6553 3680 0.0008 0.0008 0.0013 0.6631 3705 0.0013 0.0013 0.0022 0.6702 3730 0.0019 0.0019 0.0030 0.6763 3755 0.0024 0.0024 0.0039 0.6815 3780 0.0034 0.0034 0.0055 0.6863 3805 0.0055 0.0055 0.0087 0.6912 3830 0.0103 0.0103 0.0162 0.6965 3855 0.0194 0.0194 0.0301 0.7023 3880 0.0326 0.0326 0.0500 0.7088 3905 0.0492 0.0492 0.0745 0.7158 3930 0.0686 0.0686 0.1024 0.7235 3955 0.0900 0.0900 0.1324 0.7315 3980 0.1123 0.1123 0.1629 0.7393 4005 0.1342 0.1342 0.1924 0.7464 4030 0.1545 0.1545 0.2191 0.7526 4055 0.1722 0.1722 0.2419 0.7581 4080 0.1873 0.1873 0.2609 0.7631 4105 0.2003 0.2003 0.2767 0.7680 4130 0.2116 0.2116 0.2899 0.7727 4155 0.2214 0.2214 0.3010 0.7774 4180 0.2301 0.2301 0.3105 0.7820 4205 0.2378 0.2378 0.3186 0.7862 4230 0.2448 0.2448 0.3258 0.7902 4255 0.2513 0.2513 0.3324 0.7940 4280 0.2574 0.2574 0.3385 0.7976 4305 0.2633 0.2633 0.3442 0.8013 4330 0.2691 0.2691 0.3496 0.8049 4355 0.2747 0.2747 0.3548 0.8087 4380 0.2801 0.2801 0.3596 0.8124 4405 0.2852 0.2852 0.3640 0.8161 4430 0.2899 0.2899 0.3678 0.8199 4455 0.2940 0.2940 0.3709 0.8235 4480 0.2979 0.2979 0.3736 0.8271 4505 0.3016 0.3016 0.3763 0.8305 4530 0.3055 0.3055 0.3792 0.8337 4555 0.3097 0.3097 0.3827 0.8368 4580 0.3141 0.3141 0.3863 0.8397 4605 0.3184 0.3184 0.3899 0.8425 4630 0.3224 0.3224 0.3931 0.8453 4655 0.3257 0.3257 0.3955 0.8480 4680 0.3284 0.3284 0.3973 0.8505 4705 0.3307 0.3307 0.3986 0.8528 4730 0.3327 0.3327 0.3997 0.8549 4755 0.3346 0.3346 0.4008 0.8568 4780 0.3364 0.3364 0.4019 0.8587 4805 0.3383 0.3383 0.4030 0.8606 4830 0.3403 0.3403 0.4043 0.8625 4855 0.3425 0.3425 0.4057 0.8643 4880 0.3448 0.3448 0.4073 0.8661 4905 0.3472 0.3472 0.4091 0.8678 4930 0.3495 0.3495 0.4110 0.8693 4955 0.3519 0.3519 0.4129 0.8706 4980 0.3541 0.3541 0.4147 0.8719 5005 0.3562 0.3562 0.4165 0.8730 5030 0.3581 0.3581 0.4181 0.8740 5055 0.3597 0.3597 0.4194 0.8750 5080 0.3609 
0.3609 0.4201 0.8759 5105 0.3613 0.3613 0.4201 0.8768 5130 0.3609 0.3609 0.4191 0.8777 5155 0.3595 0.3595 0.4169 0.8785 5180 0.3581 0.3581 0.4147 0.8794 5205 0.3558 0.3558 0.4115 0.8803 5230 0.3452 0.3452 0.3988 0.8813 5255 0.3194 0.3194 0.3684 0.8822 5280 0.2807 0.2807 0.3233 0.8832 5305 0.2339 0.2339 0.2690 0.8842 5330 0.1839 0.1839 0.2112 0.8852 5355 0.1352 0.1352 0.1550 0.8861 5380 0.0911 0.0911 0.1043 0.8869 5405 0.0548 0.0548 0.0627 0.8877 5430 0.0295 0.0295 0.0337 0.8885 5455 0.0166 0.0166 0.0190 0.8891 5480 0.0112 0.0112 0.0128 0.8897 5505 0.0077 0.0077 0.0087 0.8902 5530 0.0050 0.0050 0.0057 0.8907 5555 0.0032 0.0032 0.0037 0.8911 5580 0.0021 0.0021 0.0024 0.8914 5605 0.0015 0.0015 0.0017 0.8917 5630 0.0012 0.0012 0.0014 0.8920 5655 0.0010 0.0010 0.0012 0.8923 5680 0.0009 0.0009 0.0010 0.8926 5705 0.0008 0.0008 0.0009 0.8929 5730 0.0006 0.0006 0.0007 0.8933 5755 0.0005 0.0005 0.0005 0.8938 5780 0.0003 0.0003 0.0003 0.8945 5805 0.0001 0.0001 0.0001 0.8952 5830 0.0000 0.0000 0.0000 0.8962 pydl-0.7.0/pydl/pydlutils/data/filters/sdss_jun2001_r_atm.dat0000644000076500000240000001422312544033417024500 0ustar weaverstaff00000000000000# The following file is from Jim Gunn, from June 2001. It should be # self-explanatory; for most purposes, you will want to use the second # column. Consider this file preliminary. # # These filter curves have been used to calculate the effective # wavelengths and the qtdl/l (see Chapter 8 of the Black Book) of the # filters; the values are: # # u 3551 0.0171 # g 4686 0.0893 # r 6166 0.0886 # i 7480 0.0591 # z 8932 0.0099 # # Table Caption For Response Functions # # The first column is the wavelength in \AAngstroms. The second column # (respt) is the quantum efficiency on the sky looking through 1.3 # airmasses at APO for a point source. The third column (resbig) is the # QE under these conditions for very large sources (size greater than # about 80 pixels) for which the infrared scattering is negligible. The # only filters for which the infrared scattering has any effect are r and # i; the scattering in the bluer chips is negligible, and the z chips are # not thinned and the phenomenon does not exist. The fourth column # (resnoa) is the response of the third column with {\it no} atmosphere, # and the fifth column is the assumed atmospheric transparency at {\it # one} airmass at APO. The tables were constructed using monochromator # illumination of the camera with a bandpass of about 100 \AA, sampled for # the u filter at 50 \AA intervals and for the others at 100 \AA # intervals. These measurements were compared with measured responses of # the component filters and detectors and three additional points were # interpolated using these data, two at the extreme toes and one # additional (in g, r, and i) at the point of the beginning of the sharp # cutoff of the shortpass interference filter. These points are necessary # in order to make spline interpolation of the response data well-behaved. # These spline-interpolated response data were then multiplied by measured # aluminum reflectivities and scaled atmospheric transmission to produce # the tables below. The overall normalization is somewhat uncertain, # but this uncertainty does not affect the shapes. Note, however, that # there has been no attempt to remove the finite resolution of the # monochromator measurements. These tables are the {\it averages} of the # responses for all six of the camera chips with a given filter. 
The # responses are in general very similar except in the z band, where the # nonuniformity of the infrared rolloff, presumably associated with # varying thickness of the epitaxial layer or perhaps the gate structures # in these thick devices, introduces variations in the effective wavelengths # of the filters of order 100 \AA. We are currently working on better # response functions and will present them when they become available, but # these will suffice for most applications. In all cases the first point # is a measured point, so the grid of wavelengths at which measurements # exist is a subset of the wavelength lists here. # # SDSS Camera r Response Function 75 Points # # lam respt resbig resnoa xatm 5380 0.0000 0.0000 0.0000 0.8869 5405 0.0014 0.0014 0.0016 0.8877 5430 0.0099 0.0099 0.0113 0.8885 5455 0.0259 0.0260 0.0297 0.8891 5480 0.0497 0.0498 0.0568 0.8897 5505 0.0807 0.0809 0.0923 0.8902 5530 0.1186 0.1190 0.1356 0.8907 5555 0.1625 0.1630 0.1856 0.8911 5580 0.2093 0.2100 0.2390 0.8914 5605 0.2555 0.2564 0.2917 0.8917 5630 0.2975 0.2986 0.3395 0.8920 5655 0.3326 0.3339 0.3794 0.8923 5680 0.3609 0.3623 0.4116 0.8926 5705 0.3834 0.3849 0.4371 0.8929 5730 0.4010 0.4027 0.4570 0.8933 5755 0.4147 0.4165 0.4723 0.8938 5780 0.4253 0.4271 0.4839 0.8945 5805 0.4333 0.4353 0.4925 0.8952 5830 0.4395 0.4416 0.4990 0.8962 5855 0.4446 0.4467 0.5040 0.8973 5880 0.4489 0.4511 0.5080 0.8986 5905 0.4527 0.4550 0.5112 0.9001 5930 0.4563 0.4587 0.5141 0.9018 5955 0.4599 0.4624 0.5169 0.9037 5980 0.4634 0.4660 0.5194 0.9057 6005 0.4665 0.4692 0.5213 0.9079 6030 0.4689 0.4716 0.5222 0.9103 6055 0.4703 0.4731 0.5220 0.9128 6080 0.4711 0.4740 0.5212 0.9153 6105 0.4717 0.4747 0.5202 0.9177 6130 0.4727 0.4758 0.5197 0.9199 6155 0.4744 0.4776 0.5202 0.9220 6180 0.4767 0.4800 0.5215 0.9238 6205 0.4792 0.4827 0.5233 0.9253 6230 0.4819 0.4854 0.5254 0.9265 6255 0.4844 0.4881 0.5275 0.9275 6280 0.4867 0.4905 0.5294 0.9285 6305 0.4887 0.4926 0.5310 0.9294 6330 0.4902 0.4942 0.5319 0.9305 6355 0.4909 0.4951 0.5320 0.9316 6380 0.4912 0.4955 0.5316 0.9327 6405 0.4912 0.4956 0.5310 0.9337 6430 0.4912 0.4958 0.5305 0.9346 6455 0.4914 0.4961 0.5302 0.9354 6480 0.4915 0.4964 0.5299 0.9363 6505 0.4912 0.4962 0.5290 0.9373 6530 0.4901 0.4953 0.5271 0.9385 6555 0.4878 0.4931 0.5241 0.9395 6580 0.4852 0.4906 0.5211 0.9400 6605 0.4818 0.4873 0.5176 0.9398 6630 0.4697 0.4752 0.5057 0.9386 6655 0.4421 0.4474 0.4775 0.9366 6680 0.4009 0.4059 0.4341 0.9349 6705 0.3499 0.3544 0.3792 0.9345 6730 0.2924 0.2963 0.3162 0.9366 6755 0.2318 0.2350 0.2488 0.9421 6780 0.1715 0.1739 0.1824 0.9492 6805 0.1152 0.1168 0.1225 0.9494 6830 0.0687 0.0697 0.0747 0.9334 6855 0.0380 0.0386 0.0430 0.9057 6880 0.0212 0.0215 0.0247 0.8862 6905 0.0134 0.0136 0.0155 0.8893 6930 0.0099 0.0101 0.0112 0.9083 6955 0.0076 0.0077 0.0083 0.9311 6980 0.0055 0.0056 0.0059 0.9450 7005 0.0039 0.0039 0.0041 0.9464 7030 0.0027 0.0028 0.0029 0.9561 7055 0.0020 0.0020 0.0021 0.9709 7080 0.0015 0.0016 0.0016 0.9826 7105 0.0012 0.0013 0.0013 0.9827 7130 0.0010 0.0010 0.0010 0.9629 7155 0.0007 0.0007 0.0008 0.9192 7180 0.0004 0.0004 0.0005 0.8849 7205 0.0002 0.0002 0.0002 0.8974 7230 0.0000 0.0000 0.0000 0.9182 pydl-0.7.0/pydl/pydlutils/data/filters/sdss_jun2001_z_atm.dat0000644000076500000240000002165112544033417024513 0ustar weaverstaff00000000000000# The following file is from Jim Gunn, from June 2001. It should be # self-explanatory; for most purposes, you will want to use the second # column. Consider this file preliminary. 
# # These filter curves have been used to calculate the effective # wavelengths and the qtdl/l (see Chapter 8 of the Black Book) of the # filters; the values are: # # u 3551 0.0171 # g 4686 0.0893 # r 6166 0.0886 # i 7480 0.0591 # z 8932 0.0099 # # Table Caption For Response Functions # # The first column is the wavelength in \AAngstroms. The second column # (respt) is the quantum efficiency on the sky looking through 1.3 # airmasses at APO for a point source. The third column (resbig) is the # QE under these conditions for very large sources (size greater than # about 80 pixels) for which the infrared scattering is negligible. The # only filters for which the infrared scattering has any effect are r and # i; the scattering in the bluer chips is negligible, and the z chips are # not thinned and the phenomenon does not exist. The fourth column # (resnoa) is the response of the third column with {\it no} atmosphere, # and the fifth column is the assumed atmospheric transparency at {\it # one} airmass at APO. The tables were constructed using monochromator # illumination of the camera with a bandpass of about 100 \AA, sampled for # the u filter at 50 \AA intervals and for the others at 100 \AA # intervals. These measurements were compared with measured responses of # the component filters and detectors and three additional points were # interpolated using these data, two at the extreme toes and one # additional (in g, r, and i) at the point of the beginning of the sharp # cutoff of the shortpass interference filter. These points are necessary # in order to make spline interpolation of the response data well-behaved. # These spline-interpolated response data were then multiplied by measured # aluminum reflectivities and scaled atmospheric transmission to produce # the tables below. The overall normalization is somewhat uncertain, # but this uncertainty does not affect the shapes. Note, however, that # there has been no attempt to remove the finite resolution of the # monochromator measurements. These tables are the {\it averages} of the # responses for all six of the camera chips with a given filter. The # responses are in general very similar except in the z band, where the # nonuniformity of the infrared rolloff, presumably associated with # varying thickness of the epitaxial layer or perhaps the gate structures # in these thick devices, introduces variations in the effective wavelengths # of the filters of order 100 \AA. We are currently working on better # response functions and will present them when they become available, but # these will suffice for most applications. In all cases the first point # is a measured point, so the grid of wavelengths at which measurements # exist is a subset of the wavelength lists here. 
# # SDSS Camera z Response Function 141 Points # # lam respt resbig resnoa xatm 7730 0.0000 0.0000 0.0000 0.9602 7755 0.0000 0.0000 0.0000 0.9615 7780 0.0001 0.0001 0.0001 0.9605 7805 0.0001 0.0001 0.0001 0.9583 7830 0.0001 0.0001 0.0001 0.9559 7855 0.0002 0.0002 0.0002 0.9541 7880 0.0002 0.0002 0.0002 0.9541 7905 0.0003 0.0003 0.0003 0.9567 7930 0.0005 0.0005 0.0005 0.9622 7955 0.0007 0.0007 0.0007 0.9692 7980 0.0011 0.0011 0.0011 0.9762 8005 0.0017 0.0017 0.0017 0.9814 8030 0.0027 0.0027 0.0027 0.9833 8055 0.0040 0.0040 0.0040 0.9801 8080 0.0057 0.0057 0.0058 0.9702 8105 0.0079 0.0079 0.0082 0.9524 8130 0.0106 0.0106 0.0114 0.9285 8155 0.0139 0.0139 0.0155 0.9075 8180 0.0178 0.0178 0.0202 0.8931 8205 0.0222 0.0222 0.0255 0.8853 8230 0.0271 0.0271 0.0311 0.8843 8255 0.0324 0.0324 0.0369 0.8902 8280 0.0382 0.0382 0.0428 0.9033 8305 0.0446 0.0446 0.0484 0.9242 8330 0.0511 0.0511 0.0536 0.9483 8355 0.0564 0.0564 0.0583 0.9591 8380 0.0603 0.0603 0.0625 0.9576 8405 0.0637 0.0637 0.0661 0.9567 8430 0.0667 0.0667 0.0693 0.9564 8455 0.0694 0.0694 0.0720 0.9565 8480 0.0717 0.0717 0.0744 0.9569 8505 0.0736 0.0736 0.0763 0.9576 8530 0.0752 0.0752 0.0779 0.9584 8555 0.0765 0.0765 0.0792 0.9592 8580 0.0775 0.0775 0.0801 0.9598 8605 0.0782 0.0782 0.0808 0.9602 8630 0.0786 0.0786 0.0812 0.9603 8655 0.0787 0.0787 0.0813 0.9599 8680 0.0785 0.0785 0.0812 0.9593 8705 0.0780 0.0780 0.0807 0.9586 8730 0.0772 0.0772 0.0801 0.9578 8755 0.0763 0.0763 0.0791 0.9571 8780 0.0751 0.0751 0.0779 0.9567 8805 0.0738 0.0738 0.0766 0.9566 8830 0.0723 0.0723 0.0750 0.9571 8855 0.0708 0.0708 0.0734 0.9582 8880 0.0693 0.0693 0.0716 0.9600 8905 0.0674 0.0674 0.0698 0.9591 8930 0.0632 0.0632 0.0679 0.9314 8955 0.0581 0.0581 0.0661 0.8923 8980 0.0543 0.0543 0.0642 0.8648 9005 0.0526 0.0526 0.0624 0.8633 9030 0.0523 0.0523 0.0607 0.8787 9055 0.0522 0.0522 0.0590 0.8961 9080 0.0512 0.0512 0.0574 0.9020 9105 0.0496 0.0496 0.0559 0.8980 9130 0.0481 0.0481 0.0546 0.8931 9155 0.0473 0.0473 0.0535 0.8962 9180 0.0476 0.0476 0.0524 0.9138 9205 0.0482 0.0482 0.0515 0.9352 9230 0.0476 0.0476 0.0505 0.9407 9255 0.0447 0.0447 0.0496 0.9103 9280 0.0391 0.0391 0.0485 0.8345 9305 0.0329 0.0329 0.0474 0.7441 9330 0.0283 0.0283 0.0462 0.6752 9355 0.0264 0.0264 0.0450 0.6524 9380 0.0271 0.0271 0.0438 0.6794 9405 0.0283 0.0283 0.0426 0.7178 9430 0.0275 0.0275 0.0415 0.7184 9455 0.0254 0.0254 0.0404 0.6897 9480 0.0252 0.0252 0.0393 0.7003 9505 0.0256 0.0256 0.0383 0.7214 9530 0.0246 0.0246 0.0373 0.7147 9555 0.0244 0.0244 0.0363 0.7251 9580 0.0252 0.0252 0.0353 0.7594 9605 0.0258 0.0258 0.0342 0.7923 9630 0.0265 0.0265 0.0331 0.8302 9655 0.0274 0.0274 0.0319 0.8766 9680 0.0279 0.0279 0.0307 0.9150 9705 0.0271 0.0271 0.0294 0.9253 9730 0.0252 0.0252 0.0280 0.9059 9755 0.0236 0.0236 0.0267 0.8947 9780 0.0227 0.0227 0.0253 0.9045 9805 0.0222 0.0222 0.0240 0.9262 9830 0.0216 0.0216 0.0227 0.9500 9855 0.0208 0.0208 0.0213 0.9652 9880 0.0196 0.0196 0.0201 0.9656 9905 0.0183 0.0183 0.0188 0.9642 9930 0.0171 0.0171 0.0176 0.9630 9955 0.0160 0.0160 0.0165 0.9618 9980 0.0149 0.0149 0.0153 0.9607 10005 0.0138 0.0138 0.0143 0.9597 10030 0.0128 0.0128 0.0132 0.9588 10055 0.0118 0.0118 0.0122 0.9579 10080 0.0108 0.0108 0.0112 0.9572 10105 0.0099 0.0099 0.0103 0.9565 10130 0.0091 0.0091 0.0094 0.9559 10155 0.0083 0.0083 0.0086 0.9553 10180 0.0075 0.0075 0.0078 0.9549 10205 0.0068 0.0068 0.0071 0.9545 10230 0.0061 0.0061 0.0064 0.9541 10255 0.0055 0.0055 0.0058 0.9539 10280 0.0050 0.0050 0.0052 0.9537 10305 0.0045 0.0045 0.0047 0.9535 10330 0.0041 0.0041 0.0042 
0.9534 10355 0.0037 0.0037 0.0038 0.9534 10380 0.0033 0.0033 0.0035 0.9534 10405 0.0030 0.0030 0.0031 0.9535 10430 0.0027 0.0027 0.0028 0.9536 10455 0.0025 0.0025 0.0026 0.9537 10480 0.0023 0.0023 0.0024 0.9539 10505 0.0021 0.0021 0.0022 0.9541 10530 0.0019 0.0019 0.0020 0.9544 10555 0.0018 0.0018 0.0019 0.9547 10580 0.0017 0.0017 0.0018 0.9551 10605 0.0016 0.0016 0.0016 0.9554 10630 0.0015 0.0015 0.0015 0.9558 10655 0.0014 0.0014 0.0014 0.9563 10680 0.0013 0.0013 0.0013 0.9567 10705 0.0012 0.0012 0.0012 0.9572 10730 0.0011 0.0011 0.0011 0.9577 10755 0.0010 0.0010 0.0010 0.9582 10780 0.0009 0.0009 0.0009 0.9587 10805 0.0008 0.0008 0.0008 0.9593 10830 0.0008 0.0008 0.0008 0.9598 10855 0.0007 0.0007 0.0007 0.9604 10880 0.0006 0.0006 0.0007 0.9609 10905 0.0006 0.0006 0.0006 0.9615 10930 0.0006 0.0006 0.0006 0.9621 10955 0.0005 0.0005 0.0005 0.9626 10980 0.0005 0.0005 0.0005 0.9632 11005 0.0004 0.0004 0.0004 0.9638 11030 0.0004 0.0004 0.0004 0.9643 11055 0.0003 0.0003 0.0003 0.9648 11080 0.0003 0.0003 0.0003 0.9654 11105 0.0002 0.0002 0.0002 0.9659 11130 0.0002 0.0002 0.0002 0.9664 11155 0.0001 0.0001 0.0001 0.9669 11180 0.0001 0.0001 0.0001 0.9673 11205 0.0000 0.0000 0.0000 0.9677 11230 0.0000 0.0000 0.0000 0.9682 pydl-0.7.0/pydl/pydlutils/data/filters/sdss_jun2001_u_atm.dat0000644000076500000240000001173612544033417024511 0ustar weaverstaff00000000000000# The following file is from Jim Gunn, from June 2001. It should be # self-explanatory; for most purposes, you will want to use the second # column. Consider this file preliminary. # # These filter curves have been used to calculate the effective # wavelengths and the qtdl/l (see Chapter 8 of the Black Book) of the # filters; the values are: # # u 3551 0.0171 # g 4686 0.0893 # r 6166 0.0886 # i 7480 0.0591 # z 8932 0.0099 # # Table Caption For Response Functions # # The first column is the wavelength in \AAngstroms. The second column # (respt) is the quantum efficiency on the sky looking through 1.3 # airmasses at APO for a point source. The third column (resbig) is the # QE under these conditions for very large sources (size greater than # about 80 pixels) for which the infrared scattering is negligible. The # only filters for which the infrared scattering has any effect are r and # i; the scattering in the bluer chips is negligible, and the z chips are # not thinned and the phenomenon does not exist. The fourth column # (resnoa) is the response of the third column with {\it no} atmosphere, # and the fifth column is the assumed atmospheric transparency at {\it # one} airmass at APO. The tables were constructed using monochromator # illumination of the camera with a bandpass of about 100 \AA, sampled for # the u filter at 50 \AA intervals and for the others at 100 \AA # intervals. These measurements were compared with measured responses of # the component filters and detectors and three additional points were # interpolated using these data, two at the extreme toes and one # additional (in g, r, and i) at the point of the beginning of the sharp # cutoff of the shortpass interference filter. These points are necessary # in order to make spline interpolation of the response data well-behaved. # These spline-interpolated response data were then multiplied by measured # aluminum reflectivities and scaled atmospheric transmission to produce # the tables below. The overall normalization is somewhat uncertain, # but this uncertainty does not affect the shapes. 
Note, however, that # there has been no attempt to remove the finite resolution of the # monochromator measurements. These tables are the {\it averages} of the # responses for all six of the camera chips with a given filter. The # responses are in general very similar except in the z band, where the # nonuniformity of the infrared rolloff, presumably associated with # varying thickness of the epitaxial layer or perhaps the gate structures # in these thick devices, introduces variations in the effective wavelengths # of the filters of order 100 \AA. We are currently working on better # response functions and will present them when they become available, but # these will suffice for most applications. In all cases the first point # is a measured point, so the grid of wavelengths at which measurements # exist is a subset of the wavelength lists here. # # SDSS Camera u Response Function 47 Points # # lam respt resbig resnoa xatm 2980 0.0000 0.0000 0.0000 0.0727 3005 0.0001 0.0001 0.0014 0.0992 3030 0.0005 0.0005 0.0071 0.1308 3055 0.0013 0.0013 0.0127 0.1673 3080 0.0026 0.0026 0.0198 0.2075 3105 0.0052 0.0052 0.0314 0.2470 3130 0.0093 0.0093 0.0464 0.2862 3155 0.0161 0.0161 0.0629 0.3444 3180 0.0240 0.0240 0.0794 0.3920 3205 0.0323 0.0323 0.0949 0.4300 3230 0.0405 0.0405 0.1093 0.4585 3255 0.0485 0.0485 0.1229 0.4817 3280 0.0561 0.0561 0.1352 0.5007 3305 0.0634 0.0634 0.1458 0.5189 3330 0.0700 0.0700 0.1545 0.5351 3355 0.0756 0.0756 0.1617 0.5486 3380 0.0803 0.0803 0.1679 0.5581 3405 0.0848 0.0848 0.1737 0.5669 3430 0.0883 0.0883 0.1786 0.5727 3455 0.0917 0.0917 0.1819 0.5812 3480 0.0959 0.0959 0.1842 0.5959 3505 0.1001 0.1001 0.1860 0.6112 3530 0.1029 0.1029 0.1870 0.6221 3555 0.1044 0.1044 0.1868 0.6294 3580 0.1053 0.1053 0.1862 0.6350 3605 0.1063 0.1063 0.1858 0.6406 3630 0.1075 0.1075 0.1853 0.6476 3655 0.1085 0.1085 0.1841 0.6553 3680 0.1084 0.1084 0.1812 0.6631 3705 0.1064 0.1064 0.1754 0.6702 3730 0.1024 0.1024 0.1669 0.6763 3755 0.0966 0.0966 0.1558 0.6815 3780 0.0887 0.0887 0.1419 0.6863 3805 0.0787 0.0787 0.1247 0.6912 3830 0.0672 0.0672 0.1054 0.6965 3855 0.0549 0.0549 0.0851 0.7023 3880 0.0413 0.0413 0.0634 0.7088 3905 0.0268 0.0268 0.0405 0.7158 3930 0.0145 0.0145 0.0216 0.7235 3955 0.0075 0.0075 0.0110 0.7315 3980 0.0042 0.0042 0.0062 0.7393 4005 0.0022 0.0022 0.0032 0.7464 4030 0.0010 0.0010 0.0015 0.7526 4055 0.0006 0.0006 0.0008 0.7581 4080 0.0004 0.0004 0.0006 0.7631 4105 0.0002 0.0002 0.0003 0.7680 4130 0.0000 0.0000 0.0000 0.7727 pydl-0.7.0/pydl/pydlutils/data/filters/sdss_jun2001_i_atm.dat0000644000076500000240000001535512544033417024476 0ustar weaverstaff00000000000000# The following file is from Jim Gunn, from June 2001. It should be # self-explanatory; for most purposes, you will want to use the second # column. Consider this file preliminary. # # These filter curves have been used to calculate the effective # wavelengths and the qtdl/l (see Chapter 8 of the Black Book) of the # filters; the values are: # # u 3551 0.0171 # g 4686 0.0893 # r 6166 0.0886 # i 7480 0.0591 # z 8932 0.0099 # # Table Caption For Response Functions # # The first column is the wavelength in \AAngstroms. The second column # (respt) is the quantum efficiency on the sky looking through 1.3 # airmasses at APO for a point source. The third column (resbig) is the # QE under these conditions for very large sources (size greater than # about 80 pixels) for which the infrared scattering is negligible. 
The # only filters for which the infrared scattering has any effect are r and # i; the scattering in the bluer chips is negligible, and the z chips are # not thinned and the phenomenon does not exist. The fourth column # (resnoa) is the response of the third column with {\it no} atmosphere, # and the fifth column is the assumed atmospheric transparency at {\it # one} airmass at APO. The tables were constructed using monochromator # illumination of the camera with a bandpass of about 100 \AA, sampled for # the u filter at 50 \AA intervals and for the others at 100 \AA # intervals. These measurements were compared with measured responses of # the component filters and detectors and three additional points were # interpolated using these data, two at the extreme toes and one # additional (in g, r, and i) at the point of the beginning of the sharp # cutoff of the shortpass interference filter. These points are necessary # in order to make spline interpolation of the response data well-behaved. # These spline-interpolated response data were then multiplied by measured # aluminum reflectivities and scaled atmospheric transmission to produce # the tables below. The overall normalization is somewhat uncertain, # but this uncertainty does not affect the shapes. Note, however, that # there has been no attempt to remove the finite resolution of the # monochromator measurements. These tables are the {\it averages} of the # responses for all six of the camera chips with a given filter. The # responses are in general very similar except in the z band, where the # nonuniformity of the infrared rolloff, presumably associated with # varying thickness of the epitaxial layer or perhaps the gate structures # in these thick devices, introduces variations in the effective wavelengths # of the filters of order 100 \AA. We are currently working on better # response functions and will present them when they become available, but # these will suffice for most applications. In all cases the first point # is a measured point, so the grid of wavelengths at which measurements # exist is a subset of the wavelength lists here. 
# # SDSS Camera i Response Function 89 Points # # lam respt resbig resnoa xatm 6430 0.0000 0.0000 0.0000 0.9346 6455 0.0001 0.0001 0.0001 0.9354 6480 0.0003 0.0003 0.0003 0.9363 6505 0.0004 0.0004 0.0004 0.9373 6530 0.0004 0.0004 0.0005 0.9385 6555 0.0003 0.0004 0.0004 0.9395 6580 0.0003 0.0003 0.0003 0.9400 6605 0.0004 0.0004 0.0005 0.9398 6630 0.0009 0.0009 0.0010 0.9386 6655 0.0019 0.0019 0.0021 0.9366 6680 0.0034 0.0034 0.0036 0.9349 6705 0.0056 0.0056 0.0060 0.9345 6730 0.0103 0.0104 0.0111 0.9366 6755 0.0194 0.0197 0.0208 0.9421 6780 0.0344 0.0349 0.0366 0.9492 6805 0.0561 0.0569 0.0597 0.9494 6830 0.0839 0.0851 0.0913 0.9334 6855 0.1164 0.1181 0.1317 0.9057 6880 0.1528 0.1552 0.1779 0.8862 6905 0.1948 0.1980 0.2260 0.8893 6930 0.2408 0.2448 0.2719 0.9083 6955 0.2857 0.2906 0.3125 0.9311 6980 0.3233 0.3290 0.3470 0.9450 7005 0.3503 0.3566 0.3755 0.9464 7030 0.3759 0.3829 0.3978 0.9561 7055 0.3990 0.4067 0.4142 0.9709 7080 0.4162 0.4245 0.4256 0.9826 7105 0.4233 0.4320 0.4331 0.9827 7130 0.4165 0.4252 0.4377 0.9629 7155 0.3943 0.4028 0.4405 0.9192 7180 0.3760 0.3844 0.4416 0.8849 7205 0.3823 0.3911 0.4411 0.8974 7230 0.3918 0.4011 0.4392 0.9182 7255 0.3892 0.3988 0.4358 0.9195 7280 0.3828 0.3924 0.4315 0.9153 7305 0.3820 0.3919 0.4265 0.9225 7330 0.3884 0.3988 0.4214 0.9436 7355 0.3872 0.3979 0.4165 0.9505 7380 0.3821 0.3930 0.4119 0.9496 7405 0.3787 0.3898 0.4077 0.9512 7430 0.3759 0.3872 0.4039 0.9531 7455 0.3727 0.3842 0.4006 0.9535 7480 0.3681 0.3799 0.3975 0.9508 7505 0.3618 0.3737 0.3943 0.9449 7530 0.3565 0.3685 0.3906 0.9416 7555 0.3554 0.3678 0.3862 0.9483 7580 0.3478 0.3603 0.3812 0.9429 7605 0.1473 0.1527 0.3757 0.4926 7630 0.2096 0.2176 0.3700 0.6546 7655 0.2648 0.2752 0.3641 0.7939 7680 0.3300 0.3434 0.3583 0.9530 7705 0.3256 0.3392 0.3526 0.9558 7730 0.3223 0.3361 0.3473 0.9602 7755 0.3179 0.3319 0.3424 0.9615 7780 0.3129 0.3272 0.3379 0.9605 7805 0.3077 0.3221 0.3337 0.9583 7830 0.3026 0.3173 0.3297 0.9559 7855 0.2980 0.3129 0.3259 0.9541 7880 0.2944 0.3095 0.3224 0.9541 7905 0.2921 0.3077 0.3194 0.9567 7930 0.2916 0.3075 0.3169 0.9622 7955 0.2921 0.3086 0.3150 0.9692 7980 0.2927 0.3098 0.3132 0.9762 8005 0.2923 0.3098 0.3111 0.9814 8030 0.2896 0.3076 0.3081 0.9833 8055 0.2840 0.3021 0.3039 0.9801 8080 0.2758 0.2939 0.2996 0.9702 8105 0.2642 0.2821 0.2945 0.9524 8130 0.2427 0.2597 0.2803 0.9285 8155 0.2091 0.2242 0.2493 0.9075 8180 0.1689 0.1815 0.2060 0.8931 8205 0.1276 0.1374 0.1578 0.8853 8230 0.0901 0.0973 0.1118 0.8843 8255 0.0603 0.0652 0.0743 0.8902 8280 0.0378 0.0410 0.0458 0.9033 8305 0.0218 0.0237 0.0257 0.9242 8330 0.0117 0.0128 0.0134 0.9483 8355 0.0068 0.0074 0.0077 0.9591 8380 0.0048 0.0053 0.0055 0.9576 8405 0.0033 0.0036 0.0037 0.9567 8430 0.0020 0.0022 0.0023 0.9564 8455 0.0013 0.0014 0.0015 0.9565 8480 0.0010 0.0011 0.0011 0.9569 8505 0.0009 0.0010 0.0011 0.9576 8530 0.0009 0.0010 0.0011 0.9584 8555 0.0008 0.0009 0.0009 0.9592 8580 0.0005 0.0006 0.0006 0.9598 8605 0.0002 0.0003 0.0003 0.9602 8630 0.0000 0.0000 0.0000 0.9603 pydl-0.7.0/pydl/pydlutils/data/cooling/0000755000076500000240000000000013434104632020446 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/pydlutils/data/cooling/m-30.cie0000644000076500000240000001544712672535402021625 0ustar weaverstaff00000000000000Cooling Curve : CIE:[Fe/H] = -3.0 log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0022 1.000 1.064 -26.04 -23.41 -11.66 11.73 1.479 2.114 8.365 1.987 4.05 0.0141 1.000 1.064 -24.69 -22.87 -11.60 11.23 1.682 2.114 8.919 1.961 4.10 0.0701 
1.000 1.064 -23.53 -22.40 -11.53 10.82 1.985 2.114 9.690 1.864 4.15 0.2528 1.000 1.064 -22.63 -22.06 -11.41 10.53 2.586 2.114 11.06 1.606 4.20 0.5669 1.000 1.064 -22.12 -21.90 -11.27 10.42 3.594 2.115 13.04 1.297 4.25 0.8163 1.000 1.064 -21.98 -21.92 -11.16 10.48 4.649 2.115 14.83 1.125 4.30 0.9310 1.000 1.064 -22.03 -22.02 -11.08 10.64 5.535 2.115 16.18 1.060 4.35 0.9755 1.000 1.064 -22.13 -22.15 -11.02 10.81 6.349 2.115 17.32 1.037 4.40 0.9980 1.000 1.064 -22.24 -22.26 -10.97 10.98 7.202 2.115 18.45 1.026 4.45 1.020 1.000 1.064 -22.33 -22.36 -10.91 11.13 8.169 2.115 19.65 1.015 4.50 1.042 1.000 1.064 -22.40 -22.45 -10.86 11.26 9.261 2.115 20.92 1.004 4.55 1.055 1.000 1.064 -22.49 -22.54 -10.81 11.41 10.45 2.115 22.23 0.998 4.60 1.060 1.000 1.064 -22.57 -22.62 -10.76 11.54 11.76 2.115 23.58 0.996 4.65 1.062 1.000 1.064 -22.63 -22.68 -10.71 11.65 13.21 2.115 24.99 0.995 4.70 1.064 1.000 1.064 -22.63 -22.69 -10.66 11.70 14.83 2.115 26.48 0.994 4.75 1.065 1.000 1.064 -22.55 -22.61 -10.61 11.67 16.65 2.115 28.06 0.993 4.80 1.070 1.000 1.064 -22.39 -22.45 -10.55 11.56 18.73 2.115 29.75 0.991 4.85 1.082 1.000 1.064 -22.23 -22.29 -10.50 11.46 21.12 2.115 31.60 0.986 4.90 1.099 1.000 1.064 -22.15 -22.21 -10.45 11.43 23.89 2.115 33.60 0.978 4.95 1.113 1.000 1.064 -22.16 -22.23 -10.40 11.50 26.98 2.115 35.71 0.972 5.00 1.121 1.000 1.064 -22.24 -22.31 -10.34 11.63 30.38 2.115 37.90 0.968 5.05 1.125 1.000 1.064 -22.33 -22.41 -10.29 11.77 34.15 2.115 40.18 0.966 5.10 1.126 1.000 1.064 -22.43 -22.51 -10.24 11.92 38.34 2.115 42.58 0.966 5.15 1.127 1.000 1.064 -22.52 -22.60 -10.19 12.07 43.04 2.115 45.11 0.965 5.20 1.128 1.000 1.064 -22.60 -22.67 -10.14 12.20 48.30 2.115 47.78 0.965 5.25 1.128 1.000 1.064 -22.66 -22.74 -10.09 12.31 54.20 2.115 50.62 0.965 5.30 1.128 1.000 1.064 -22.72 -22.80 -10.04 12.42 60.81 2.115 53.62 0.965 5.35 1.128 1.000 1.064 -22.77 -22.85 -9.993 12.52 68.23 2.115 56.80 0.965 5.40 1.128 1.000 1.064 -22.82 -22.90 -9.943 12.62 76.56 2.115 60.16 0.965 5.45 1.128 1.000 1.064 -22.90 -22.98 -9.893 12.75 85.90 2.115 63.73 0.965 5.50 1.128 1.000 1.064 -22.98 -23.06 -9.843 12.88 96.38 2.115 67.50 0.965 5.55 1.128 1.000 1.064 -23.04 -23.12 -9.793 12.99 108.1 2.115 71.50 0.965 5.60 1.128 1.000 1.064 -23.08 -23.16 -9.743 13.08 121.3 2.115 75.74 0.965 5.65 1.128 1.000 1.064 -23.12 -23.20 -9.693 13.17 136.1 2.115 80.23 0.965 5.70 1.128 1.000 1.064 -23.14 -23.22 -9.643 13.24 152.8 2.115 84.98 0.965 5.75 1.128 1.000 1.064 -23.16 -23.24 -9.593 13.31 171.4 2.115 90.02 0.965 5.80 1.128 1.000 1.064 -23.18 -23.26 -9.543 13.38 192.3 2.115 95.35 0.965 5.85 1.128 1.000 1.064 -23.20 -23.28 -9.493 13.45 215.8 2.115 101.0 0.965 5.90 1.128 1.000 1.064 -23.22 -23.30 -9.443 13.52 242.1 2.115 107.0 0.965 5.95 1.128 1.000 1.064 -23.23 -23.31 -9.393 13.58 271.7 2.115 113.3 0.965 6.00 1.128 1.000 1.064 -23.23 -23.31 -9.343 13.63 304.8 2.115 120.0 0.965 6.05 1.128 1.000 1.064 -23.24 -23.32 -9.293 13.69 342.0 2.115 127.2 0.965 6.10 1.128 1.000 1.064 -23.24 -23.32 -9.243 13.74 383.7 2.115 134.7 0.965 6.15 1.128 1.000 1.064 -23.23 -23.31 -9.193 13.78 430.5 2.115 142.7 0.965 6.20 1.128 1.000 1.064 -23.23 -23.31 -9.143 13.83 483.1 2.115 151.1 0.965 6.25 1.128 1.000 1.064 -23.23 -23.31 -9.093 13.88 542.0 2.115 160.1 0.965 6.30 1.128 1.000 1.064 -23.22 -23.30 -9.043 13.92 608.2 2.115 169.6 0.965 6.35 1.128 1.000 1.064 -23.23 -23.30 -8.993 13.98 682.4 2.115 179.6 0.965 6.40 1.128 1.000 1.064 -23.22 -23.30 -8.943 14.02 765.6 2.115 190.2 0.965 6.45 1.128 1.000 1.064 -23.21 -23.29 -8.893 14.06 859.0 2.115 201.5 0.965 6.50 
1.128 1.000 1.064 -23.20 -23.28 -8.843 14.10 963.9 2.115 213.5 0.965 6.55 1.128 1.000 1.064 -23.19 -23.26 -8.793 14.14 1081.5 2.115 226.1 0.965 6.60 1.128 1.000 1.064 -23.17 -23.25 -8.743 14.17 1213.4 2.115 239.5 0.965 6.65 1.128 1.000 1.064 -23.16 -23.24 -8.693 14.21 1361.5 2.115 253.7 0.965 6.70 1.128 1.000 1.064 -23.14 -23.22 -8.643 14.24 1527.6 2.115 268.7 0.965 6.75 1.128 1.000 1.064 -23.13 -23.21 -8.593 14.28 1714.0 2.115 284.7 0.965 6.80 1.128 1.000 1.064 -23.11 -23.19 -8.543 14.31 1923.2 2.115 301.5 0.965 6.85 1.128 1.000 1.064 -23.09 -23.17 -8.493 14.34 2157.8 2.115 319.4 0.965 6.90 1.128 1.000 1.064 -23.08 -23.16 -8.443 14.38 2421.1 2.115 338.3 0.965 6.95 1.128 1.000 1.064 -23.06 -23.14 -8.393 14.41 2716.5 2.115 358.4 0.965 7.00 1.128 1.000 1.064 -23.04 -23.12 -8.343 14.44 3048.0 2.115 379.6 0.965 7.05 1.128 1.000 1.064 -23.03 -23.10 -8.293 14.48 3419.9 2.115 402.1 0.965 7.10 1.128 1.000 1.064 -23.01 -23.09 -8.243 14.51 3837.2 2.115 425.9 0.965 7.15 1.128 1.000 1.064 -22.99 -23.07 -8.193 14.54 4305.4 2.115 451.2 0.965 7.20 1.128 1.000 1.064 -22.97 -23.05 -8.143 14.57 4830.7 2.115 477.9 0.965 7.25 1.128 1.000 1.064 -22.95 -23.03 -8.093 14.60 5420.2 2.115 506.2 0.965 7.30 1.128 1.000 1.064 -22.93 -23.01 -8.043 14.63 6081.5 2.115 536.2 0.965 7.35 1.128 1.000 1.064 -22.91 -22.99 -7.993 14.66 6823.6 2.115 568.0 0.965 7.40 1.128 1.000 1.064 -22.89 -22.96 -7.943 14.69 7656.2 2.115 601.6 0.965 7.45 1.128 1.000 1.064 -22.86 -22.94 -7.893 14.71 8590.4 2.115 637.3 0.965 7.50 1.128 1.000 1.064 -22.84 -22.92 -7.843 14.74 9638.6 2.115 675.0 0.965 7.55 1.128 1.000 1.064 -22.82 -22.90 -7.793 14.77 10814.7 2.115 715.0 0.965 7.60 1.128 1.000 1.064 -22.80 -22.88 -7.743 14.80 12134.3 2.115 757.4 0.965 7.65 1.128 1.000 1.064 -22.78 -22.86 -7.693 14.83 13614.9 2.115 802.3 0.965 7.70 1.128 1.000 1.064 -22.76 -22.84 -7.643 14.86 15276.1 2.115 849.8 0.965 7.75 1.128 1.000 1.064 -22.73 -22.81 -7.593 14.88 17140.1 2.115 900.2 0.965 7.80 1.128 1.000 1.064 -22.71 -22.79 -7.543 14.91 19231.5 2.115 953.5 0.965 7.85 1.128 1.000 1.064 -22.69 -22.77 -7.493 14.94 21578.1 2.115 1010.0 0.965 7.90 1.128 1.000 1.064 -22.67 -22.75 -7.443 14.97 24211.0 2.115 1069.8 0.965 7.95 1.128 1.000 1.064 -22.65 -22.73 -7.393 15.00 27165.2 2.115 1133.2 0.965 8.00 1.128 1.000 1.064 -22.62 -22.70 -7.343 15.02 30479.9 2.115 1200.4 0.965 8.05 1.128 1.000 1.064 -22.60 -22.68 -7.293 15.05 34199.0 2.115 1271.5 0.965 8.10 1.128 1.000 1.064 -22.58 -22.66 -7.243 15.08 38371.9 2.115 1346.9 0.965 8.15 1.128 1.000 1.064 -22.56 -22.63 -7.193 15.11 43054.0 2.115 1426.7 0.965 8.20 1.128 1.000 1.064 -22.53 -22.61 -7.143 15.13 48307.4 2.115 1511.2 0.965 8.25 1.128 1.000 1.064 -22.51 -22.59 -7.093 15.16 54201.8 2.115 1600.8 0.965 8.30 1.128 1.000 1.064 -22.49 -22.57 -7.043 15.19 60815.4 2.115 1695.6 0.965 8.35 1.128 1.000 1.064 -22.46 -22.54 -6.993 15.21 68236.0 2.115 1796.1 0.965 8.40 1.128 1.000 1.064 -22.44 -22.52 -6.943 15.24 76562.1 2.115 1902.5 0.965 8.45 1.128 1.000 1.064 -22.42 -22.50 -6.893 15.27 85904.0 2.115 2015.2 0.965 8.50 1.128 1.000 1.064 -22.39 -22.47 -6.843 15.29 96385.9 2.115 2134.6 0.965pydl-0.7.0/pydl/pydlutils/data/cooling/m-20.cie0000644000076500000240000001544712672535402021624 0ustar weaverstaff00000000000000Cooling Curve : CIE:[Fe/H] = -2.0 log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0022 1.000 1.064 -26.03 -23.40 -11.66 11.72 1.483 2.115 8.372 1.984 4.05 0.0141 1.000 1.064 -24.68 -22.86 -11.60 11.22 1.682 2.115 8.917 1.962 4.10 0.0700 1.000 1.064 -23.53 -22.40 -11.53 10.82 
1.985 2.115 9.687 1.865 4.15 0.2528 1.000 1.064 -22.63 -22.05 -11.41 10.53 2.586 2.115 11.06 1.607 4.20 0.5668 1.000 1.064 -22.12 -21.90 -11.27 10.42 3.594 2.116 13.03 1.297 4.25 0.8163 1.000 1.064 -21.98 -21.92 -11.16 10.48 4.649 2.116 14.82 1.125 4.30 0.9310 1.000 1.064 -22.02 -22.02 -11.08 10.63 5.535 2.116 16.17 1.061 4.35 0.9755 1.000 1.064 -22.12 -22.14 -11.02 10.80 6.349 2.116 17.32 1.038 4.40 0.9980 1.000 1.064 -22.23 -22.25 -10.97 10.97 7.202 2.116 18.45 1.026 4.45 1.020 1.000 1.064 -22.31 -22.34 -10.91 11.11 8.169 2.116 19.65 1.015 4.50 1.042 1.000 1.064 -22.37 -22.42 -10.86 11.23 9.262 2.116 20.92 1.005 4.55 1.055 1.000 1.064 -22.44 -22.49 -10.81 11.36 10.45 2.116 22.23 0.999 4.60 1.060 1.000 1.064 -22.49 -22.54 -10.76 11.46 11.76 2.116 23.57 0.996 4.65 1.062 1.000 1.064 -22.51 -22.56 -10.71 11.53 13.21 2.116 24.98 0.995 4.70 1.064 1.000 1.064 -22.49 -22.54 -10.66 11.56 14.83 2.116 26.47 0.995 4.75 1.065 1.000 1.064 -22.40 -22.46 -10.61 11.52 16.65 2.116 28.05 0.994 4.80 1.070 1.000 1.064 -22.27 -22.32 -10.55 11.44 18.73 2.116 29.75 0.991 4.85 1.082 1.000 1.064 -22.13 -22.19 -10.50 11.36 21.12 2.116 31.59 0.986 4.90 1.099 1.000 1.064 -22.05 -22.12 -10.45 11.33 23.89 2.116 33.60 0.978 4.95 1.113 1.000 1.064 -22.05 -22.12 -10.40 11.39 26.98 2.116 35.70 0.972 5.00 1.121 1.000 1.064 -22.10 -22.17 -10.34 11.49 30.38 2.116 37.89 0.969 5.05 1.125 1.000 1.064 -22.16 -22.24 -10.29 11.60 34.15 2.116 40.17 0.967 5.10 1.127 1.000 1.064 -22.21 -22.28 -10.24 11.71 38.35 2.116 42.57 0.966 5.15 1.127 1.000 1.064 -22.23 -22.31 -10.19 11.78 43.04 2.116 45.10 0.966 5.20 1.128 1.000 1.064 -22.25 -22.33 -10.14 11.85 48.30 2.116 47.77 0.966 5.25 1.128 1.000 1.064 -22.26 -22.33 -10.09 11.91 54.20 2.116 50.61 0.965 5.30 1.128 1.000 1.064 -22.26 -22.34 -10.04 11.96 60.81 2.116 53.61 0.965 5.35 1.128 1.000 1.064 -22.26 -22.34 -9.993 12.01 68.24 2.116 56.78 0.965 5.40 1.128 1.000 1.064 -22.30 -22.38 -9.943 12.10 76.57 2.116 60.15 0.965 5.45 1.128 1.000 1.064 -22.42 -22.50 -9.893 12.27 85.91 2.116 63.71 0.965 5.50 1.128 1.000 1.064 -22.59 -22.67 -9.843 12.49 96.39 2.116 67.49 0.965 5.55 1.128 1.000 1.064 -22.74 -22.81 -9.793 12.69 108.2 2.116 71.49 0.965 5.60 1.128 1.000 1.064 -22.82 -22.90 -9.743 12.82 121.4 2.116 75.73 0.965 5.65 1.128 1.000 1.064 -22.86 -22.94 -9.693 12.91 136.2 2.116 80.21 0.965 5.70 1.128 1.000 1.064 -22.87 -22.95 -9.643 12.97 152.8 2.116 84.97 0.965 5.75 1.128 1.000 1.064 -22.89 -22.97 -9.593 13.04 171.4 2.116 90.00 0.965 5.80 1.128 1.000 1.064 -22.93 -23.01 -9.543 13.13 192.3 2.116 95.33 0.965 5.85 1.128 1.000 1.064 -23.00 -23.08 -9.493 13.25 215.8 2.116 101.0 0.965 5.90 1.128 1.000 1.064 -23.06 -23.14 -9.443 13.36 242.1 2.116 107.0 0.965 5.95 1.128 1.000 1.064 -23.09 -23.17 -9.393 13.44 271.7 2.116 113.3 0.965 6.00 1.128 1.000 1.064 -23.10 -23.18 -9.343 13.50 304.8 2.116 120.0 0.965 6.05 1.128 1.000 1.064 -23.12 -23.20 -9.293 13.57 342.0 2.116 127.1 0.965 6.10 1.128 1.000 1.064 -23.12 -23.20 -9.243 13.62 383.8 2.116 134.7 0.965 6.15 1.128 1.000 1.064 -23.12 -23.20 -9.193 13.67 430.6 2.116 142.6 0.965 6.20 1.128 1.000 1.064 -23.12 -23.20 -9.143 13.72 483.1 2.116 151.1 0.965 6.25 1.128 1.000 1.064 -23.12 -23.20 -9.093 13.77 542.1 2.116 160.0 0.965 6.30 1.128 1.000 1.064 -23.13 -23.21 -9.043 13.83 608.2 2.116 169.5 0.965 6.35 1.128 1.000 1.064 -23.15 -23.23 -8.993 13.90 682.4 2.116 179.6 0.965 6.40 1.128 1.000 1.064 -23.16 -23.24 -8.943 13.96 765.7 2.116 190.2 0.965 6.45 1.128 1.000 1.064 -23.16 -23.24 -8.893 14.01 859.2 2.116 201.5 0.965 6.50 1.128 1.000 1.064 -23.16 -23.24 -8.843 
14.06 964.0 2.116 213.4 0.965 6.55 1.128 1.000 1.064 -23.15 -23.23 -8.793 14.10 1081.6 2.116 226.1 0.965 6.60 1.128 1.000 1.064 -23.14 -23.22 -8.743 14.14 1213.6 2.116 239.5 0.965 6.65 1.128 1.000 1.064 -23.13 -23.21 -8.693 14.18 1361.7 2.116 253.7 0.965 6.70 1.128 1.000 1.064 -23.12 -23.20 -8.643 14.22 1527.8 2.116 268.7 0.965 6.75 1.128 1.000 1.064 -23.11 -23.19 -8.593 14.26 1714.3 2.116 284.6 0.965 6.80 1.128 1.000 1.064 -23.09 -23.17 -8.543 14.29 1923.4 2.116 301.5 0.965 6.85 1.128 1.000 1.064 -23.08 -23.16 -8.493 14.33 2158.1 2.116 319.3 0.965 6.90 1.128 1.000 1.064 -23.06 -23.14 -8.443 14.36 2421.5 2.116 338.3 0.965 6.95 1.128 1.000 1.064 -23.05 -23.13 -8.393 14.40 2716.9 2.116 358.3 0.965 7.00 1.128 1.000 1.064 -23.03 -23.11 -8.343 14.43 3048.4 2.116 379.5 0.965 7.05 1.128 1.000 1.064 -23.01 -23.09 -8.293 14.46 3420.4 2.116 402.0 0.965 7.10 1.128 1.000 1.064 -23.00 -23.07 -8.243 14.50 3837.8 2.116 425.8 0.965 7.15 1.128 1.000 1.064 -22.98 -23.06 -8.193 14.53 4306.0 2.116 451.1 0.965 7.20 1.128 1.000 1.064 -22.96 -23.04 -8.143 14.56 4831.5 2.116 477.8 0.965 7.25 1.128 1.000 1.064 -22.94 -23.02 -8.093 14.59 5421.0 2.116 506.1 0.965 7.30 1.128 1.000 1.064 -22.92 -23.00 -8.043 14.62 6082.4 2.116 536.1 0.965 7.35 1.128 1.000 1.064 -22.90 -22.98 -7.993 14.65 6824.6 2.116 567.9 0.965 7.40 1.128 1.000 1.064 -22.88 -22.96 -7.943 14.68 7657.4 2.116 601.5 0.965 7.45 1.128 1.000 1.064 -22.86 -22.94 -7.893 14.71 8591.7 2.116 637.2 0.965 7.50 1.128 1.000 1.064 -22.84 -22.92 -7.843 14.74 9640.0 2.116 674.9 0.965 7.55 1.128 1.000 1.064 -22.82 -22.90 -7.793 14.77 10816.3 2.116 714.9 0.965 7.60 1.128 1.000 1.064 -22.80 -22.88 -7.743 14.80 12136.1 2.116 757.3 0.965 7.65 1.128 1.000 1.064 -22.78 -22.86 -7.693 14.83 13616.9 2.116 802.1 0.965 7.70 1.128 1.000 1.064 -22.75 -22.83 -7.643 14.85 15278.4 2.116 849.7 0.965 7.75 1.128 1.000 1.064 -22.73 -22.81 -7.593 14.88 17142.7 2.116 900.0 0.965 7.80 1.128 1.000 1.064 -22.71 -22.79 -7.543 14.91 19234.4 2.116 953.4 0.965 7.85 1.128 1.000 1.064 -22.69 -22.77 -7.493 14.94 21581.3 2.116 1009.8 0.965 7.90 1.128 1.000 1.064 -22.67 -22.75 -7.443 14.97 24214.7 2.116 1069.7 0.965 7.95 1.128 1.000 1.064 -22.64 -22.72 -7.393 14.99 27169.3 2.116 1133.1 0.965 8.00 1.128 1.000 1.064 -22.62 -22.70 -7.343 15.02 30484.5 2.116 1200.2 0.965 8.05 1.128 1.000 1.064 -22.60 -22.68 -7.293 15.05 34204.1 2.116 1271.3 0.965 8.10 1.128 1.000 1.064 -22.58 -22.66 -7.243 15.08 38377.7 2.116 1346.7 0.965 8.15 1.128 1.000 1.064 -22.55 -22.63 -7.193 15.10 43060.5 2.116 1426.4 0.965 8.20 1.128 1.000 1.064 -22.53 -22.61 -7.143 15.13 48314.6 2.116 1511.0 0.965 8.25 1.128 1.000 1.064 -22.51 -22.59 -7.093 15.16 54209.9 2.116 1600.5 0.965 8.30 1.128 1.000 1.064 -22.49 -22.56 -7.043 15.19 60824.5 2.116 1695.3 0.965 8.35 1.128 1.000 1.064 -22.46 -22.54 -6.993 15.21 68246.2 2.116 1795.8 0.965 8.40 1.128 1.000 1.064 -22.44 -22.52 -6.943 15.24 76573.5 2.116 1902.2 0.965 8.45 1.128 1.000 1.064 -22.42 -22.50 -6.893 15.27 85916.9 2.116 2014.9 0.965 8.50 1.128 1.000 1.064 -22.39 -22.47 -6.843 15.29 96400.4 2.116 2134.3 0.965pydl-0.7.0/pydl/pydlutils/data/cooling/mzero.cie0000644000076500000240000001544512672535402022303 0ustar weaverstaff00000000000000Cooling Curve : CIE:Zero Metals log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0022 1.000 1.064 -26.04 -23.41 -11.66 11.73 1.483 2.114 8.374 1.983 4.05 0.0141 1.000 1.064 -24.69 -22.87 -11.60 11.23 1.682 2.114 8.919 1.961 4.10 0.0701 1.000 1.064 -23.53 -22.40 -11.53 10.82 1.985 2.114 9.690 1.864 4.15 0.2528 
1.000 1.064 -22.63 -22.06 -11.41 10.53 2.586 2.114 11.06 1.606 4.20 0.5669 1.000 1.064 -22.12 -21.90 -11.27 10.42 3.594 2.115 13.04 1.297 4.25 0.8163 1.000 1.064 -21.98 -21.92 -11.16 10.48 4.649 2.115 14.83 1.125 4.30 0.9310 1.000 1.064 -22.03 -22.02 -11.08 10.64 5.535 2.115 16.18 1.060 4.35 0.9755 1.000 1.064 -22.13 -22.15 -11.02 10.81 6.349 2.115 17.33 1.037 4.40 0.9980 1.000 1.064 -22.24 -22.26 -10.97 10.98 7.202 2.115 18.45 1.026 4.45 1.020 1.000 1.064 -22.33 -22.36 -10.91 11.13 8.169 2.115 19.65 1.015 4.50 1.042 1.000 1.064 -22.41 -22.45 -10.86 11.27 9.261 2.115 20.93 1.004 4.55 1.055 1.000 1.064 -22.49 -22.54 -10.81 11.41 10.45 2.115 22.23 0.998 4.60 1.060 1.000 1.064 -22.58 -22.63 -10.76 11.55 11.76 2.115 23.58 0.996 4.65 1.062 1.000 1.064 -22.64 -22.70 -10.71 11.66 13.21 2.115 24.99 0.995 4.70 1.064 1.000 1.064 -22.65 -22.71 -10.66 11.72 14.83 2.115 26.48 0.994 4.75 1.065 1.000 1.064 -22.57 -22.63 -10.61 11.69 16.65 2.115 28.06 0.993 4.80 1.070 1.000 1.064 -22.41 -22.47 -10.55 11.58 18.73 2.115 29.75 0.991 4.85 1.082 1.000 1.064 -22.24 -22.30 -10.50 11.47 21.12 2.115 31.60 0.986 4.90 1.099 1.000 1.064 -22.16 -22.23 -10.45 11.44 23.89 2.115 33.61 0.978 4.95 1.113 1.000 1.064 -22.17 -22.25 -10.40 11.51 26.98 2.115 35.71 0.972 5.00 1.121 1.000 1.064 -22.25 -22.33 -10.34 11.64 30.38 2.115 37.90 0.968 5.05 1.125 1.000 1.064 -22.36 -22.44 -10.29 11.80 34.15 2.115 40.18 0.966 5.10 1.126 1.000 1.064 -22.46 -22.54 -10.24 11.95 38.34 2.115 42.58 0.966 5.15 1.127 1.000 1.064 -22.56 -22.64 -10.19 12.11 43.04 2.115 45.11 0.965 5.20 1.128 1.000 1.064 -22.66 -22.74 -10.14 12.26 48.30 2.115 47.78 0.965 5.25 1.128 1.000 1.064 -22.74 -22.82 -10.09 12.39 54.19 2.115 50.62 0.965 5.30 1.128 1.000 1.064 -22.82 -22.90 -10.04 12.52 60.81 2.115 53.62 0.965 5.35 1.128 1.000 1.064 -22.89 -22.97 -9.993 12.64 68.23 2.115 56.80 0.965 5.40 1.128 1.000 1.064 -22.95 -23.03 -9.943 12.75 76.56 2.115 60.16 0.965 5.45 1.128 1.000 1.064 -23.01 -23.08 -9.893 12.86 85.90 2.115 63.73 0.965 5.50 1.128 1.000 1.064 -23.05 -23.13 -9.843 12.95 96.38 2.115 67.50 0.965 5.55 1.128 1.000 1.064 -23.09 -23.17 -9.793 13.04 108.1 2.115 71.50 0.965 5.60 1.128 1.000 1.064 -23.13 -23.21 -9.743 13.13 121.3 2.115 75.74 0.965 5.65 1.128 1.000 1.064 -23.16 -23.24 -9.693 13.21 136.1 2.115 80.23 0.965 5.70 1.128 1.000 1.064 -23.18 -23.26 -9.643 13.28 152.8 2.115 84.98 0.965 5.75 1.128 1.000 1.064 -23.20 -23.28 -9.593 13.35 171.4 2.115 90.02 0.965 5.80 1.128 1.000 1.064 -23.22 -23.30 -9.543 13.42 192.3 2.115 95.35 0.965 5.85 1.128 1.000 1.064 -23.23 -23.31 -9.493 13.48 215.8 2.115 101.0 0.965 5.90 1.128 1.000 1.064 -23.24 -23.32 -9.443 13.54 242.1 2.115 107.0 0.965 5.95 1.128 1.000 1.064 -23.25 -23.33 -9.393 13.60 271.6 2.115 113.3 0.965 6.00 1.128 1.000 1.064 -23.25 -23.33 -9.343 13.65 304.8 2.115 120.0 0.965 6.05 1.128 1.000 1.064 -23.25 -23.33 -9.293 13.70 342.0 2.115 127.2 0.965 6.10 1.128 1.000 1.064 -23.25 -23.33 -9.243 13.75 383.7 2.115 134.7 0.965 6.15 1.128 1.000 1.064 -23.25 -23.33 -9.193 13.80 430.5 2.115 142.7 0.965 6.20 1.128 1.000 1.064 -23.25 -23.32 -9.143 13.85 483.1 2.115 151.1 0.965 6.25 1.128 1.000 1.064 -23.24 -23.32 -9.093 13.89 542.0 2.115 160.1 0.965 6.30 1.128 1.000 1.064 -23.23 -23.31 -9.043 13.93 608.1 2.115 169.6 0.965 6.35 1.128 1.000 1.064 -23.23 -23.31 -8.993 13.98 682.3 2.115 179.6 0.965 6.40 1.128 1.000 1.064 -23.23 -23.30 -8.943 14.03 765.6 2.115 190.3 0.965 6.45 1.128 1.000 1.064 -23.21 -23.29 -8.893 14.06 859.0 2.115 201.5 0.965 6.50 1.128 1.000 1.064 -23.20 -23.28 -8.843 14.10 963.8 2.115 213.5 0.965 6.55 
1.128 1.000 1.064 -23.19 -23.27 -8.793 14.14 1081.4 2.115 226.1 0.965 6.60 1.128 1.000 1.064 -23.18 -23.25 -8.743 14.18 1213.4 2.115 239.5 0.965 6.65 1.128 1.000 1.064 -23.16 -23.24 -8.693 14.21 1361.5 2.115 253.7 0.965 6.70 1.128 1.000 1.064 -23.15 -23.22 -8.643 14.25 1527.6 2.115 268.7 0.965 6.75 1.128 1.000 1.064 -23.13 -23.21 -8.593 14.28 1714.0 2.115 284.7 0.965 6.80 1.128 1.000 1.064 -23.11 -23.19 -8.543 14.31 1923.1 2.115 301.5 0.965 6.85 1.128 1.000 1.064 -23.10 -23.17 -8.493 14.35 2157.8 2.115 319.4 0.965 6.90 1.128 1.000 1.064 -23.08 -23.16 -8.443 14.38 2421.1 2.115 338.3 0.965 6.95 1.128 1.000 1.064 -23.06 -23.14 -8.393 14.41 2716.5 2.115 358.4 0.965 7.00 1.128 1.000 1.064 -23.05 -23.12 -8.343 14.45 3047.9 2.115 379.6 0.965 7.05 1.128 1.000 1.064 -23.03 -23.11 -8.293 14.48 3419.8 2.115 402.1 0.965 7.10 1.128 1.000 1.064 -23.01 -23.09 -8.243 14.51 3837.1 2.115 425.9 0.965 7.15 1.128 1.000 1.064 -22.99 -23.07 -8.193 14.54 4305.3 2.115 451.2 0.965 7.20 1.128 1.000 1.064 -22.97 -23.05 -8.143 14.57 4830.7 2.115 477.9 0.965 7.25 1.128 1.000 1.064 -22.95 -23.03 -8.093 14.60 5420.1 2.115 506.2 0.965 7.30 1.128 1.000 1.064 -22.93 -23.01 -8.043 14.63 6081.4 2.115 536.2 0.965 7.35 1.128 1.000 1.064 -22.91 -22.99 -7.993 14.66 6823.5 2.115 568.0 0.965 7.40 1.128 1.000 1.064 -22.89 -22.96 -7.943 14.69 7656.1 2.115 601.6 0.965 7.45 1.128 1.000 1.064 -22.86 -22.94 -7.893 14.71 8590.3 2.115 637.3 0.965 7.50 1.128 1.000 1.064 -22.84 -22.92 -7.843 14.74 9638.4 2.115 675.0 0.965 7.55 1.128 1.000 1.064 -22.82 -22.90 -7.793 14.77 10814.5 2.115 715.0 0.965 7.60 1.128 1.000 1.064 -22.80 -22.88 -7.743 14.80 12134.1 2.115 757.4 0.965 7.65 1.128 1.000 1.064 -22.78 -22.86 -7.693 14.83 13614.6 2.115 802.3 0.965 7.70 1.128 1.000 1.064 -22.76 -22.84 -7.643 14.86 15275.9 2.115 849.8 0.965 7.75 1.128 1.000 1.064 -22.74 -22.81 -7.593 14.89 17139.8 2.115 900.2 0.965 7.80 1.128 1.000 1.064 -22.71 -22.79 -7.543 14.91 19231.2 2.115 953.5 0.965 7.85 1.128 1.000 1.064 -22.69 -22.77 -7.493 14.94 21577.8 2.115 1010.0 0.965 7.90 1.128 1.000 1.064 -22.67 -22.75 -7.443 14.97 24210.6 2.115 1069.9 0.965 7.95 1.128 1.000 1.064 -22.65 -22.73 -7.393 15.00 27164.8 2.115 1133.3 0.965 8.00 1.128 1.000 1.064 -22.62 -22.70 -7.343 15.02 30479.4 2.115 1200.4 0.965 8.05 1.128 1.000 1.064 -22.60 -22.68 -7.293 15.05 34198.4 2.115 1271.5 0.965 8.10 1.128 1.000 1.064 -22.58 -22.66 -7.243 15.08 38371.3 2.115 1346.9 0.965 8.15 1.128 1.000 1.064 -22.56 -22.63 -7.193 15.11 43053.3 2.115 1426.7 0.965 8.20 1.128 1.000 1.064 -22.53 -22.61 -7.143 15.13 48306.6 2.115 1511.2 0.965 8.25 1.128 1.000 1.064 -22.51 -22.59 -7.093 15.16 54200.9 2.115 1600.8 0.965 8.30 1.128 1.000 1.064 -22.49 -22.57 -7.043 15.19 60814.4 2.115 1695.6 0.965 8.35 1.128 1.000 1.064 -22.46 -22.54 -6.993 15.21 68234.9 2.115 1796.1 0.965 8.40 1.128 1.000 1.064 -22.44 -22.52 -6.943 15.24 76560.8 2.115 1902.5 0.965 8.45 1.128 1.000 1.064 -22.42 -22.50 -6.893 15.27 85902.6 2.115 2015.3 0.965 8.50 1.128 1.000 1.064 -22.39 -22.47 -6.843 15.29 96384.3 2.115 2134.7 0.965pydl-0.7.0/pydl/pydlutils/data/cooling/cooling.sha1sum0000644000076500000240000000063112672535402023411 0ustar weaverstaff0000000000000034b041f8241bebada7bbc5aa3791d7b80f346ca1 m-00.cie 2d74d9093fbef1786ebc29d10f8c301002a2ca91 m-05.cie d202615848505dd58f7e1786c00c5f30224e8252 m+05.cie 9d370f6d2cec327fb6b5ce79c5f3943f41466582 m-10.cie 7835424157683c2b090c2a41552acb4c45c80798 m-15.cie 482a27f65a24e1431cb8a2f8a89e32e8ea5bd4e8 m-20.cie a735ba627510bb315bfe7158e0e55a19671eb6f6 m-30.cie f35c2dc6c376aa7486f5be2436b1791087431062 
mzero.cie pydl-0.7.0/pydl/pydlutils/data/cooling/m-05.cie0000644000076500000240000001544512672535402021625 0ustar weaverstaff00000000000000Cooling Curve : CIE:[Fe/H] = -0.5 log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0023 1.000 1.080 -25.83 -23.22 -11.65 11.54 1.505 2.239 8.198 2.069 4.05 0.0141 1.000 1.080 -24.49 -22.68 -11.59 11.04 1.707 2.239 8.732 2.046 4.10 0.0703 1.000 1.080 -23.42 -22.30 -11.52 10.72 2.014 2.239 9.483 1.947 4.15 0.2530 1.000 1.080 -22.56 -22.00 -11.41 10.46 2.619 2.240 10.81 1.680 4.20 0.5673 1.000 1.080 -22.07 -21.85 -11.27 10.37 3.631 2.240 12.73 1.360 4.25 0.8162 1.000 1.080 -21.92 -21.86 -11.16 10.43 4.689 2.240 14.47 1.181 4.30 0.9317 1.000 1.080 -21.94 -21.94 -11.08 10.56 5.582 2.240 15.78 1.113 4.35 0.9767 1.000 1.080 -22.00 -22.02 -11.02 10.69 6.403 2.240 16.91 1.089 4.40 1.001 1.000 1.080 -22.02 -22.06 -10.97 10.77 7.269 2.240 18.01 1.076 4.45 1.027 1.000 1.080 -22.00 -22.05 -10.91 10.81 8.259 2.240 19.20 1.063 4.50 1.054 1.000 1.080 -21.94 -22.00 -10.85 10.81 9.383 2.240 20.47 1.050 4.55 1.069 1.000 1.080 -21.87 -21.93 -10.80 10.80 10.61 2.240 21.76 1.042 4.60 1.076 1.000 1.080 -21.78 -21.85 -10.75 10.76 11.94 2.240 23.08 1.039 4.65 1.079 1.000 1.080 -21.69 -21.75 -10.70 10.72 13.41 2.240 24.46 1.038 4.70 1.080 1.000 1.080 -21.59 -21.66 -10.65 10.67 15.05 2.240 25.92 1.037 4.75 1.082 1.000 1.080 -21.50 -21.57 -10.60 10.63 16.91 2.240 27.47 1.036 4.80 1.088 1.000 1.080 -21.41 -21.48 -10.55 10.60 19.03 2.240 29.14 1.033 4.85 1.103 1.000 1.080 -21.33 -21.41 -10.49 10.57 21.49 2.240 30.97 1.026 4.90 1.124 1.000 1.080 -21.27 -21.35 -10.44 10.57 24.34 2.240 32.96 1.016 4.95 1.142 1.000 1.080 -21.23 -21.32 -10.39 10.59 27.53 2.240 35.06 1.008 5.00 1.152 1.000 1.080 -21.22 -21.32 -10.34 10.63 31.04 2.240 37.22 1.004 5.05 1.157 1.000 1.080 -21.23 -21.33 -10.28 10.69 34.90 2.240 39.47 1.002 5.10 1.159 1.000 1.080 -21.22 -21.32 -10.23 10.73 39.20 2.240 41.83 1.001 5.15 1.160 1.000 1.080 -21.19 -21.29 -10.18 10.75 44.00 2.240 44.32 1.000 5.20 1.161 1.000 1.080 -21.16 -21.26 -10.13 10.77 49.38 2.240 46.95 1.000 5.25 1.161 1.000 1.080 -21.14 -21.23 -10.08 10.80 55.42 2.240 49.74 1.000 5.30 1.161 1.000 1.080 -21.12 -21.21 -10.03 10.83 62.19 2.240 52.69 0.999 5.35 1.162 1.000 1.080 -21.10 -21.20 -9.983 10.86 69.79 2.240 55.81 0.999 5.40 1.162 1.000 1.080 -21.14 -21.23 -9.933 10.95 78.31 2.240 59.12 0.999 5.45 1.162 1.000 1.080 -21.27 -21.37 -9.883 11.13 87.88 2.240 62.63 0.999 5.50 1.163 1.000 1.080 -21.50 -21.60 -9.833 11.42 98.62 2.240 66.35 0.999 5.55 1.163 1.000 1.080 -21.70 -21.79 -9.783 11.67 110.7 2.240 70.28 0.999 5.60 1.163 1.000 1.080 -21.81 -21.91 -9.733 11.83 124.2 2.240 74.45 0.999 5.65 1.163 1.000 1.080 -21.85 -21.95 -9.683 11.92 139.3 2.240 78.86 0.999 5.70 1.163 1.000 1.080 -21.86 -21.96 -9.633 11.98 156.3 2.240 83.53 0.999 5.75 1.163 1.000 1.080 -21.87 -21.97 -9.583 12.04 175.4 2.240 88.48 0.999 5.80 1.163 1.000 1.080 -21.93 -22.03 -9.533 12.15 196.8 2.240 93.73 0.999 5.85 1.163 1.000 1.080 -22.05 -22.15 -9.483 12.32 220.9 2.240 99.29 0.999 5.90 1.163 1.000 1.080 -22.15 -22.24 -9.433 12.47 247.8 2.240 105.2 0.999 5.95 1.164 1.000 1.080 -22.19 -22.29 -9.383 12.56 278.1 2.240 111.4 0.999 6.00 1.164 1.000 1.080 -22.22 -22.32 -9.333 12.64 312.0 2.240 118.0 0.999 6.05 1.164 1.000 1.080 -22.24 -22.34 -9.283 12.71 350.1 2.240 125.0 0.998 6.10 1.164 1.000 1.080 -22.24 -22.34 -9.233 12.76 392.8 2.240 132.4 0.998 6.15 1.164 1.000 1.080 -22.24 -22.34 -9.183 12.81 440.8 2.240 140.3 0.998 6.20 1.164 1.000 1.080 -22.23 
-22.33 -9.133 12.85 494.6 2.240 148.6 0.998 6.25 1.164 1.000 1.080 -22.26 -22.36 -9.083 12.93 554.9 2.240 157.4 0.998 6.30 1.164 1.000 1.080 -22.34 -22.44 -9.033 13.06 622.7 2.240 166.7 0.998 6.35 1.164 1.000 1.080 -22.44 -22.54 -8.983 13.21 698.7 2.240 176.6 0.998 6.40 1.164 1.000 1.080 -22.53 -22.63 -8.933 13.35 784.0 2.240 187.1 0.998 6.45 1.165 1.000 1.080 -22.59 -22.69 -8.883 13.46 879.7 2.240 198.2 0.998 6.50 1.165 1.000 1.080 -22.64 -22.74 -8.833 13.56 987.1 2.240 209.9 0.998 6.55 1.165 1.000 1.080 -22.66 -22.76 -8.783 13.63 1107.6 2.240 222.3 0.998 6.60 1.165 1.000 1.080 -22.69 -22.79 -8.733 13.71 1242.8 2.240 235.5 0.998 6.65 1.165 1.000 1.080 -22.71 -22.81 -8.683 13.78 1394.5 2.240 249.5 0.998 6.70 1.165 1.000 1.080 -22.73 -22.83 -8.633 13.85 1564.7 2.240 264.3 0.998 6.75 1.165 1.000 1.080 -22.75 -22.85 -8.583 13.92 1755.6 2.240 279.9 0.998 6.80 1.165 1.000 1.080 -22.76 -22.86 -8.533 13.98 1969.9 2.240 296.5 0.998 6.85 1.165 1.000 1.080 -22.75 -22.85 -8.483 14.02 2210.3 2.240 314.1 0.998 6.90 1.165 1.000 1.080 -22.74 -22.84 -8.433 14.06 2480.0 2.240 332.7 0.998 6.95 1.165 1.000 1.080 -22.74 -22.84 -8.383 14.11 2782.6 2.240 352.4 0.998 7.00 1.165 1.000 1.080 -22.73 -22.83 -8.333 14.15 3122.2 2.240 373.3 0.998 7.05 1.165 1.000 1.080 -22.73 -22.83 -8.283 14.20 3503.2 2.240 395.4 0.998 7.10 1.165 1.000 1.080 -22.75 -22.85 -8.233 14.27 3930.7 2.240 418.9 0.998 7.15 1.165 1.000 1.080 -22.76 -22.86 -8.183 14.33 4410.4 2.240 443.7 0.998 7.20 1.165 1.000 1.080 -22.77 -22.87 -8.133 14.39 4948.5 2.240 470.0 0.998 7.25 1.165 1.000 1.080 -22.77 -22.87 -8.083 14.44 5552.4 2.240 497.8 0.998 7.30 1.165 1.000 1.080 -22.77 -22.87 -8.033 14.49 6229.9 2.240 527.3 0.998 7.35 1.165 1.000 1.080 -22.76 -22.86 -7.983 14.53 6990.0 2.240 558.6 0.998 7.40 1.165 1.000 1.080 -22.75 -22.85 -7.933 14.57 7843.0 2.240 591.7 0.998 7.45 1.165 1.000 1.080 -22.74 -22.84 -7.883 14.61 8800.0 2.240 626.7 0.998 7.50 1.165 1.000 1.080 -22.73 -22.83 -7.833 14.65 9873.8 2.240 663.9 0.998 7.55 1.165 1.000 1.080 -22.72 -22.82 -7.783 14.69 11078.6 2.240 703.2 0.998 7.60 1.165 1.000 1.080 -22.70 -22.80 -7.733 14.72 12430.4 2.240 744.9 0.998 7.65 1.165 1.000 1.080 -22.68 -22.78 -7.683 14.75 13947.1 2.240 789.0 0.998 7.70 1.165 1.000 1.080 -22.66 -22.76 -7.633 14.78 15648.9 2.240 835.8 0.998 7.75 1.165 1.000 1.080 -22.65 -22.75 -7.583 14.82 17558.4 2.240 885.3 0.998 7.80 1.165 1.000 1.080 -22.63 -22.73 -7.533 14.85 19700.9 2.240 937.7 0.998 7.85 1.165 1.000 1.080 -22.61 -22.71 -7.483 14.88 22104.8 2.240 993.3 0.998 7.90 1.165 1.000 1.080 -22.59 -22.69 -7.433 14.91 24802.0 2.240 1052.2 0.998 7.95 1.165 1.000 1.080 -22.57 -22.67 -7.383 14.94 27828.3 2.240 1114.5 0.998 8.00 1.165 1.000 1.080 -22.55 -22.65 -7.333 14.97 31223.9 2.240 1180.5 0.998 8.05 1.165 1.000 1.080 -22.53 -22.63 -7.283 15.00 35033.9 2.240 1250.5 0.998 8.10 1.165 1.000 1.080 -22.50 -22.60 -7.233 15.02 39308.7 2.240 1324.6 0.998 8.15 1.165 1.000 1.080 -22.48 -22.58 -7.183 15.05 44105.1 2.240 1403.1 0.998 8.20 1.165 1.000 1.080 -22.46 -22.56 -7.133 15.08 49486.8 2.240 1486.2 0.998 8.25 1.165 1.000 1.080 -22.44 -22.54 -7.083 15.11 55525.1 2.240 1574.3 0.998 8.30 1.165 1.000 1.080 -22.42 -22.52 -7.033 15.14 62300.2 2.240 1667.6 0.998 8.35 1.165 1.000 1.080 -22.40 -22.50 -6.983 15.17 69902.0 2.240 1766.4 0.998 8.40 1.165 1.000 1.080 -22.38 -22.48 -6.933 15.20 78431.4 2.240 1871.0 0.998 8.45 1.165 1.000 1.080 -22.35 -22.45 -6.883 15.22 88001.5 2.240 1981.9 0.998 8.50 1.165 1.000 1.080 -22.33 -22.43 -6.833 15.25 98739.4 2.240 2099.3 
0.998pydl-0.7.0/pydl/pydlutils/data/cooling/README.rst0000644000076500000240000000510512672535402022144 0ustar weaverstaff00000000000000===========================
ABOUT THE COOLING FUNCTIONS
===========================

Introduction
------------

This file has been adapted from http://www.mso.anu.edu.au/~ralph/data/cool/ABOUT.txt.
The associated files contain the cooling functions calculated for the paper by
`Sutherland & Dopita (1993) `_.  See paper or preprint for details.

FILE LIST
---------

Collisional Ionisation Equilibrium cooling functions as a function of
metallicity are given in the files:

mzero.cie   cie cooling for primordial hydrogen/helium mix, log(T) = 4-8.5
m-00.cie    cie cooling for solar abundance mix, log(T) = 4-8.5
m-05.cie    [Fe/H] = -0.5, solar/primordial average ratios
m-10.cie    [Fe/H] = -1.0, primordial ratios (i.e. enhanced oxygen)
m-15.cie    [Fe/H] = -1.5, primordial ratios
m-20.cie    [Fe/H] = -2.0, primordial ratios
m-30.cie    [Fe/H] = -3.0, primordial ratios
m+05.cie    [Fe/H] = +0.5, solar ratios, log(T) = 4.1-8.5
            (due to charge exchange problems at log(T) = 4.0)

FILE FORMAT
-----------

The data files consist of tab separated ASCII column data described as follows:

log(T)
    log temperature in K.

ne, nH, nt
    number densities of electrons, hydrogen and total ions, in cm^-3.

log(lambda net), log(lambda norm)
    log of the net cooling function and of the normalised cooling function,
    where lambda norm = lambda net / (ne nt).  lambda net is in
    ergs cm^-3 s^-1, lambda norm in ergs cm^3 s^-1.  While the densities are
    kept less than about p/k = 10^8, both isobaric and isochoric curves can
    be constructed from the normalised function using an appropriate density
    function.  The non-equilibrium net cooling function is from the isobaric
    model used to calculate the curves.  In the CIE curves the net function
    is for the isochoric cie model.

log(U)
    U = 3/2 N kT, N = ne + nt, the total internal energy, in ergs cm^-3.

log(taucool)
    The normalised cooling timescale Nr*(U/(lambda net)),
    Nr = (ne nt)/(ne+nt), in s cm^-3.

P12
    Gas pressure NkT, in ergs cm^-3 times 10^12.

rho24
    Density in g cm^-3 times 10^24.

Ci
    The isothermal sound speed in km s^-1.

mubar
    Mean molecular weight in grams times 10^24.

NOTES
-----

Tests showed that density quenching begins to affect the curves in the low
temperature parts of the cooling functions when log(p/k) = 8 or more.  Above
log(T) = 7.5-8.5 it should be pretty safe to use a powerlaw fit to the
free-free losses.

Good luck, and let me know if you have any problems.  These files will be
available via anonymous ftp from

Yours sincerely

Ralph S. Sutherland 1992-1993.
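
The FILE FORMAT description above is enough to write a standalone reader for
these ``.cie`` tables.  The sketch below is illustrative only, not code from
this package (pydl provides its own reader for these files; see
``pydl.pydlutils.cooling``): it assumes each data row carries the twelve
whitespace-separated columns listed above, and it keeps the two columns most
applications need, log(T) and log(lambda norm)::

    import numpy as np

    def read_cie(filename):
        """Read one Sutherland & Dopita CIE cooling table (.cie file).

        Returns log10(T) in K and log10(lambda norm) in erg cm^3 s^-1,
        i.e. columns 1 and 6 of the format described in the README.
        """
        logT, loglam_norm = [], []
        with open(filename) as f:
            for line in f:
                fields = line.split()
                # Data rows have exactly the 12 numeric columns listed in
                # the README; the title and column-name lines do not.
                if len(fields) != 12:
                    continue
                try:
                    row = [float(x) for x in fields]
                except ValueError:
                    continue
                logT.append(row[0])
                loglam_norm.append(row[5])
        return np.array(logT), np.array(loglam_norm)

With arrays like these, ``numpy.interp`` over log(T) gives a quick
interpolated cooling curve; the package's own reader should be preferred in
real code.
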
pydl-0.7.0/pydl/pydlutils/data/cooling/m-10.cie0000644000076500000240000001544712672535402021623 0ustar weaverstaff00000000000000Cooling Curve : CIE:[Fe/H] = -1.0 log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0022 1.000 1.064 -25.94 -23.31 -11.66 11.63 1.483 2.125 8.354 1.992 4.05 0.0141 1.000 1.064 -24.61 -22.78 -11.60 11.15 1.683 2.125 8.898 1.971 4.10 0.0702 1.000 1.064 -23.49 -22.36 -11.53 10.78 1.986 2.125 9.667 1.873 4.15 0.2529 1.000 1.064 -22.60 -22.03 -11.41 10.50 2.587 2.125 11.03 1.613 4.20 0.5667 1.000 1.064 -22.10 -21.88 -11.27 10.40 3.595 2.126 13.00 1.303 4.25 0.8163 1.000 1.064 -21.95 -21.89 -11.16 10.45 4.650 2.126 14.79 1.130 4.30 0.9312 1.000 1.064 -21.99 -21.98 -11.08 10.60 5.537 2.126 16.14 1.065 4.35 0.9758 1.000 1.064 -22.07 -22.08 -11.02 10.75 6.351 2.126 17.28 1.042 4.40 0.9984 1.000 1.064 -22.13 -22.16 -10.97 10.87 7.205 2.126 18.41 1.031 4.45 1.021 1.000 1.064 -22.15 -22.19 -10.91 10.95 8.172 2.126 19.61 1.020 4.50 1.043 1.000 1.064 -22.15 -22.19 -10.86 11.01 9.265 2.126 20.88 1.009 4.55 1.055 1.000 1.064 -22.11 -22.16 -10.81 11.03 10.46 2.126 22.18 1.003 4.60 1.061 1.000 1.064 -22.06 -22.11 -10.76 11.03 11.76 2.126 23.52 1.000 4.65 1.063 1.000 1.064 -21.98 -22.04 -10.71 11.00 13.21 2.126 24.93 0.999 4.70 1.064 1.000 1.064 -21.90 -21.95 -10.66 10.97 14.83 2.126 26.41 0.999 4.75 1.066 1.000 1.064 -21.81 -21.86 -10.61 10.93 16.66 2.126 27.99 0.998 4.80 1.071 1.000 1.064 -21.72 -21.77 -10.55 10.89 18.73 2.126 29.68 0.996 4.85 1.082 1.000 1.064 -21.63 -21.69 -10.50 10.86 21.13 2.126 31.53 0.990 4.90 1.099 1.000 1.064 -21.57 -21.64 -10.45 10.85 23.90 2.126 33.53 0.983 4.95 1.114 1.000 1.064 -21.54 -21.61 -10.40 10.88 26.99 2.126 35.63 0.976 5.00 1.122 1.000 1.064 -21.53 -21.60 -10.34 10.92 30.40 2.126 37.81 0.973 5.05 1.126 1.000 1.064 -21.52 -21.60 -10.29 10.96 34.17 2.126 40.09 0.971 5.10 1.127 1.000 1.064 -21.50 -21.58 -10.24 11.00 38.37 2.126 42.48 0.970 5.15 1.128 1.000 1.064 -21.47 -21.55 -10.19 11.02 43.07 2.126 45.01 0.970 5.20 1.129 1.000 1.064 -21.44 -21.52 -10.14 11.04 48.33 2.126 47.68 0.969 5.25 1.129 1.000 1.064 -21.41 -21.49 -10.09 11.06 54.23 2.126 50.51 0.969 5.30 1.129 1.000 1.064 -21.38 -21.46 -10.04 11.08 60.86 2.126 53.50 0.969 5.35 1.129 1.000 1.064 -21.36 -21.44 -9.993 11.11 68.29 2.126 56.67 0.969 5.40 1.129 1.000 1.064 -21.40 -21.48 -9.943 11.20 76.63 2.126 60.03 0.969 5.45 1.130 1.000 1.064 -21.53 -21.61 -9.893 11.38 85.99 2.126 63.59 0.969 5.50 1.130 1.000 1.064 -21.75 -21.84 -9.843 11.65 96.49 2.126 67.37 0.969 5.55 1.130 1.000 1.064 -21.95 -22.03 -9.793 11.90 108.3 2.126 71.36 0.969 5.60 1.130 1.000 1.064 -22.07 -22.15 -9.743 12.07 121.5 2.126 75.59 0.969 5.65 1.130 1.000 1.064 -22.11 -22.19 -9.693 12.16 136.3 2.126 80.07 0.969 5.70 1.130 1.000 1.064 -22.12 -22.20 -9.643 12.22 152.9 2.126 84.81 0.969 5.75 1.130 1.000 1.064 -22.14 -22.22 -9.593 12.29 171.6 2.126 89.84 0.969 5.80 1.130 1.000 1.064 -22.20 -22.28 -9.543 12.40 192.5 2.126 95.16 0.969 5.85 1.130 1.000 1.064 -22.33 -22.41 -9.493 12.58 216.0 2.126 100.8 0.969 5.90 1.130 1.000 1.064 -22.45 -22.53 -9.443 12.75 242.4 2.126 106.8 0.969 5.95 1.130 1.000 1.064 -22.51 -22.59 -9.393 12.86 272.0 2.126 113.1 0.969 6.00 1.130 1.000 1.064 -22.55 -22.63 -9.343 12.95 305.2 2.126 119.8 0.969 6.05 1.130 1.000 1.064 -22.58 -22.66 -9.293 13.03 342.4 2.126 126.9 0.969 6.10 1.131 1.000 1.064 -22.60 -22.68 -9.242 13.10 384.2 2.126 134.4 0.969 6.15 1.131 1.000 1.064 -22.59 -22.67 -9.192 13.14 431.1 2.126 142.4 0.969 6.20 1.131 1.000 1.064 -22.59 -22.67 
-9.142 13.19 483.7 2.126 150.8 0.969 6.25 1.131 1.000 1.064 -22.61 -22.69 -9.092 13.26 542.8 2.126 159.8 0.969 6.30 1.131 1.000 1.064 -22.67 -22.75 -9.042 13.37 609.0 2.126 169.2 0.969 6.35 1.131 1.000 1.064 -22.74 -22.82 -8.992 13.49 683.3 2.126 179.3 0.969 6.40 1.131 1.000 1.064 -22.80 -22.88 -8.942 13.60 766.7 2.126 189.9 0.969 6.45 1.131 1.000 1.064 -22.84 -22.92 -8.892 13.69 860.3 2.126 201.2 0.969 6.50 1.131 1.000 1.064 -22.87 -22.95 -8.842 13.77 965.3 2.126 213.1 0.968 6.55 1.131 1.000 1.064 -22.89 -22.98 -8.792 13.84 1083.1 2.126 225.7 0.968 6.60 1.131 1.000 1.064 -22.91 -22.99 -8.742 13.91 1215.3 2.126 239.1 0.968 6.65 1.131 1.000 1.064 -22.92 -23.00 -8.692 13.97 1363.6 2.126 253.2 0.968 6.70 1.131 1.000 1.064 -22.93 -23.01 -8.642 14.03 1530.0 2.126 268.3 0.968 6.75 1.131 1.000 1.064 -22.94 -23.02 -8.592 14.09 1716.7 2.126 284.2 0.968 6.80 1.131 1.000 1.064 -22.94 -23.02 -8.542 14.14 1926.2 2.126 301.0 0.968 6.85 1.131 1.000 1.064 -22.94 -23.02 -8.492 14.19 2161.3 2.126 318.8 0.968 6.90 1.131 1.000 1.064 -22.93 -23.01 -8.442 14.23 2425.0 2.126 337.7 0.968 6.95 1.131 1.000 1.064 -22.92 -23.00 -8.392 14.27 2720.9 2.126 357.7 0.968 7.00 1.131 1.000 1.064 -22.91 -22.99 -8.342 14.31 3052.9 2.126 378.9 0.968 7.05 1.131 1.000 1.064 -22.90 -22.98 -8.292 14.35 3425.5 2.126 401.4 0.968 7.10 1.131 1.000 1.064 -22.90 -22.98 -8.242 14.40 3843.4 2.126 425.2 0.968 7.15 1.131 1.000 1.064 -22.90 -22.98 -8.192 14.45 4312.4 2.126 450.4 0.968 7.20 1.131 1.000 1.064 -22.89 -22.97 -8.142 14.49 4838.6 2.126 477.0 0.968 7.25 1.131 1.000 1.064 -22.88 -22.96 -8.092 14.53 5429.1 2.126 505.3 0.968 7.30 1.131 1.000 1.064 -22.87 -22.95 -8.042 14.57 6091.5 2.126 535.3 0.968 7.35 1.131 1.000 1.064 -22.85 -22.94 -7.992 14.60 6834.8 2.126 567.0 0.968 7.40 1.131 1.000 1.064 -22.84 -22.92 -7.942 14.64 7668.8 2.126 600.6 0.968 7.45 1.131 1.000 1.064 -22.82 -22.90 -7.892 14.67 8604.5 2.126 636.2 0.968 7.50 1.131 1.000 1.064 -22.81 -22.89 -7.842 14.71 9654.4 2.126 673.9 0.968 7.55 1.131 1.000 1.064 -22.79 -22.87 -7.792 14.74 10832.4 2.126 713.8 0.968 7.60 1.131 1.000 1.064 -22.77 -22.85 -7.742 14.77 12154.2 2.126 756.1 0.968 7.65 1.131 1.000 1.064 -22.75 -22.83 -7.692 14.80 13637.2 2.126 800.9 0.968 7.70 1.131 1.000 1.064 -22.73 -22.81 -7.642 14.83 15301.2 2.126 848.3 0.968 7.75 1.131 1.000 1.064 -22.71 -22.79 -7.592 14.86 17168.3 2.126 898.6 0.968 7.80 1.131 1.000 1.064 -22.69 -22.77 -7.542 14.89 19263.1 2.126 951.8 0.968 7.85 1.131 1.000 1.064 -22.67 -22.75 -7.492 14.92 21613.6 2.126 1008.2 0.968 7.90 1.131 1.000 1.064 -22.65 -22.73 -7.442 14.95 24250.9 2.126 1068.0 0.968 7.95 1.131 1.000 1.064 -22.62 -22.70 -7.392 14.97 27209.9 2.126 1131.3 0.968 8.00 1.131 1.000 1.064 -22.60 -22.68 -7.342 15.00 30530.1 2.126 1198.3 0.968 8.05 1.131 1.000 1.064 -22.58 -22.66 -7.292 15.03 34255.3 2.126 1269.3 0.968 8.10 1.131 1.000 1.064 -22.56 -22.64 -7.242 15.06 38435.1 2.126 1344.5 0.968 8.15 1.131 1.000 1.064 -22.54 -22.62 -7.192 15.09 43124.9 2.126 1424.2 0.968 8.20 1.131 1.000 1.064 -22.51 -22.60 -7.142 15.11 48387.0 2.126 1508.6 0.968 8.25 1.131 1.000 1.064 -22.49 -22.57 -7.092 15.14 54291.1 2.126 1598.0 0.968 8.30 1.131 1.000 1.064 -22.47 -22.55 -7.042 15.17 60915.6 2.126 1692.7 0.968 8.35 1.131 1.000 1.064 -22.45 -22.53 -6.992 15.20 68348.4 2.126 1792.9 0.968 8.40 1.131 1.000 1.064 -22.42 -22.51 -6.942 15.22 76688.2 2.126 1899.2 0.968 8.45 1.131 1.000 1.064 -22.40 -22.48 -6.892 15.25 86045.6 2.126 2011.7 0.968 8.50 1.131 1.000 1.064 -22.38 -22.46 -6.842 15.28 96544.8 2.126 2130.9 
0.968pydl-0.7.0/pydl/pydlutils/data/cooling/m-00.cie0000644000076500000240000001544512672535402021620 0ustar weaverstaff00000000000000Cooling Curve : CIE:Solar log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0023 1.000 1.099 -25.65 -23.06 -11.64 11.37 1.532 2.386 8.012 2.166 4.05 0.0142 1.000 1.099 -24.27 -22.46 -11.59 10.83 1.737 2.386 8.533 2.143 4.10 0.0704 1.000 1.099 -23.28 -22.17 -11.52 10.58 2.048 2.386 9.264 2.040 4.15 0.2534 1.000 1.099 -22.47 -21.92 -11.40 10.38 2.657 2.386 10.55 1.764 4.20 0.5678 1.000 1.099 -21.99 -21.79 -11.26 10.30 3.674 2.387 12.41 1.431 4.25 0.8177 1.000 1.099 -21.84 -21.80 -11.15 10.36 4.740 2.387 14.09 1.245 4.30 0.9324 1.000 1.099 -21.85 -21.86 -11.08 10.48 5.637 2.387 15.37 1.175 4.35 0.9780 1.000 1.099 -21.87 -21.90 -11.02 10.57 6.467 2.387 16.46 1.149 4.40 1.004 1.000 1.099 -21.84 -21.88 -10.96 10.60 7.347 2.387 17.54 1.135 4.45 1.035 1.000 1.099 -21.77 -21.82 -10.90 10.59 8.366 2.387 18.72 1.118 4.50 1.067 1.000 1.099 -21.66 -21.73 -10.85 10.55 9.528 2.387 19.98 1.102 4.55 1.086 1.000 1.099 -21.56 -21.63 -10.79 10.50 10.78 2.387 21.25 1.092 4.60 1.094 1.000 1.099 -21.45 -21.53 -10.74 10.45 12.14 2.387 22.56 1.088 4.65 1.098 1.000 1.099 -21.34 -21.42 -10.69 10.39 13.65 2.387 23.91 1.087 4.70 1.100 1.000 1.099 -21.24 -21.32 -10.64 10.34 15.32 2.387 25.34 1.086 4.75 1.102 1.000 1.099 -21.14 -21.22 -10.59 10.29 17.22 2.387 26.86 1.084 4.80 1.110 1.000 1.099 -21.05 -21.14 -10.54 10.25 19.38 2.387 28.50 1.080 4.85 1.128 1.000 1.099 -20.97 -21.07 -10.49 10.23 21.92 2.387 30.31 1.072 4.90 1.154 1.000 1.099 -20.91 -21.01 -10.43 10.23 24.89 2.387 32.29 1.059 4.95 1.176 1.000 1.099 -20.87 -20.98 -10.38 10.25 28.19 2.387 34.37 1.049 5.00 1.188 1.000 1.099 -20.87 -20.99 -10.32 10.30 31.81 2.387 36.50 1.043 5.05 1.195 1.000 1.099 -20.90 -21.02 -10.27 10.38 35.79 2.387 38.72 1.041 5.10 1.198 1.000 1.099 -20.91 -21.03 -10.22 10.45 40.21 2.387 41.04 1.039 5.15 1.199 1.000 1.099 -20.89 -21.01 -10.17 10.48 45.14 2.387 43.49 1.039 5.20 1.200 1.000 1.099 -20.86 -20.98 -10.12 10.50 50.67 2.387 46.07 1.038 5.25 1.201 1.000 1.099 -20.85 -20.97 -10.07 10.54 56.87 2.387 48.81 1.038 5.30 1.201 1.000 1.099 -20.84 -20.96 -10.02 10.58 63.82 2.387 51.71 1.038 5.35 1.201 1.000 1.099 -20.84 -20.96 -9.972 10.63 71.62 2.387 54.78 1.038 5.40 1.202 1.000 1.099 -20.87 -20.99 -9.922 10.71 80.38 2.387 58.03 1.037 5.45 1.203 1.000 1.099 -21.01 -21.13 -9.872 10.90 90.22 2.387 61.48 1.037 5.50 1.203 1.000 1.099 -21.23 -21.35 -9.822 11.17 101.2 2.387 65.13 1.037 5.55 1.204 1.000 1.099 -21.43 -21.55 -9.772 11.42 113.6 2.387 68.99 1.037 5.60 1.204 1.000 1.099 -21.54 -21.66 -9.722 11.58 127.5 2.387 73.08 1.036 5.65 1.204 1.000 1.099 -21.58 -21.71 -9.672 11.67 143.1 2.387 77.41 1.036 5.70 1.204 1.000 1.099 -21.59 -21.71 -9.622 11.73 160.5 2.387 82.00 1.036 5.75 1.204 1.000 1.099 -21.59 -21.71 -9.572 11.78 180.1 2.387 86.87 1.036 5.80 1.204 1.000 1.099 -21.64 -21.76 -9.522 11.88 202.1 2.387 92.02 1.036 5.85 1.205 1.000 1.099 -21.74 -21.86 -9.471 12.03 226.8 2.387 97.47 1.036 5.90 1.205 1.000 1.099 -21.81 -21.93 -9.421 12.15 254.5 2.387 103.3 1.036 5.95 1.205 1.000 1.099 -21.83 -21.95 -9.371 12.22 285.6 2.387 109.4 1.036 6.00 1.205 1.000 1.099 -21.84 -21.96 -9.321 12.28 320.4 2.387 115.9 1.036 6.05 1.205 1.000 1.099 -21.84 -21.96 -9.271 12.33 359.6 2.387 122.7 1.036 6.10 1.206 1.000 1.099 -21.83 -21.96 -9.221 12.37 403.5 2.387 130.0 1.036 6.15 1.206 1.000 1.099 -21.82 -21.95 -9.171 12.41 452.8 2.387 137.7 1.036 6.20 1.206 1.000 1.099 -21.82 -21.94 
-9.121 12.46 508.1 2.387 145.9 1.035 6.25 1.206 1.000 1.099 -21.85 -21.97 -9.071 12.54 570.1 2.387 154.5 1.035 6.30 1.207 1.000 1.099 -21.95 -22.07 -9.021 12.69 639.8 2.387 163.7 1.035 6.35 1.207 1.000 1.099 -22.08 -22.20 -8.971 12.87 717.9 2.387 173.4 1.035 6.40 1.207 1.000 1.099 -22.19 -22.31 -8.921 13.03 805.6 2.387 183.7 1.035 6.45 1.208 1.000 1.099 -22.27 -22.39 -8.871 13.16 904.0 2.387 194.6 1.035 6.50 1.208 1.000 1.099 -22.32 -22.44 -8.821 13.26 1014.5 2.387 206.1 1.035 6.55 1.208 1.000 1.099 -22.35 -22.48 -8.771 13.34 1138.3 2.387 218.4 1.035 6.60 1.208 1.000 1.099 -22.38 -22.50 -8.721 13.42 1277.3 2.387 231.3 1.035 6.65 1.208 1.000 1.099 -22.41 -22.53 -8.671 13.50 1433.2 2.387 245.0 1.034 6.70 1.208 1.000 1.099 -22.44 -22.56 -8.621 13.58 1608.2 2.387 259.6 1.034 6.75 1.208 1.000 1.099 -22.46 -22.59 -8.571 13.65 1804.4 2.387 274.9 1.034 6.80 1.208 1.000 1.099 -22.48 -22.60 -8.521 13.72 2024.7 2.387 291.2 1.034 6.85 1.209 1.000 1.099 -22.48 -22.60 -8.471 13.77 2271.8 2.387 308.5 1.034 6.90 1.209 1.000 1.099 -22.46 -22.59 -8.421 13.80 2549.1 2.387 326.8 1.034 6.95 1.209 1.000 1.099 -22.45 -22.57 -8.371 13.84 2860.2 2.387 346.1 1.034 7.00 1.209 1.000 1.099 -22.45 -22.57 -8.321 13.89 3209.2 2.387 366.7 1.034 7.05 1.209 1.000 1.099 -22.46 -22.59 -8.271 13.95 3600.9 2.387 388.4 1.034 7.10 1.209 1.000 1.099 -22.49 -22.62 -8.221 14.03 4040.4 2.387 411.4 1.034 7.15 1.209 1.000 1.099 -22.53 -22.65 -8.171 14.12 4533.5 2.387 435.8 1.034 7.20 1.209 1.000 1.099 -22.56 -22.68 -8.121 14.20 5086.8 2.387 461.6 1.034 7.25 1.209 1.000 1.099 -22.58 -22.70 -8.071 14.27 5707.5 2.387 489.0 1.034 7.30 1.209 1.000 1.099 -22.60 -22.72 -8.021 14.34 6404.0 2.387 518.0 1.034 7.35 1.209 1.000 1.099 -22.61 -22.73 -7.971 14.40 7185.4 2.387 548.6 1.034 7.40 1.209 1.000 1.099 -22.61 -22.73 -7.921 14.45 8062.2 2.387 581.2 1.034 7.45 1.209 1.000 1.099 -22.61 -22.73 -7.871 14.50 9046.0 2.387 615.6 1.034 7.50 1.209 1.000 1.099 -22.60 -22.73 -7.821 14.54 10149.8 2.387 652.1 1.034 7.55 1.209 1.000 1.099 -22.60 -22.72 -7.771 14.59 11388.3 2.387 690.7 1.034 7.60 1.209 1.000 1.099 -22.59 -22.71 -7.721 14.63 12777.9 2.387 731.6 1.034 7.65 1.209 1.000 1.099 -22.57 -22.70 -7.671 14.66 14337.1 2.387 775.0 1.034 7.70 1.209 1.000 1.099 -22.56 -22.68 -7.621 14.70 16086.6 2.387 820.9 1.034 7.75 1.209 1.000 1.099 -22.54 -22.67 -7.571 14.73 18049.5 2.387 869.6 1.034 7.80 1.209 1.000 1.099 -22.53 -22.65 -7.521 14.77 20251.9 2.387 921.1 1.034 7.85 1.209 1.000 1.099 -22.51 -22.64 -7.471 14.80 22723.1 2.387 975.7 1.034 7.90 1.209 1.000 1.099 -22.49 -22.62 -7.421 14.83 25495.8 2.387 1033.5 1.034 7.95 1.209 1.000 1.099 -22.48 -22.60 -7.371 14.87 28606.9 2.387 1094.7 1.034 8.00 1.209 1.000 1.099 -22.46 -22.58 -7.321 14.90 32097.6 2.387 1159.6 1.034 8.05 1.209 1.000 1.099 -22.44 -22.56 -7.271 14.93 36014.2 2.387 1228.3 1.034 8.10 1.209 1.000 1.099 -22.42 -22.54 -7.221 14.96 40408.7 2.387 1301.1 1.034 8.15 1.209 1.000 1.099 -22.40 -22.53 -7.171 14.99 45339.4 2.387 1378.2 1.034 8.20 1.209 1.000 1.099 -22.38 -22.51 -7.121 15.02 50871.7 2.387 1459.8 1.034 8.25 1.209 1.000 1.099 -22.36 -22.49 -7.071 15.05 57079.1 2.387 1546.3 1.034 8.30 1.209 1.000 1.099 -22.34 -22.47 -7.021 15.08 64044.0 2.387 1638.0 1.034 8.35 1.209 1.000 1.099 -22.32 -22.45 -6.971 15.11 71858.6 2.387 1735.0 1.034 8.40 1.209 1.000 1.099 -22.30 -22.43 -6.921 15.14 80626.8 2.387 1837.8 1.034 8.45 1.209 1.000 1.099 -22.28 -22.40 -6.871 15.17 90464.8 2.387 1946.7 1.034 8.50 1.209 1.000 1.099 -22.26 -22.38 -6.821 15.20 101503 2.387 2062.1 1.034 
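
As a worked example of the quantities defined in the cooling README above
(U = 3/2 N kT with N = ne + nt, and lambda net = lambda norm * ne * nt), the
sketch below turns the normalised cooling function into a cooling timescale
for assumed densities.  It is illustrative only and reuses the hypothetical
``read_cie`` helper sketched earlier; ``K_B`` is the Boltzmann constant in
CGS units::

    import numpy as np

    K_B = 1.380649e-16  # Boltzmann constant [erg/K]

    def cooling_time(logT, loglam_norm, n_e, n_t):
        """Cooling timescale in seconds for an electron density n_e and a
        total ion density n_t (both in cm^-3), following the README:

            lambda net = lambda norm * n_e * n_t   [erg cm^-3 s^-1]
            U          = 3/2 (n_e + n_t) k_B T     [erg cm^-3]
            tau_cool   = U / lambda net            [s]
        """
        T = 10.0 ** np.asarray(logT)
        lam_net = 10.0 ** np.asarray(loglam_norm) * n_e * n_t
        u_gas = 1.5 * (n_e + n_t) * K_B * T
        return u_gas / lam_net

Multiplying the result by ne*nt/(ne + nt) recovers the normalised taucool
column tabulated in these files, which is a useful cross-check of any reader.
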
pydl-0.7.0/pydl/pydlutils/data/cooling/m-15.cie0000644000076500000240000001544712672535402021630 0ustar weaverstaff00000000000000Cooling Curve : CIE:[Fe/H] = -1.5 log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.00 0.0022 1.000 1.064 -26.00 -23.38 -11.66 11.69 1.483 2.118 8.368 1.986 4.05 0.0141 1.000 1.064 -24.66 -22.84 -11.60 11.20 1.682 2.118 8.913 1.964 4.10 0.0702 1.000 1.064 -23.52 -22.39 -11.53 10.81 1.986 2.118 9.683 1.867 4.15 0.2525 1.000 1.064 -22.62 -22.05 -11.41 10.52 2.586 2.118 11.05 1.609 4.20 0.5666 1.000 1.064 -22.11 -21.89 -11.27 10.41 3.594 2.118 13.03 1.299 4.25 0.8163 1.000 1.064 -21.97 -21.91 -11.16 10.47 4.650 2.118 14.82 1.127 4.30 0.9311 1.000 1.064 -22.01 -22.01 -11.08 10.62 5.535 2.118 16.16 1.062 4.35 0.9756 1.000 1.064 -22.11 -22.12 -11.02 10.79 6.349 2.118 17.31 1.039 4.40 0.9981 1.000 1.064 -22.20 -22.23 -10.97 10.94 7.203 2.119 18.44 1.027 4.45 1.021 1.000 1.064 -22.27 -22.30 -10.91 11.07 8.170 2.119 19.64 1.016 4.50 1.042 1.000 1.064 -22.31 -22.35 -10.86 11.17 9.263 2.119 20.91 1.006 4.55 1.055 1.000 1.064 -22.33 -22.38 -10.81 11.25 10.46 2.119 22.22 1.000 4.60 1.060 1.000 1.064 -22.34 -22.39 -10.76 11.31 11.76 2.119 23.56 0.997 4.65 1.063 1.000 1.064 -22.32 -22.37 -10.71 11.34 13.21 2.119 24.97 0.996 4.70 1.064 1.000 1.064 -22.26 -22.31 -10.66 11.33 14.83 2.119 26.46 0.996 4.75 1.066 1.000 1.064 -22.17 -22.23 -10.61 11.29 16.65 2.119 28.04 0.995 4.80 1.070 1.000 1.064 -22.06 -22.12 -10.55 11.23 18.73 2.119 29.73 0.992 4.85 1.082 1.000 1.064 -21.95 -22.01 -10.50 11.18 21.13 2.119 31.58 0.987 4.90 1.099 1.000 1.064 -21.88 -21.95 -10.45 11.16 23.89 2.119 33.58 0.979 4.95 1.113 1.000 1.064 -21.86 -21.94 -10.40 11.20 26.98 2.119 35.69 0.973 5.00 1.121 1.000 1.064 -21.88 -21.96 -10.34 11.27 30.39 2.119 37.87 0.970 5.05 1.125 1.000 1.064 -21.90 -21.98 -10.29 11.34 34.15 2.119 40.15 0.968 5.10 1.127 1.000 1.064 -21.91 -21.99 -10.24 11.41 38.35 2.119 42.55 0.967 5.15 1.128 1.000 1.064 -21.90 -21.98 -10.19 11.45 43.05 2.119 45.08 0.967 5.20 1.128 1.000 1.064 -21.88 -21.96 -10.14 11.48 48.31 2.119 47.75 0.967 5.25 1.128 1.000 1.064 -21.86 -21.94 -10.09 11.51 54.21 2.119 50.58 0.966 5.30 1.128 1.000 1.064 -21.85 -21.93 -10.04 11.55 60.83 2.119 53.58 0.966 5.35 1.128 1.000 1.064 -21.84 -21.92 -9.993 11.59 68.25 2.119 56.76 0.966 5.40 1.128 1.000 1.064 -21.87 -21.95 -9.943 11.67 76.58 2.119 60.12 0.966 5.45 1.128 1.000 1.064 -22.00 -22.08 -9.893 11.85 85.93 2.119 63.69 0.966 5.50 1.129 1.000 1.064 -22.21 -22.29 -9.843 12.11 96.42 2.119 67.46 0.966 5.55 1.129 1.000 1.064 -22.39 -22.47 -9.793 12.34 108.2 2.119 71.46 0.966 5.60 1.129 1.000 1.064 -22.49 -22.57 -9.743 12.49 121.4 2.119 75.69 0.966 5.65 1.129 1.000 1.064 -22.54 -22.62 -9.693 12.59 136.2 2.119 80.18 0.966 5.70 1.129 1.000 1.064 -22.55 -22.63 -9.643 12.65 152.8 2.119 84.93 0.966 5.75 1.129 1.000 1.064 -22.56 -22.64 -9.593 12.71 171.5 2.119 89.96 0.966 5.80 1.129 1.000 1.064 -22.62 -22.70 -9.543 12.82 192.4 2.119 95.29 0.966 5.85 1.129 1.000 1.064 -22.73 -22.81 -9.493 12.98 215.9 2.119 100.9 0.966 5.90 1.129 1.000 1.064 -22.82 -22.90 -9.443 13.12 242.2 2.119 106.9 0.966 5.95 1.129 1.000 1.064 -22.86 -22.94 -9.393 13.21 271.8 2.119 113.3 0.966 6.00 1.129 1.000 1.064 -22.89 -22.97 -9.343 13.29 304.9 2.119 120.0 0.966 6.05 1.129 1.000 1.064 -22.92 -23.00 -9.293 13.37 342.1 2.119 127.1 0.966 6.10 1.129 1.000 1.064 -22.93 -23.01 -9.243 13.43 383.9 2.119 134.6 0.966 6.15 1.129 1.000 1.064 -22.93 -23.00 -9.193 13.48 430.7 2.119 142.6 0.966 6.20 1.129 1.000 1.064 -22.92 -23.00 
-9.143 13.52 483.3 2.119 151.0 0.966 6.25 1.129 1.000 1.064 -22.93 -23.01 -9.093 13.58 542.2 2.119 160.0 0.966 6.30 1.129 1.000 1.064 -22.97 -23.04 -9.043 13.67 608.4 2.119 169.5 0.966 6.35 1.129 1.000 1.064 -23.01 -23.09 -8.993 13.76 682.7 2.119 179.5 0.966 6.40 1.129 1.000 1.064 -23.04 -23.12 -8.943 13.84 766.0 2.119 190.1 0.966 6.45 1.129 1.000 1.064 -23.06 -23.14 -8.893 13.91 859.4 2.119 201.4 0.966 6.50 1.129 1.000 1.064 -23.07 -23.15 -8.843 13.97 964.3 2.119 213.3 0.966 6.55 1.129 1.000 1.064 -23.07 -23.15 -8.793 14.02 1082.0 2.119 226.0 0.966 6.60 1.129 1.000 1.064 -23.07 -23.15 -8.743 14.07 1214.0 2.119 239.4 0.966 6.65 1.129 1.000 1.064 -23.07 -23.15 -8.693 14.12 1362.2 2.119 253.6 0.966 6.70 1.129 1.000 1.064 -23.07 -23.15 -8.643 14.17 1528.4 2.119 268.6 0.966 6.75 1.129 1.000 1.064 -23.06 -23.14 -8.593 14.21 1714.8 2.119 284.5 0.966 6.80 1.129 1.000 1.064 -23.05 -23.13 -8.543 14.25 1924.1 2.119 301.4 0.966 6.85 1.129 1.000 1.064 -23.04 -23.12 -8.493 14.29 2158.9 2.119 319.2 0.966 6.90 1.129 1.000 1.064 -23.02 -23.10 -8.443 14.32 2422.3 2.119 338.1 0.966 6.95 1.129 1.000 1.064 -23.01 -23.09 -8.393 14.36 2717.9 2.119 358.2 0.966 7.00 1.129 1.000 1.064 -23.00 -23.08 -8.343 14.40 3049.5 2.119 379.4 0.966 7.05 1.129 1.000 1.064 -22.98 -23.06 -8.293 14.43 3421.6 2.119 401.9 0.966 7.10 1.129 1.000 1.064 -22.97 -23.05 -8.243 14.47 3839.1 2.119 425.7 0.966 7.15 1.129 1.000 1.064 -22.96 -23.04 -8.193 14.51 4307.6 2.119 450.9 0.966 7.20 1.129 1.000 1.064 -22.94 -23.02 -8.143 14.54 4833.2 2.119 477.6 0.966 7.25 1.129 1.000 1.064 -22.93 -23.00 -8.093 14.58 5422.9 2.119 505.9 0.966 7.30 1.129 1.000 1.064 -22.91 -22.99 -8.043 14.61 6084.6 2.119 535.9 0.966 7.35 1.129 1.000 1.064 -22.89 -22.97 -7.993 14.64 6827.1 2.119 567.7 0.966 7.40 1.129 1.000 1.064 -22.87 -22.95 -7.943 14.67 7660.1 2.119 601.3 0.966 7.45 1.129 1.000 1.064 -22.85 -22.93 -7.893 14.70 8594.8 2.119 636.9 0.966 7.50 1.129 1.000 1.064 -22.83 -22.91 -7.843 14.73 9643.5 2.119 674.7 0.966 7.55 1.129 1.000 1.064 -22.81 -22.89 -7.793 14.76 10820.2 2.119 714.6 0.966 7.60 1.129 1.000 1.064 -22.79 -22.87 -7.743 14.79 12140.4 2.119 757.0 0.966 7.65 1.129 1.000 1.064 -22.77 -22.85 -7.693 14.82 13621.8 2.119 801.8 0.966 7.70 1.129 1.000 1.064 -22.75 -22.83 -7.643 14.85 15283.9 2.119 849.4 0.966 7.75 1.129 1.000 1.064 -22.73 -22.81 -7.593 14.88 17148.8 2.119 899.7 0.966 7.80 1.129 1.000 1.064 -22.70 -22.78 -7.543 14.90 19241.3 2.119 953.0 0.966 7.85 1.129 1.000 1.064 -22.68 -22.76 -7.493 14.93 21589.1 2.119 1009.5 0.966 7.90 1.129 1.000 1.064 -22.66 -22.74 -7.443 14.96 24223.4 2.119 1069.3 0.966 7.95 1.129 1.000 1.064 -22.64 -22.72 -7.393 14.99 27179.1 2.119 1132.6 0.966 8.00 1.129 1.000 1.064 -22.62 -22.70 -7.343 15.02 30495.4 2.119 1199.7 0.966 8.05 1.129 1.000 1.064 -22.59 -22.67 -7.293 15.04 34216.4 2.119 1270.8 0.966 8.10 1.129 1.000 1.064 -22.57 -22.65 -7.243 15.07 38391.5 2.119 1346.1 0.966 8.15 1.129 1.000 1.064 -22.55 -22.63 -7.193 15.10 43075.9 2.119 1425.9 0.966 8.20 1.129 1.000 1.064 -22.53 -22.61 -7.143 15.13 48332.0 2.119 1510.4 0.966 8.25 1.129 1.000 1.064 -22.50 -22.58 -7.093 15.15 54229.4 2.119 1599.9 0.966 8.30 1.129 1.000 1.064 -22.48 -22.56 -7.043 15.18 60846.4 2.119 1694.7 0.966 8.35 1.129 1.000 1.064 -22.46 -22.54 -6.993 15.21 68270.8 2.119 1795.1 0.966 8.40 1.129 1.000 1.064 -22.44 -22.52 -6.943 15.24 76601.1 2.119 1901.5 0.966 8.45 1.129 1.000 1.064 -22.41 -22.49 -6.893 15.26 85947.8 2.119 2014.1 0.966 8.50 1.129 1.000 1.064 -22.39 -22.47 -6.843 15.29 96435.1 2.119 2133.5 
0.966pydl-0.7.0/pydl/pydlutils/data/cooling/m+05.cie0000644000076500000240000001525412672535402021621 0ustar weaverstaff00000000000000Cooling Curve : CIE:[Fe/H] = +0.5 log(T) ne nH nt log(lambda net) log(lambda norm) log(U) log(taucool) P12 rho24 Ci mubar 4.10 0.0711 1.000 1.103 -22.98 -21.88 -11.51 10.29 2.055 2.486 9.092 2.117 4.15 0.2547 1.000 1.103 -22.26 -21.71 -11.40 10.17 2.666 2.486 10.36 1.831 4.20 0.5700 1.000 1.103 -21.80 -21.60 -11.26 10.11 3.686 2.486 12.18 1.486 4.25 0.8202 1.000 1.103 -21.64 -21.60 -11.15 10.16 4.755 2.486 13.83 1.293 4.30 0.9356 1.000 1.103 -21.61 -21.62 -11.07 10.24 5.655 2.487 15.08 1.220 4.35 0.9812 1.000 1.103 -21.57 -21.60 -11.01 10.27 6.487 2.487 16.15 1.193 4.40 1.008 1.000 1.103 -21.48 -21.53 -10.96 10.24 7.371 2.487 17.22 1.178 4.45 1.039 1.000 1.103 -21.36 -21.42 -10.90 10.19 8.393 2.487 18.37 1.161 4.50 1.071 1.000 1.103 -21.22 -21.30 -10.85 10.11 9.559 2.487 19.61 1.144 4.55 1.090 1.000 1.103 -21.09 -21.17 -10.79 10.04 10.82 2.487 20.86 1.134 4.60 1.098 1.000 1.103 -20.97 -21.05 -10.74 9.969 12.19 2.487 22.14 1.130 4.65 1.102 1.000 1.103 -20.86 -20.94 -10.69 9.911 13.69 2.487 23.47 1.128 4.70 1.104 1.000 1.103 -20.75 -20.83 -10.64 9.852 15.38 2.487 24.87 1.127 4.75 1.107 1.000 1.103 -20.65 -20.74 -10.59 9.803 17.28 2.487 26.36 1.125 4.80 1.116 1.000 1.103 -20.57 -20.66 -10.54 9.776 19.46 2.487 27.98 1.121 4.85 1.134 1.000 1.103 -20.50 -20.59 -10.48 9.763 22.02 2.487 29.76 1.112 4.90 1.160 1.000 1.103 -20.43 -20.54 -10.43 9.753 25.00 2.487 31.71 1.099 4.95 1.183 1.000 1.103 -20.39 -20.50 -10.37 9.772 28.33 2.487 33.75 1.088 5.00 1.197 1.000 1.103 -20.39 -20.51 -10.32 9.827 31.97 2.487 35.86 1.082 5.05 1.204 1.000 1.103 -20.42 -20.54 -10.27 9.909 35.99 2.487 38.04 1.078 5.10 1.208 1.000 1.103 -20.42 -20.55 -10.22 9.961 40.44 2.487 40.33 1.076 5.15 1.210 1.000 1.103 -20.40 -20.52 -10.17 9.992 45.42 2.487 42.74 1.075 5.20 1.211 1.000 1.103 -20.37 -20.49 -10.12 10.01 50.99 2.487 45.28 1.075 5.25 1.212 1.000 1.103 -20.35 -20.48 -10.07 10.04 57.24 2.487 47.98 1.074 5.30 1.213 1.000 1.103 -20.34 -20.47 -10.02 10.08 64.26 2.487 50.83 1.074 5.35 1.215 1.000 1.103 -20.34 -20.46 -9.969 10.13 72.14 2.487 53.86 1.073 5.40 1.216 1.000 1.103 -20.37 -20.50 -9.919 10.21 81.00 2.487 57.07 1.072 5.45 1.219 1.000 1.103 -20.51 -20.63 -9.868 10.40 90.97 2.487 60.48 1.071 5.50 1.221 1.000 1.103 -20.73 -20.86 -9.818 10.68 102.20 2.487 64.09 1.070 5.55 1.221 1.000 1.103 -20.93 -21.06 -9.768 10.93 114.70 2.487 67.91 1.070 5.60 1.222 1.000 1.103 -21.05 -21.18 -9.717 11.10 128.70 2.487 71.94 1.070 5.65 1.222 1.000 1.103 -21.09 -21.22 -9.667 11.19 144.4 2.487 76.21 1.069 5.70 1.223 1.000 1.103 -21.09 -21.22 -9.617 11.24 162.1 2.487 80.73 1.069 5.75 1.223 1.000 1.103 -21.09 -21.22 -9.567 11.29 181.9 2.487 85.52 1.069 5.80 1.224 1.000 1.103 -21.14 -21.27 -9.517 11.39 204.1 2.487 90.60 1.069 5.85 1.224 1.000 1.103 -21.24 -21.38 -9.467 11.54 229.1 2.487 95.98 1.069 5.90 1.225 1.000 1.103 -21.32 -21.45 -9.417 11.67 257.1 2.487 101.70 1.068 5.95 1.226 1.000 1.103 -21.34 -21.47 -9.367 11.74 288.5 2.487 107.7 1.068 6.00 1.226 1.000 1.103 -21.35 -21.48 -9.317 11.80 323.8 2.487 114.1 1.068 6.05 1.227 1.000 1.103 -21.35 -21.48 -9.266 11.85 363.4 2.487 120.9 1.067 6.10 1.227 1.000 1.103 -21.34 -21.47 -9.216 11.89 407.9 2.487 128.1 1.067 6.15 1.228 1.000 1.103 -21.33 -21.46 -9.166 11.93 457.9 2.487 135.7 1.067 6.20 1.229 1.000 1.103 -21.32 -21.46 -9.116 11.97 513.9 2.487 143.8 1.066 6.25 1.230 1.000 1.103 -21.35 -21.49 -9.066 12.05 576.8 2.487 152.3 1.066 6.30 1.231 1.000 1.103 -21.46 
-21.59 -9.016 12.21 647.5 2.487 161.4 1.066 6.35 1.232 1.000 1.103 -21.59 -21.73 -8.966 12.39 726.8 2.487 171.0 1.065 6.40 1.233 1.000 1.103 -21.71 -21.85 -8.915 12.56 815.8 2.487 181.1 1.065 6.45 1.234 1.000 1.103 -21.80 -21.94 -8.865 12.70 915.7 2.487 191.9 1.064 6.50 1.234 1.000 1.103 -21.86 -22.00 -8.815 12.81 1027.7 2.487 203.3 1.064 6.55 1.235 1.000 1.103 -21.90 -22.03 -8.765 12.90 1153.3 2.487 215.4 1.064 6.60 1.235 1.000 1.103 -21.93 -22.07 -8.715 12.98 1294.3 2.487 228.1 1.064 6.65 1.236 1.000 1.103 -21.96 -22.10 -8.665 13.06 1452.5 2.487 241.7 1.063 6.70 1.236 1.000 1.103 -22.00 -22.14 -8.615 13.15 1629.9 2.487 256.0 1.063 6.75 1.236 1.000 1.103 -22.04 -22.17 -8.565 13.24 1829.0 2.487 271.2 1.063 6.80 1.237 1.000 1.103 -22.06 -22.20 -8.515 13.31 2052.3 2.487 287.3 1.063 6.85 1.237 1.000 1.103 -22.06 -22.20 -8.465 13.36 2303.0 2.487 304.3 1.063 6.90 1.237 1.000 1.103 -22.05 -22.18 -8.415 13.40 2584.2 2.487 322.4 1.063 6.95 1.237 1.000 1.103 -22.04 -22.17 -8.365 13.44 2899.7 2.487 341.5 1.063 7.00 1.237 1.000 1.103 -22.04 -22.17 -8.315 13.49 3253.8 2.487 361.7 1.063 7.05 1.237 1.000 1.103 -22.06 -22.20 -8.265 13.56 3651.2 2.487 383.2 1.063 7.10 1.238 1.000 1.103 -22.11 -22.24 -8.214 13.66 4097.0 2.487 405.9 1.063 7.15 1.238 1.000 1.103 -22.16 -22.30 -8.164 13.76 4597.2 2.487 430.0 1.062 7.20 1.238 1.000 1.103 -22.21 -22.35 -8.114 13.86 5158.4 2.487 455.4 1.062 7.25 1.238 1.000 1.103 -22.26 -22.40 -8.064 13.96 5788.0 2.487 482.4 1.062 7.30 1.238 1.000 1.103 -22.30 -22.43 -8.014 14.05 6494.4 2.487 511.0 1.062 7.35 1.238 1.000 1.103 -22.32 -22.46 -7.964 14.12 7287.0 2.487 541.3 1.062 7.40 1.238 1.000 1.103 -22.35 -22.48 -7.914 14.20 8176.3 2.487 573.4 1.062 7.45 1.238 1.000 1.103 -22.36 -22.49 -7.864 14.26 9174.0 2.487 607.4 1.062 7.50 1.238 1.000 1.103 -22.37 -22.50 -7.814 14.32 10293.6 2.487 643.4 1.062 7.55 1.238 1.000 1.103 -22.37 -22.51 -7.764 14.37 11549.7 2.487 681.5 1.062 7.60 1.238 1.000 1.103 -22.37 -22.51 -7.714 14.42 12959.1 2.487 721.9 1.062 7.65 1.238 1.000 1.103 -22.37 -22.51 -7.664 14.47 14540.5 2.487 764.7 1.062 7.70 1.238 1.000 1.103 -22.37 -22.50 -7.614 14.52 16314.8 2.487 810.0 1.062 7.75 1.238 1.000 1.103 -22.36 -22.49 -7.564 14.56 18305.7 2.487 858.0 1.062 7.80 1.238 1.000 1.103 -22.35 -22.48 -7.514 14.60 20539.6 2.487 908.8 1.062 7.85 1.238 1.000 1.103 -22.34 -22.47 -7.464 14.64 23046.0 2.487 962.7 1.062 7.90 1.238 1.000 1.103 -22.33 -22.46 -7.414 14.68 25858.3 2.487 1019.7 1.062 7.95 1.238 1.000 1.103 -22.31 -22.45 -7.364 14.71 29013.8 2.487 1080.1 1.062 8.00 1.238 1.000 1.103 -22.30 -22.44 -7.314 14.75 32554.3 2.487 1144.2 1.062 8.05 1.238 1.000 1.103 -22.29 -22.42 -7.264 14.79 36526.9 2.487 1211.9 1.062 8.10 1.238 1.000 1.103 -22.27 -22.41 -7.214 14.82 40984.2 2.487 1283.8 1.062 8.15 1.238 1.000 1.103 -22.26 -22.40 -7.164 14.86 45985.4 2.487 1359.8 1.062 8.20 1.238 1.000 1.103 -22.25 -22.38 -7.114 14.90 51596.8 2.487 1440.4 1.062 8.25 1.238 1.000 1.103 -22.23 -22.37 -7.064 14.93 57892.9 2.487 1525.8 1.062 8.30 1.238 1.000 1.103 -22.21 -22.35 -7.014 14.96 64957.3 2.487 1616.2 1.062 8.35 1.239 1.000 1.103 -22.20 -22.33 -6.964 15.00 72883.6 2.487 1712.0 1.062 8.40 1.239 1.000 1.103 -22.18 -22.32 -6.914 15.03 81777.0 2.487 1813.4 1.062 8.45 1.239 1.000 1.103 -22.16 -22.30 -6.864 15.06 91755.6 2.487 1920.9 1.062 8.50 1.239 1.000 1.103 -22.14 -22.28 -6.814 15.09 102952 2.487 2034.7 1.062 pydl-0.7.0/pydl/pydlutils/image.py0000644000076500000240000001570713434104050017543 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst 
# -*- coding: utf-8 -*- """This module corresponds to the image directory in idlutils. """ import numpy as np def djs_maskinterp1(yval, mask, xval=None, const=False): """Interpolate over a masked, 1-d array. Parameters ---------- yval : :class:`numpy.ndarray` The input values. mask : :class:`numpy.ndarray` The mask. xval : :class:`numpy.ndarray`, optional If set, use these x values, otherwise use an array. const : :class:`bool`, optional If set to ``True``, bad values around the edges of the array will be set to a constant value. Because of the default behavior of :func:`numpy.interp`, this value actually makes no difference in the output. Returns ------- :class:`numpy.ndarray` The `yval` array with masked values replaced by interpolated values. """ good = mask == 0 if good.all(): return yval ngood = good.sum() igood = good.nonzero()[0] if ngood == 0: return yval if ngood == 1: return np.zeros(yval.shape, dtype=yval.dtype) + yval[igood[0]] ynew = yval.astype('d') ny = yval.size ibad = (mask != 0).nonzero()[0] if xval is None: ynew[ibad] = np.interp(ibad, igood, ynew[igood]) if const: if igood[0] != 0: ynew[0:igood[0]] = ynew[igood[0]] if igood[ngood-1] != ny-1: ynew[igood[ngood-1]+1:ny] = ynew[igood[ngood-1]] else: ii = xval.argsort() ibad = (mask[ii] != 0).nonzero()[0] igood = (mask[ii] == 0).nonzero()[0] ynew[ii[ibad]] = np.interp(xval[ii[ibad]], xval[ii[igood]], ynew[ii[igood]]) if const: if igood[0] != 0: ynew[ii[0:igood[0]]] = ynew[ii[igood[0]]] if igood[ngood-1] != ny-1: ynew[ii[igood[ngood-1]+1:ny]] = ynew[ii[igood[ngood-1]]] return ynew def djs_maskinterp(yval, mask, xval=None, axis=None, const=False): """Interpolate over masked pixels in a vector, image or 3-D array. Parameters ---------- yval : :class:`numpy.ndarray` The input values mask : :class:`numpy.ndarray` The mask xval : :class:`numpy.ndarray`, optional If set, use these x values, otherwise use an array axis : :class:`int`, optional Must be set if yval has more than one dimension. If set to zero, interpolate along the first axis of the array, if set to one, interpolate along the second axis of the array, and so on. const : :class:`bool`, optional This value is passed to a helper function, djs_maskinterp1. Returns ------- :class:`numpy.ndarray` The interpolated array. 
""" if mask.shape != yval.shape: raise ValueError('mask must have the same shape as yval.') if xval is not None: if xval.shape != yval.shape: raise ValueError('xval must have the same shape as yval.') ndim = yval.ndim if ndim == 1: ynew = djs_maskinterp1(yval, mask, xval=xval, const=const) else: if axis is None: raise ValueError('Must set axis if yval has more than one dimension.') if axis < 0 or axis > ndim-1 or axis - int(axis) != 0: raise ValueError('Invalid axis value.') ynew = np.zeros(yval.shape, dtype=yval.dtype) if ndim == 2: if xval is None: if axis == 0: for i in range(yval.shape[0]): ynew[i, :] = djs_maskinterp1(yval[i, :], mask[i, :], const=const) else: for i in range(yval.shape[1]): ynew[:, i] = djs_maskinterp1(yval[:, i], mask[:, i], const=const) else: if axis == 0: for i in range(yval.shape[0]): ynew[i, :] = djs_maskinterp1(yval[i, :], mask[i, :], xval=xval[i, :], const=const) else: for i in range(yval.shape[1]): ynew[:, i] = djs_maskinterp1(yval[:, i], mask[:, i], xval=xval[:, i], const=const) elif ndim == 3: if xval is None: if axis == 0: for i in range(yval.shape[0]): for j in range(yval.shape[1]): ynew[i, j, :] = djs_maskinterp1(yval[i, j, :], mask[i, j, :], const=const) elif axis == 1: for i in range(yval.shape[0]): for j in range(yval.shape[2]): ynew[i, :, j] = djs_maskinterp1(yval[i, :, j], mask[i, :, j], const=const) else: for i in range(yval.shape[1]): for j in range(yval.shape[2]): ynew[:, i, j] = djs_maskinterp1(yval[:, i, j], mask[:, i, j], const=const) else: if axis == 0: for i in range(yval.shape[0]): for j in range(yval.shape[1]): ynew[i, j, :] = djs_maskinterp1(yval[i, j, :], mask[i, j, :], xval=xval[i, j, :], const=const) elif axis == 1: for i in range(yval.shape[0]): for j in range(yval.shape[2]): ynew[i, :, j] = djs_maskinterp1(yval[i, :, j], mask[i, :, j], xval=xval[i, :, j], const=const) else: for i in range(yval.shape[1]): for j in range(yval.shape[2]): ynew[:, i, j] = djs_maskinterp1(yval[:, i, j], mask[:, i, j], xval=xval[:, i, j], const=const) else: raise ValueError('Unsupported number of dimensions.') return ynew pydl-0.7.0/pydl/file_lines.py0000644000076500000240000000336013434104050016531 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- def file_lines(path, compress=False): """Replicates the IDL ``FILE_LINES()`` function. Given a path to a file name or a list of such paths, returns the number of lines in the file(s). Parameters ---------- path : :class:`str` or :class:`list` of :class:`str` Path to a file. Can be a list of paths. compress : :class:`bool`, optional If set to ``True``, assumes that all files in `path` are GZIP compressed. Returns ------- :class:`int` or :class:`list` of :class:`int` The number of lines in `path`. Returns a list of lengths if a list of files is supplied. Notes ----- The ``/NOEXPAND_PATH`` option in IDL's ``FILE_LINES()`` is not implemented. References ---------- http://www.harrisgeospatial.com/docs/file_lines.html Examples -------- >>> from pydl import file_lines >>> from os.path import dirname, join >>> file_lines(join(dirname(__file__),'tests','t','this-file-contains-42-lines.txt')) 42 """ from six import string_types scalar = False if isinstance(path, string_types): working_path = [path] scalar = True else: working_path = path lines = list() for filename in working_path: if compress: # # gzip in Python 2.6 can't use a context manager. 
# import gzip f = gzip.open(filename) lines.append(len(f.readlines())) f.close() else: with open(filename) as f: lines.append(len(f.readlines())) if scalar: return lines[0] else: return lines pydl-0.7.0/pydl/median.py0000644000076500000240000000553013434104050015656 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- def median(array, width=None, axis=None, even=False): """Replicate the IDL ``MEDIAN()`` function. Parameters ---------- array : array-like Compute the median of this array. width : :class:`int`, optional Size of the neighborhood in which to compute the median (*i.e.*, perform median filtering). If omitted, the median of the whole array is returned. axis : :class:`int`, optional Compute the median over this axis for a multi-dimensional array. If ommitted, the median over the entire array will be returned. If set, this function will behave as though `even` is ``True``. even : :class:`bool`, optional If set to ``True``, the median of arrays with an even number of elements will be the average of the middle two values. Returns ------- array-like The median of the array. Raises ------ :exc:`ValueError` If `width` is set, and the input `array` is not 1 or 2 dimensional. Notes ----- * For arrays with an even number of elements, the :func:`numpy.median` function behaves like ``MEDIAN(array, /EVEN)``, so the absence of the `even` keyword has to turn *off* that behavior. * For median filtering, this uses :func:`scipy.signal.medfilt` and :func:`scipy.signal.medfilt2d` under the hood, but patches up the values on the array boundaries to match the return values of the IDL ``MEDIAN()`` function. """ import numpy as np from scipy.signal import medfilt, medfilt2d if width is None: if axis is None: f = array.flatten() if f.size % 2 == 1 or even: return np.median(array) else: i = f.argsort() return f[i[f.size//2]] else: return np.median(array, axis=axis) else: if array.ndim == 1: medarray = medfilt(array, min(width, array.size)) istart = int((width - 1)/2) iend = array.size - int((width + 1)/2) i = np.arange(array.size) w = (i < istart) | (i > iend) medarray[w] = array[w] return medarray elif array.ndim == 2: medarray = medfilt2d(array, min(width, array.size)) istart = int((width-1)/2) iend = (array.shape[0] - int((width+1)/2), array.shape[1] - int((width+1)/2)) i = np.arange(array.shape[0]) j = np.arange(array.shape[1]) w = ((i < istart) | (i > iend[0]), (j < istart) | (j > iend[1])) medarray[w[0], :] = array[w[0], :] medarray[:, w[1]] = array[:, w[1]] return medarray else: raise ValueError("Invalid number of dimensions for input array!") pydl-0.7.0/pydl/smooth.py0000644000076500000240000000302113434104050015723 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- def smooth(signal, owidth, edge_truncate=False): """Replicates the IDL ``SMOOTH()`` function. Parameters ---------- signal : array-like The array to be smoothed. owidth : :class:`int` or array-like Width of the smoothing window. Can be a scalar or an array with length equal to the number of dimensions of `signal`. edge_truncate : :class:`bool`, optional Set `edge_truncate` to ``True`` to apply smoothing to all points. Points near the edge are normally excluded from smoothing. Returns ------- array-like A smoothed array with the same dimesions and type as `signal`. 
References ---------- http://www.harrisgeospatial.com/docs/smooth.html Examples -------- """ if owidth % 2 == 0: width = owidth + 1 else: width = owidth if width < 3: return signal n = signal.size istart = int((width-1)/2) iend = n - int((width+1)/2) w2 = int(width/2) s = signal.copy() for i in range(n): if i < istart: if edge_truncate: s[i] = (signal[0:istart+i+1].sum() + (istart-i)*signal[0])/float(width) elif i > iend: if edge_truncate: s[i] = (signal[i-istart:n].sum() + (i-iend)*signal[n-1])/float(width) else: s[i] = signal[i-w2:i+w2+1].sum()/float(width) return s pydl-0.7.0/pydl/photoop/0000755000076500000240000000000013434104632015542 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/photoop/window.py0000644000076500000240000002555013434104050017424 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the window directory in photoop. """ import os import time from warnings import warn import numpy as np from astropy import log from astropy.io import fits from . import PhotoopException from .sdssio import sdss_name, sdss_calib from ..pydlutils.mangle import set_use_caps from ..pydlutils.sdss import sdss_flagval def sdss_score(flist, silent=True): """Score a list of imaging fields from zero to one. Parameters ---------- flist : :class:`~astropy.io.fits.HDUList` Opened FITS file. silent : :class:`bool`, optional If ``False``, print extra information. Returns ------- :class:`numpy.ndarray` A vector of scores, one for each row of the FITS file. """ lat = 32.780361 lng = 360.0 - 105.820417 tzone = 7 scores = 1 # # Read the PHOTO status bits # if not silent: log.info('Setting PHOTO status bits') t1 = time.time() nlist = flist[1].header.get('NAXIS2') fdata = flist[1].data for k in range(nlist): if not silent and ((k % 1000) == 0): log.info("Setting PHOTO status {0:d} {1:d}".format(k, nlist)) thisfile = sdss_name('fpFieldStat', fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k]) try: fpfield = fits.open(thisfile) except IOError: warn("Retrying fpFieldStat file for RUN = {0:d} CAMCOL = {1:d} FIELD = {2:d} RERUN = {3}".format(fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k])) try: fpfield = fits.open(thisfile) except IOError: warn("Bad fpFieldStat file for RUN = {0:d} CAMCOL = {1:d} FIELD = {2:d} RERUN = {3}".format(fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k])) fdata.field('PHOTO_STATUS')[k] = -1 if not silent: log.info('Trying tsField instead.') thisfile = sdss_name('tsField', fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k]) try: tsfield = fits.open(thisfile) except IOError: warn('Bad tsField file.') else: if not silent: log.info('tsField found, using frames_status.') fdata.field('PHOTO_STATUS')[k] = tsfield[1].data.field('frames_status')[0] else: fdata.field('PHOTO_STATUS')[k] = fpfield[1].data.field('status')[0] else: fdata.field('PHOTO_STATUS')[k] = fpfield[1].data.field('status')[0] if not silent: log.info("Time to set PHOTO status = {0:f} sec".format(time.time()-t1)) # # Read in the PSP status # if not silent: log.info('Setting PSP status bits') t2 = time.time() for k in range(nlist): if not silent and ((k % 1000) == 0): log.info("Setting PSP status {0:d} {1:d}".format(k, nlist)) thisfile = sdss_name('psField', fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], 
fdata.field('RERUN')[k]) try: psfield = fits.open(thisfile) except IOError: warn("Retrying psField file for RUN = {0:d} CAMCOL = {1:d} FIELD = {2:d} RERUN = {3}".format(fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k])) try: psfield = fits.open(thisfile) except IOError: warn("Bad psField file for RUN = {0:d} CAMCOL = {1:d} FIELD = {2:d} RERUN = {3}".format(fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k])) fdata.field('PSP_STATUS')[k] = -1 fdata.field('PSF_FWHM')[k] = -1 fdata.field('SKYFLUX')[k] = -1 pixscale = 0.396 * np.sqrt(fdata.field('XBIN')[k]**2 + fdata.field('YBIN')[k]**2)/np.sqrt(2.0) calibinfo = sdss_calib(fdata.field('RUN')[k], fdata.field('CAMCOL')[k], fdata.field('FIELD')[k], fdata.field('RERUN')[k], **kwargs) fdata.field('PSP_STATUS')[k] = psfield[6].data.field('status')[0] fdata.field('PSF_FWHM')[k] = psfield[6].data.field('psf_width')[0] fdata.field('SKYFLUX')[k] = (psfield[6].data.field('sky')[0] * calibinfo['NMGYPERCOUNT']/pixscale**2) if not silent: log.info("Time to set PSP status = {0:f} sec".format(time.time()-t2)) # # Decide if each field exists in all 5 bands. # bad_bits = sdss_flagval('image_status', ['bad_rotator', 'bad_astrom', 'bad_focus', 'shutters']) if 'ignoreframesstatus' in kwargs: ignoreframesstatus = np.zeros(fdata.field('PHOTO_STATUS').shape) == 0 else: ignoreframesstatus = np.zeros(fdata.field('PHOTO_STATUS').shape) == 1 qexist = (fdata.field('PHOTO_STATUS') == 0) | ignoreframesstatus for k in range(5): qexist &= (fdata.field('IMAGE_STATUS')[:, k] & bad_bits) == 0 # # Decide if each field is phtometric in all 5 bands. # unphot_bits = sdss_flagval('image_status', ['cloudy', 'unknown', 'ff_petals', 'dead_ccd', 'noisy_ccd']) qphot = fdata.field('SUN_ANGLE') < -12 for k in range(5): qphot &= (fdata.field('IMAGE_STATUS')[:, k] & unphot_bits) == 0 for k in range(5): qphot &= (((fdata.field('PSP_STATUS')[:, k] & 31) <= 2) | (fdata.field('XBIN') > 1) | ignoreframesstatus) # # Now set the score for each field # sensitivity = (0.7 / (fdata.field('PSF_FWHM')[:, 2] * np.sqrt(fdata.field('SKYFLUX')[:, 2]))) < 0.4 fdata.field('SCORE')[:] = qexist * (0.1 + 0.5*qphot + sensitivity) ibinned = np.find(fdata.field('XBIN') > 1) if len(ibinned) > 0: fdata.field('SCORE')[ibinned] *= 0.1 # # Look for any NaN values, which could happen for example if there # is a corrupted psField file and PSF_FWHM or SKYFLUX was negative. # ibad = np.find(~np.isfinite(fdata.field('SCORE'))) if len(ibad) > 0: warn("Changing NaN scores for {0:d} fields to zero.".format(len(ibad))) fdata.field('SCORE')[ibad] = 0 return fdata.field('SCORE') def window_read(**kwargs): """Read window files in $PHOTO_RESOLVE. """ try: resolve_dir = os.environ['PHOTO_RESOLVE'] except KeyError: raise PhotoopException(('You have not set the environment variable ' + 'PHOTO_RESOLVE!')) if 'silent' not in kwargs: kwargs['silent'] = True r = dict() if 'flist' in kwargs: if 'rescore' in kwargs: flist_file = os.path.join(resolve_dir, 'window_flist_rescore.fits') if not os.path.exists(rescore_file): # # This will be called if window_flist_rescore.fits doesn't exist. 
# window_score() else: flist_file = os.path.join(resolve_dir, 'window_flist.fits') with fits.open(rescore_file) as fit: r['flist'] = fit[1].data if 'blist' in kwargs or 'balkans' in kwargs: blist_file = os.path.join(resolve_dir, 'window_blist.fits') with fits.open(balkan_file) as fit: r['blist'] = fit[1].data if 'bcaps' in kwargs or 'balkans' in kwargs: bcaps_file = os.path.join(resolve_dir, 'window_bcaps.fits') with fits.open(bcaps_file) as fit: r['bcaps'] = fit[1].data if 'findx' in kwargs: findx_file = os.path.join(resolve_dir, 'window_findx.fits') with fits.open(findx_file) as fit: r['findx'] = fit[1].data if 'bindx' in kwargs: bindx_file = os.path.join(resolve_dir, 'window_bindx.fits') with fits.open(bindx_file) as fit: r['bindx'] = fit[1].data if 'balkans' in kwargs: # # Copy blist data to balkans # r['balkans'] = r['blist'].copy() r['balkans']['caps'] = {'X': list(), 'CM': list()} r['balkans']['use_caps'] = np.zeros(r['balkans']['ICAP'].shape, dtype=np.uint64) if 'blist' not in kwargs: del r['blist'] # # Copy bcaps data into balkans # for k in range(len(r['balkans']['ICAP'])): r['balkans']['caps']['X'].append(r['bcaps']['X'][balkans['ICAP'][k]:balkans['ICAP'][k]+balkans['NCAPS'][k]]) r['balkans']['caps']['CM'].append(r['bcaps']['CM'][balkans['ICAP'][k]:balkans['ICAP'][k]+balkans['NCAPS'][k]]) r['balkans']['use_caps'][k] = set_use_caps( r['balkans']['caps']['X'][k], r['balkans']['caps']['CM'][k], r['balkans']['use_caps'][k], allow_doubles=True) if 'bcaps' not in kwargs: del r['bcaps'] return r def window_score(**kwargs): """For uber-resolve, score all the fields from zero to one. If 'rescore' is set, then write a new file 'window_flist_rescore.fits' rather than over-writing the file 'window_flist.fits' """ # # Be certain not to use global calibrations # try: calib_dir_save = os.environ['PHOTO_CALIB'] except KeyError: raise PhotoopException(('You have not set the environment variable ' + 'PHOTO_CALIB!')) del os.environ['PHOTO_CALIB'] # # Read the file # try: resolve_dir = os.environ['PHOTO_RESOLVE'] except KeyError: raise PhotoopException(('You have not set the environment variable ' + 'PHOTO_RESOLVE!')) filename = os.path.join(resolve_dir, 'window_flist.fits') if 'rescore' in kwargs: fitsmode = 'readonly' else: fitsmode = 'update' try: flist = fits.open(filename, mode=fitsmode) except OSError: raise PhotoopException('Unable to read FLIST file.') # # Construct the scores filling in the values to FLIST.SCORE # flist.field('SCORE')[:] = sdss_score(flist) if 'rescore' in kwargs: flist.writeto(os.path.join(resolve_dir, 'window_flist_rescore.fits')) flist.close() # # Restore the PHOTO_CALIB variable # os.environ['PHOTO_CALIB'] = calib_dir_save return pydl-0.7.0/pydl/photoop/tests/0000755000076500000240000000000013434104632016704 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/photoop/tests/test_sdssio.py0000644000076500000240000001343212671553447021642 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import os import numpy as np from astropy.tests.helper import raises from ..sdssio import (filtername, filternum, sdss_calib, sdss_name, sdss_path, sdssflux2ab) class TestSDSSio(object): """Test the functions in pydl.photoop.sdssio. """ # # Set up environment variables for testing sdss_path and sdss_name. 
# path_envvars = ('PHOTO_CALIB', 'PHOTO_DATA', 'BOSS_PHOTOOBJ', 'PHOTO_REDUX', 'PHOTO_RESOLVE', 'PHOTO_SKY', 'PHOTO_SWEEP') env_cache = {} path_data = { 'apObj': "/PHOTO_REDUX/301/137/objcs/4", 'calibMatch': "/PHOTO_REDUX/301/137/nfcalib", 'calibPhotom': "/PHOTO_REDUX/301/137/nfcalib", 'calibPhotomGlobal': "/PHOTO_CALIB/301/137/nfcalib", 'fakeIdR': "/PHOTO_DATA/137/fake_fields/4", 'fpAtlas': "/PHOTO_REDUX/301/137/objcs/4", 'fpBIN': "/PHOTO_REDUX/301/137/objcs/4", 'fpC': "/PHOTO_REDUX/301/137/objcs/4", 'fpFieldStat': "/PHOTO_REDUX/301/137/objcs/4", 'fpM': "/PHOTO_REDUX/301/137/objcs/4", 'fpObjc': "/PHOTO_REDUX/301/137/objcs/4", 'hoggObj': "/PHOTO_REDUX/301/137/objcs/4", 'idFF': "/PHOTO_REDUX/301/137/objcs/4", 'idR': "/PHOTO_DATA/137/fields/4", 'idRR': "/PHOTO_DATA/137/fields/4", 'psBB': "/PHOTO_REDUX/301/137/objcs/4", 'psFF': "/PHOTO_REDUX/301/137/objcs/4", 'psField': "/PHOTO_REDUX/301/137/objcs/4", 'reObjGlobal': "/PHOTO_RESOLVE/301/137/resolve/4", 'reObjRun': "/PHOTO_REDUX/301/137/resolve/4", 'reObjTmp': "/PHOTO_RESOLVE/301/137/resolve/4", 'tsField': "/PHOTO_REDUX/301/137/calibChunks/4", } name_data = { 'apObj': "apObj-000137-r4-0042.fit", 'calibMatch': "calibMatch-000137-4.fits", 'calibPhotom': "calibPhotom-000137-4.fits", 'calibPhotomGlobal': "calibPhotomGlobal-000137-4.fits", 'fakeIdR': "idR-000137-r4-0042.fit", 'fpAtlas': "fpAtlas-000137-4-0042.fit", 'fpBIN': "fpBIN-000137-r4-0042.fit", 'fpC': "fpC-000137-r4-0042.fit", 'fpFieldStat': "fpFieldStat-000137-4-0042.fit", 'fpM': "fpM-000137-r4-0042.fit", 'fpObjc': "fpObjc-000137-4-0042.fit", 'hoggObj': "hoggObj-000137-4-0042.fits", 'idFF': "idFF-000137-r4.fit", 'idR': "idR-000137-r4-0042.fit", 'idRR': "idRR-000137-r4-0042.fit", 'psBB': "psBB-000137-r4-0042.fit", 'psFF': "psFF-000137-r4.fit", 'psField': "psField-000137-4-0042.fit", 'reObjGlobal': "reObjGlobal-000137-4-0042.fits", 'reObjRun': "reObjRun-000137-4-0042.fits", 'reObjTmp': "reObjTmp-000137-4-0042.fits", 'tsField': "tsField-000137-4-301-0042.fit", } def setup(self): for e in self.path_envvars: try: actual_env = os.environ[e] except KeyError: actual_env = None self.env_cache[e] = actual_env os.environ[e] = '/' + e return def teardown(self): for e in self.env_cache: if self.env_cache[e] is None: del os.environ[e] else: os.environ[e] = self.env_cache[e] return def test_filtername(self): assert filtername(0) == 'u' assert filtername(1) == 'g' assert filtername(2) == 'r' assert filtername(3) == 'i' assert filtername(4) == 'z' # # filtername should return its argument if it's not # integer-like # assert filtername('r') == 'r' def test_filternum(self): assert filternum('u') == 0 assert filternum('g') == 1 assert filternum('r') == 2 assert filternum('i') == 3 assert filternum('z') == 4 # # Test default return value # fn = filternum() for k in range(5): assert fn[k] == k def test_sdss_calib(self): foo = sdss_calib(94, 6, 101) assert foo['NMGYPERCOUNT'] == 1.0 def test_sdss_name(self): # # Bad ftype # with raises(KeyError): p = sdss_name('fooBar', 137, 4, 42) for ftype in self.name_data: assert sdss_name(ftype, 137, 4, 42, '301', 'r', no_path=True) == self.name_data[ftype] def test_sdss_name_with_path(self): for ftype in self.name_data: assert (sdss_name(ftype, 137, 4, 42, '301', 'r') == os.path.join(self.path_data[ftype], self.name_data[ftype])) def test_sdss_name_reObj(self): assert sdss_name('reObj', 137, 4, 42, '301', 'r', no_path=True) == self.name_data['reObjGlobal'] resolve = os.environ['PHOTO_RESOLVE'] del os.environ['PHOTO_RESOLVE'] assert sdss_name('reObj', 137, 4, 42, '301', 'r', 
no_path=True) == self.name_data['reObjRun'] os.environ['PHOTO_RESOLVE'] = resolve def test_sdss_path(self): with raises(KeyError): p = sdss_path('fooBar', 137) for ftype in self.path_data: assert sdss_path(ftype, 137, 4, '301') == self.path_data[ftype] def test_sdssflux2ab(self): correction = np.array([-0.042, 0.036, 0.015, 0.013, -0.002]) mags = np.zeros((2, 5), dtype='d') mags[0, :] = 18.0 mags[1, :] = 19.0 ab = sdssflux2ab(mags, magnitude=True) assert (ab == (mags + correction)).all() flux = 10**((22.5 - mags)/2.5) # nanomaggies ab = sdssflux2ab(flux) assert np.allclose(ab, 10**((22.5 - (mags + correction))/2.5)) ivar = 1.0/flux ab = sdssflux2ab(ivar, ivar=True) assert np.allclose(ab, ivar/(10**(-2.0*correction/2.5))) pydl-0.7.0/pydl/photoop/tests/__init__.py0000644000076500000240000000021212632466352021021 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This is the pydl/photoop/tests directory. """ pydl-0.7.0/pydl/photoop/tests/test_photoop.py0000644000076500000240000000271113434104050022000 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np from astropy.tests.helper import raises from ..photoobj import sdss_calibv, unwrap_objid class TestPhotoobj(object): """Test the functions in pydl.photoop.photoobj. """ def setup(self): pass def teardown(self): pass def test_sdss_calibv(self): assert np.allclose(0.2650306748466258, sdss_calibv().value) def test_unwrap_objid(self): objid = unwrap_objid(np.array([1237661382772195474])) assert objid.skyversion == 2 assert objid.rerun == 301 assert objid.run == 3704 assert objid.camcol == 3 assert objid.firstfield == 0 assert objid.frame == 91 assert objid.id == 146 objid = unwrap_objid(np.array(['1237661382772195474'])) assert objid.skyversion == 2 assert objid.rerun == 301 assert objid.run == 3704 assert objid.camcol == 3 assert objid.firstfield == 0 assert objid.frame == 91 assert objid.id == 146 objid = unwrap_objid(np.array([587722984180548043])) assert objid.skyversion == 1 assert objid.rerun == 40 assert objid.run == 752 assert objid.camcol == 5 assert objid.firstfield == 1 assert objid.frame == 618 assert objid.id == 459 with raises(ValueError): objid = unwrap_objid(np.array([3.14159])) pydl-0.7.0/pydl/photoop/tests/test_window.py0000644000076500000240000000100112671556620021626 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import os import numpy as np from astropy.tests.helper import raises from ..window import sdss_score, window_read, window_score class TestWindow(object): """Test the functions in pydl.photoop.window. """ def setup(self): pass def teardown(self): pass def test_sdss_score(self): pass def test_window_read(self): pass def test_window_score(self): pass pydl-0.7.0/pydl/photoop/__init__.py0000644000076500000240000000071312671060562017661 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This subpackage implements functions from the photoop package. """ # # Define this early on so that submodules can use it # from .. import PydlException class PhotoopException(PydlException): """Exceptions raised by :mod:`pydl.photoop` that don't fit into a standard exception class like :exc:`ValueError`. 
""" pass __all__ = ['PhotoopException'] pydl-0.7.0/pydl/photoop/photoobj.py0000644000076500000240000000615513434104050017741 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the photoobj directory in photoop. """ import numpy as np import astropy.units as u def sdss_calibv(): """Return calibration for velocities from pix/frame to deg/day. Returns ------- :class:`~astropy.units.quanity.Quantity` The conversion from pixels per frame to degrees per day Notes ----- Assumes frame time difference of 71.72 seconds and pixel scale of 0.396 arcsec, both fixed. Also note that observations of the same part of sky from adjacent bands are separated by *two* frame numbers, so we multiply by a factor two. """ pixscale = 0.396 * u.arcsec ftime = 71.72 * u.s return (2.0*pixscale/ftime).to('deg / d') def unwrap_objid(objid): """Unwrap CAS-style objID into run, camcol, field, id, rerun. See :func:`~pydl.pydlutils.sdss.sdss_objid` for details on how the bits within an objID are assigned. Parameters ---------- objid : :class:`numpy.ndarray` An array containing 64-bit integers or strings. If strings are passed, they will be converted to integers internally. Returns ------- :class:`numpy.recarray` A record array with the same length as `objid`, with the columns 'skyversion', 'rerun', 'run', 'camcol', 'firstfield', 'frame', 'id'. Raises ------ :exc:`ValueError` If the input objID has a type that can't be converted into 64-bit integer. Notes ----- For historical reasons, the inverse of this function, :func:`~pydl.pydlutils.sdss.sdss_objid` is not in the same namespace as this function. 'frame' is used instead of 'field' because record arrays have a method of the same name. Examples -------- >>> from numpy import array >>> from pydl.photoop.photoobj import unwrap_objid >>> unwrap_objid(array([1237661382772195474])) rec.array([(2, 301, 3704, 3, 0, 91, 146)], dtype=[('skyversion', '> 59, 2**4 - 1) unwrap.rerun = np.bitwise_and(tempobjid >> 48, 2**11 - 1) unwrap.run = np.bitwise_and(tempobjid >> 32, 2**16 - 1) unwrap.camcol = np.bitwise_and(tempobjid >> 29, 2**3 - 1) unwrap.firstfield = np.bitwise_and(tempobjid >> 28, 2**1 - 1) unwrap.frame = np.bitwise_and(tempobjid >> 16, 2**12 - 1) unwrap.id = np.bitwise_and(tempobjid, 2**16 - 1) return unwrap pydl-0.7.0/pydl/photoop/sdssio.py0000644000076500000240000002137413434104050017421 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the sdssio directory of photoop. 
""" import os import numpy as np from six import string_types # # Filename formats used by sdss_name and sdss_path # _name_formats = { 'apObj': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'calibMatch': "{ftype}-{run:06d}-{camcol:1d}.fits", 'calibPhotom': "{ftype}-{run:06d}-{camcol:1d}.fits", 'calibPhotomGlobal': "{ftype}-{run:06d}-{camcol:1d}.fits", 'fakeIdR': "idR-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'fpAtlas': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fit", 'fpBIN': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'fpC': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'fpFieldStat': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fit", 'fpM': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'fpObjc': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fit", 'hoggObj': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fits", 'idFF': "{ftype}-{run:06d}-{filter}{camcol:1d}.fit", 'idR': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'idRR': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'psBB': "{ftype}-{run:06d}-{filter}{camcol:1d}-{field:04d}.fit", 'psFF': "{ftype}-{run:06d}-{filter}{camcol:1d}.fit", 'psField': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fit", 'reObjGlobal': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fits", 'reObjRun': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fits", 'reObjTmp': "{ftype}-{run:06d}-{camcol:1d}-{field:04d}.fits", 'tsField': "{ftype}-{run:06d}-{camcol:1d}-{rerun}-{field:04d}.fit", } _path_formats = { 'apObj': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'calibMatch': "{redux}/{rerun}/{run:d}/nfcalib", 'calibPhotom': "{redux}/{rerun}/{run:d}/nfcalib", 'calibPhotomGlobal': "{calib}/{rerun}/{run:d}/nfcalib", 'fakeIdR': "{data}/{run:d}/fake_fields/{camcol:1d}", 'fpAtlas': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'fpBIN': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'fpC': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'fpFieldStat': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'fpM': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'fpObjc': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'hoggObj': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'idFF': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'idR': "{data}/{run:d}/fields/{camcol:1d}", 'idRR': "{data}/{run:d}/fields/{camcol:1d}", 'psBB': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'psFF': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'psField': "{redux}/{rerun}/{run:d}/objcs/{camcol:1d}", 'reObjGlobal': "{resolve}/{rerun}/{run:d}/resolve/{camcol:1d}", 'reObjRun': "{redux}/{rerun}/{run:d}/resolve/{camcol:1d}", 'reObjTmp': "{resolve}/{rerun}/{run:d}/resolve/{camcol:1d}", 'tsField': "{redux}/{rerun}/{run:d}/calibChunks/{camcol:1d}", } def filtername(f): """Return the name of a filter given its number. Parameters ---------- f : :class:`int` The filter number. Returns ------- :class:`str` The corresponding filter name. Examples -------- >>> filtername(0) 'u' """ if isinstance(f, string_types): return f fname = ('u', 'g', 'r', 'i', 'z') return fname[f] def filternum(filt='foo'): """Return index number for SDSS filters either from a number or name. Parameters ---------- filt : :class:`str` The filter name. Returns ------- :class:`int` The corresponding filter number Raises ------ :exc:`KeyError` If `filt` is not a valid filter name. 
Examples -------- >>> filternum('g') 1 """ if filt == 'foo': return list(range(5)) else: filters = {'u': 0, 'g': 1, 'r': 2, 'i': 3, 'z': 4} return filters[filt] def sdss_calib(run, camcol, field, rerun='', **kwargs): """Read photometric calibration solutions from calibPhotom or calibPhotomGlobal files. Parameters ---------- run : :class:`int` Photo run number camcol : :class:`int` Camcol number field : :class:`int` Field number rerun : :class:`str`, optional Photometric reduction number, as a string. Returns ------- :class:`dict` A dictionary containing the 'NMGYPERCOUNT' keyword. Notes ----- Currently, this is just a placeholder. """ return {'NMGYPERCOUNT': 1.0} def sdss_name(ftype, run, camcol, field, rerun='', thisfilter='r', no_path=False): """Return the name of an SDSS data file including path. Parameters ---------- ftype : :class:`str` The general type of the file, for example ``'reObj'`` run : :class:`int` The run number. camcol : :class:`int` The camcol number. field : :class:`int` The field number rerun : :class:`str`, optional If necessary, set the rerun name using this argument. thisfilter : :class:`int` or :class:`str`, optional If necessary, set the filter using this argument. no_path : :class:`bool`, optional Normally, sdss_name returns the full path. If `no_path` is ``True``, only the basename of the file is returned. Returns ------- :class:`str` The full file name, normally including the full path. Raises ------ :exc:`KeyError` If the file type is unknown. """ if ftype == 'reObj': if 'PHOTO_RESOLVE' in os.environ: myftype = 'reObjGlobal' else: myftype = 'reObjRun' else: myftype = ftype thisfilter = filtername(thisfilter) indict = {'ftype': myftype, 'run': run, 'camcol': camcol, 'field': field, 'filter': thisfilter, 'rerun': rerun} try: fullname = _name_formats[myftype].format(**indict) except KeyError: raise KeyError("Unknown FTYPE = {0}".format(myftype)) if not no_path: datadir = sdss_path(myftype, run, camcol, rerun) fullname = os.path.join(datadir, fullname) return fullname def sdss_path(ftype, run, camcol=0, rerun=''): """Return the path name for SDSS data assuming SAS directory structure. Parameters ---------- ftype : :class:`str` The general type of the file, for example ``'reObj'`` run : :class:`int` The run number. camcol : :class:`int`, optional If necessary, set the camcol number using this argument. rerun : :class:`str`, optional If necessary, set the rerun name using this argument. Returns ------- :class:`str` The directory in which file `ftype` lives. Raises ------ :exc:`KeyError` If the file type is unknown. """ indict = { 'run': run, 'camcol': camcol, 'rerun': rerun, 'calib': os.getenv('PHOTO_CALIB'), 'data': os.getenv('PHOTO_DATA'), 'photoobj': os.getenv('BOSS_PHOTOOBJ'), 'redux': os.getenv('PHOTO_REDUX'), 'resolve': os.getenv('PHOTO_RESOLVE'), 'sky': os.getenv('PHOTO_SKY'), 'sweep': os.getenv('PHOTO_SWEEP'), } try: datadir = _path_formats[ftype].format(**indict) except KeyError: raise KeyError("Unknown FTYPE = {0}".format(ftype)) return datadir def sdssflux2ab(flux, magnitude=False, ivar=False): """Convert the SDSS calibrated fluxes (magnitudes) into AB fluxes (magnitudes). Parameters ---------- flux : :class:`numpy.ndarray` Array of calibrated fluxes or SDSS magnitudes with 5 columns, corresponding to the 5 filters *u*, *g*, *r*, *i*, *z*. magnitude : :class:`bool`, optional If set to ``True``, then assume `flux` are SDSS magnitudes instead of linear flux units. 
ivar : :class:`numpy.ndarray`, optional If set, the input fluxes are actually inverse variances. Returns ------- :class:`numpy.ndarray` Array of fluxes or magnitudes on the AB system. Notes ----- Uses the conversions posted by D.Hogg (sdss-calib/845):: u(AB,2.5m) = u(2.5m) - 0.042 g(AB,2.5m) = g(2.5m) + 0.036 r(AB,2.5m) = r(2.5m) + 0.015 i(AB,2.5m) = i(2.5m) + 0.013 z(AB,2.5m) = z(2.5m) - 0.002 """ # # Correction vector, adjust this as necessary # correction = np.array([-0.042, 0.036, 0.015, 0.013, -0.002]) rows, cols = flux.shape abflux = flux.copy() if magnitude: for i in range(rows): abflux[i, :] += correction else: factor = 10.0**(-correction/2.5) if ivar: factor = 1.0/factor**2 for i in range(rows): abflux[i, :] *= factor return abflux pydl-0.7.0/pydl/goddard/0000755000076500000240000000000013434104632015456 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/goddard/misc.py0000644000076500000240000000213013434104050016751 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the goddard/misc directory in idlutils. """ from __future__ import absolute_import # Needed for Python 2 because there is a math.py module in the same directory. from math import pi def cirrange(ang, radians=False): """Convert an angle larger than 360 degrees to one less than 360 degrees. Parameters ---------- ang : :class:`float` or array-like Angle to convert. If the angle is in radians, the `radians` argument should be set. radians : class:`bool`, optional If ``True``, the input angle is in radians, and the output will be between zero and 2π. Returns ------- :class:`float` or array-like Angle in the restricted range. Examples -------- >>> from pydl.goddard.misc import cirrange >>> cirrange(-270.0) 90.0 """ if radians: cnst = pi * 2.0 else: cnst = 360.0 # # The modulo operator automatically deals with negative values # return ang % cnst pydl-0.7.0/pydl/goddard/tests/0000755000076500000240000000000013434104632016620 5ustar weaverstaff00000000000000pydl-0.7.0/pydl/goddard/tests/__init__.py0000644000076500000240000000021212632466352020735 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This is the pydl/goddard/tests directory. """ pydl-0.7.0/pydl/goddard/tests/test_goddard.py0000644000076500000240000001454113301642157021644 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- import numpy as np import astropy.units as u from astropy.tests.helper import raises from ..astro import airtovac, gcirc, get_juldate, vactoair from ..math import flegendre from ..misc import cirrange class TestGoddard(object): """Test the goddard package. """ def setup(self): pass def teardown(self): pass def test_airtovac(self): vacuum = airtovac(1900.0) assert vacuum == 1900.0 vacuum = airtovac(2000.0) assert np.allclose(vacuum, 2000.6475) air = np.array([1800.0, 1900.0, 2000.0, 2100.0, 2200.0, 2300.0]) vacuum = airtovac(air) assert np.allclose(vacuum, np.array([1800.0, 1900.0, 2000.6475, 2100.6664, 2200.6868, 2300.7083])) vacuum = airtovac(6056.125) assert np.allclose(vacuum, 6057.8019) vacuum = airtovac(np.array([1800.0, 1850.0, 1900.0])) assert np.allclose(vacuum, np.array([1800.0, 1850.0, 1900.0])) # # Regression test for #8. # wave = air.reshape(2, 3) vacuum = airtovac(wave) assert np.allclose(vacuum, np.array([[1800.0, 1900.0, 2000.6475], [2100.6664, 2200.6868, 2300.7083]])) # # Test with units. 
# air = air * u.Angstrom vacuum = airtovac(air) assert np.allclose(vacuum.value, np.array([1800.0, 1900.0, 2000.6475, 2100.6664, 2200.6868, 2300.7083])) assert vacuum.unit is u.Angstrom vacuum = airtovac(air.to(u.nm)) # # Due to numeric funkiness, 2000 -> 1999.999999999 < 2000. # assert np.allclose(vacuum.value, np.array([180.0, 190.0, 200.0, 210.06664, 220.06868, 230.07083])) assert vacuum.unit is u.nm def test_cirrange(self): ra1 = np.linspace(-4.0*np.pi, 4.0*np.pi, 100) ra2 = cirrange(ra1, radians=True) assert (ra2 == (ra1 % (2.0 * np.pi))).all() ra1 = np.rad2deg(ra1) ra2 = cirrange(ra1) assert (ra2 == (ra1 % 360.0)).all() def test_flegendre(self): x = np.array([-1, -0.5, 0, 0.5, 1], dtype='d') # # Test order # with raises(ValueError): f = flegendre(x, 0) # # m = 1 # f = flegendre(x, 1) assert (f == np.ones((1, x.size), dtype='d')).all() # # m = 2 # f = flegendre(x, 2) foo = np.ones((2, x.size), dtype='d') foo[1, :] = x assert np.allclose(f, foo) # # m = 3 # f = flegendre(x, 3) foo = np.ones((3, x.size), dtype='d') foo[1, :] = x foo[2, :] = 0.5 * (3.0*x**2 - 1.0) assert np.allclose(f, foo) # # m = 4 # f = flegendre(x, 4) foo = np.ones((4, x.size), dtype='d') foo[1, :] = x foo[2, :] = 0.5*(3.0*x**2 - 1.0) foo[3, :] = 0.5*(5.0*x**3 - 3.0*x) assert np.allclose(f, foo) # # random float # f = flegendre(2.88, 3) assert np.allclose(f, np.array([[1.00], [2.88], [11.9416]])) def test_gcirc(self): np.random.seed(137) # # Start in radians # offset = 5.0e-6 # approx 1 arcsec ra1 = 2.0 * np.pi * np.random.rand(100) dec1 = np.pi/2.0 - np.arccos(2.0*np.random.rand(100) - 1.0) ra2 = ra1 + offset ra2 = np.where((ra2 > 2.0*np.pi), ra2 - 2.0*np.pi, ra2) dec2 = np.where((dec1 > 0), dec1 - offset, dec1 + offset) deldec2 = (dec2-dec1)/2.0 delra2 = (ra2-ra1)/2.0 sindis = np.sqrt(np.sin(deldec2) * np.sin(deldec2) + np.cos(dec1) * np.cos(dec2) * np.sin(delra2) * np.sin(delra2)) dis = 2.0*np.arcsin(sindis) # # units = 0 # d0 = gcirc(ra1, dec1, ra2, dec2, units=0) assert np.allclose(d0, dis) # # units = 2 # d0 = gcirc(np.rad2deg(ra1)/15.0, np.rad2deg(dec1), np.rad2deg(ra2)/15.0, np.rad2deg(dec2), units=1) assert np.allclose(d0, np.rad2deg(dis)*3600.0) # # units = 2 # d0 = gcirc(np.rad2deg(ra1), np.rad2deg(dec1), np.rad2deg(ra2), np.rad2deg(dec2), units=2) assert np.allclose(d0, np.rad2deg(dis)*3600.0) # # Units = whatever # with raises(ValueError): d0 = gcirc(ra1, dec1, ra2, dec2, units=5) def test_get_juldate(self): now = get_juldate() assert now > 2400000.5 assert get_juldate(0) == 40587.0 + 2400000.5 assert get_juldate(86400) == 40588.0 + 2400000.5 def test_vactoair(self): air = vactoair(1900.0) assert air == 1900.0 air = vactoair(2000.0) assert np.allclose(air, 1999.3526) vacuum = np.array([1800.0, 1900.0, 2000.0, 2100.0, 2200.0, 2300.0]) air = vactoair(vacuum) assert np.allclose(air, np.array([1800.0, 1900.0, 1999.3526, 2099.3337, 2199.3133, 2299.2918])) air = vactoair(np.array([1800.0, 1850.0, 1900.0])) assert np.allclose(air, np.array([1800.0, 1850.0, 1900.0])) # # Regression test for #8. # wave = vacuum.reshape(2, 3) air = vactoair(wave) assert np.allclose(air, np.array([[1800.0, 1900.0, 1999.3526], [2099.3337, 2199.3133, 2299.2918]])) # # Test with units. # vacuum = vacuum * u.Angstrom air = vactoair(vacuum) assert np.allclose(air.value, np.array([1800.0, 1900.0, 1999.3526, 2099.3337, 2199.3133, 2299.2918])) assert air.unit is u.Angstrom air = vactoair(vacuum.to(u.nm)) # # Due to numeric funkiness, 2000 -> 1999.999999999 < 2000. 
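        # (That is, the Angstrom -> nm -> Angstrom round trip inside
        # vactoair lands fractionally below the 2000 Angstrom cutoff, so
        # that element is left unconverted; hence the expected value of
        # 200.0, rather than ~199.935, in the array below.)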
# assert np.allclose(air.value, np.array([180.0, 190.0, 200.0, 209.93337, 219.93133, 229.92918])) assert air.unit is u.nm pydl-0.7.0/pydl/goddard/__init__.py0000644000076500000240000000022012632466352017572 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """ This subpackage contains the Goddard utilities. """ pydl-0.7.0/pydl/goddard/astro.py0000644000076500000240000001343413434104050017157 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the goddard/astro directory in idlutils. """ from __future__ import print_function from time import time import numpy as np from astropy.units import Angstrom def airtovac(air): """Convert air wavelengths to wavelengths in vacuum. Parameters ---------- air : array-like Values of wavelength in air in Angstroms. :class:`~astropy.units.Quantity` objects with valid length dimensions will be internally converted to Angstrom. Returns ------- array-like Values of wavelength in vacuum in Angstroms. If a :class:`~astropy.units.Quantity` object was passed in, the output will be converted to the same units as the input. Notes ----- * Formula from `P. E. Ciddor, Applied Optics, 35, 1566 (1996) `_. * Values of wavelength below 2000 Å are not converted. """ try: u = air.unit except AttributeError: u = None try: t = air.dtype except AttributeError: # Most likely, air is simply a float. t = None if t is None: if air < 2000.0: return air vacuum = air a = air g = None else: try: a = air.to(Angstrom).value except AttributeError: a = air g = a < 2000.0 if g.all(): return air vacuum = np.zeros(air.shape, dtype=t) + a for k in range(2): sigma2 = (1.0e4/vacuum)**2 fact = (1.0 + 5.792105e-2/(238.0185 - sigma2) + 1.67917e-3/(57.362 - sigma2)) vacuum = a * fact if g is not None: vacuum[g] = a[g] if u is not None: vacuum = (vacuum * Angstrom).to(u) return vacuum def gcirc(ra1, dec1, ra2, dec2, units=2): """Computes rigorous great circle arc distances. Parameters ---------- ra1, dec1, ra2, dec2 : :class:`float` or array-like RA and Dec of two points. units : { 0, 1, 2 }, optional * units = 0: everything is already in radians * units = 1: RA in hours, dec in degrees, distance in arcsec. * units = 2: RA, dec in degrees, distance in arcsec (default) Returns ------- :class:`float` or array-like The angular distance. Units of the value returned depend on the input value of `units`. Notes ----- The formula below is the one best suited to handling small angular separations. See: http://en.wikipedia.org/wiki/Great-circle_distance """ if units == 0: rarad1 = ra1 dcrad1 = dec1 rarad2 = ra2 dcrad2 = dec2 elif units == 1: rarad1 = np.deg2rad(15.0*ra1) dcrad1 = np.deg2rad(dec1) rarad2 = np.deg2rad(15.0*ra2) dcrad2 = np.deg2rad(dec2) elif units == 2: rarad1 = np.deg2rad(ra1) dcrad1 = np.deg2rad(dec1) rarad2 = np.deg2rad(ra2) dcrad2 = np.deg2rad(dec2) else: raise ValueError('units must be 0, 1 or 2!') deldec2 = (dcrad2-dcrad1)/2.0 delra2 = (rarad2-rarad1)/2.0 sindis = np.sqrt(np.sin(deldec2)*np.sin(deldec2) + np.cos(dcrad1)*np.cos(dcrad2)*np.sin(delra2)*np.sin(delra2)) dis = 2.0*np.arcsin(sindis) if units == 0: return dis else: return np.rad2deg(dis)*3600.0 def get_juldate(seconds=None): """Returns the current Julian date. Uses the MJD trick & adds the offset to get JD. Parameters ---------- seconds : :class:`int` or :class:`float`, optional Time in seconds since the UNIX epoch. This should only be used for testing. 
Returns ------- :class:`float` The Julian Day number as a floating point number. Notes ----- Do not use this function if high precision is required, or if you are concerned about the distinction between UTC & TAI. """ if seconds is None: t = time() else: t = seconds mjd = t/86400.0 + 40587.0 return mjd + 2400000.5 def get_juldate_main(): # pragma: no cover """Entry point for the get_juldate command-line script. """ print(get_juldate()) return 0 def vactoair(vacuum): """Convert vacuum wavelengths to wavelengths in air. Parameters ---------- vacuum : array-like Values of wavelength in vacuum in Angstroms. :class:`~astropy.units.Quantity` objects with valid length dimensions will be internally converted to Angstrom. Returns ------- array-like Values of wavelength in air in Angstroms. :class:`~astropy.units.Quantity` object was passed in, the output will be converted to the same units as the input. Notes ----- * Formula from `P. E. Ciddor, Applied Optics, 35, 1566 (1996) `_. * Values of wavelength below 2000 Å are not converted. """ try: u = vacuum.unit except AttributeError: u = None try: t = vacuum.dtype except AttributeError: # Most likely, vacuum is simply a float. t = None if t is None: if vacuum < 2000.0: return vacuum air = vacuum v = vacuum g = None else: try: v = vacuum.to(Angstrom).value except AttributeError: v = vacuum g = v < 2000.0 if g.all(): return vacuum air = np.zeros(vacuum.shape, dtype=t) + v sigma2 = (1.0e4/v)**2 fact = (1.0 + 5.792105e-2/(238.0185 - sigma2) + 1.67917e-3/(57.362 - sigma2)) air = v / fact if g is not None: air[g] = v[g] if u is not None: air = (air * Angstrom).to(u) return air pydl-0.7.0/pydl/goddard/math.py0000644000076500000240000000216713434104050016761 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- """This module corresponds to the goddard/math directory in idlutils. """ import numpy as np from scipy.special import legendre def flegendre(x, m): """Compute the first `m` Legendre polynomials. Parameters ---------- x : array-like Compute the Legendre polynomials at these abscissa values. m : :class:`int` The number of Legendre polynomials to compute. For example, if :math:`m = 3`, :math:`P_0 (x)`, :math:`P_1 (x)` and :math:`P_2 (x)` will be computed. Returns ------- :class:`numpy.ndarray` The values of the Legendre functions. """ if isinstance(x, np.ndarray): n = x.size else: n = 1 if m < 1: raise ValueError('Number of Legendre polynomials must be at least 1.') try: dt = x.dtype except AttributeError: dt = np.float64 leg = np.ones((m, n), dtype=dt) if m >= 2: leg[1, :] = x if m >= 3: for k in range(2, m): leg[k, :] = np.polyval(legendre(k), x) return leg pydl-0.7.0/pydl/rebin.py0000644000076500000240000000775613434104050015534 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst # -*- coding: utf-8 -*- from __future__ import division def rebin(x, d, sample=False): """Resize `x` to new dimensions given by `d`. The new dimensions must be integer multiples or factors of the original dimensions. Although there are some elegant solutions out there for rebinning, this function is intended to replace the IDL ``REBIN()`` function, which has a number of special properties: * It refuses to perform extrapolation when rebinning to a larger size in a particular dimension. * It can simultaneously rebin to a larger size in one dimension while rebinning to a smaller size in another dimension. 
Parameters ---------- x : :class:`~numpy.ndarray` The array to resample. d : :func:`tuple` The new shape of the array. sample : :class:`bool`, optional If ``True``, nearest-neighbor techniques will be used instead of interpolation. Returns ------- :class:`~numpy.ndarray` The resampled array. Raises ------ :exc:`ValueError` If the new dimensions are incompatible with the algorithm. References ---------- http://www.harrisgeospatial.com/docs/rebin.html Examples -------- >>> from numpy import arange, float >>> from pydl import rebin >>> rebin(arange(10, dtype=float), (5,)) # doctest: +NORMALIZE_WHITESPACE array([ 0.5, 2.5, 4.5, 6.5, 8.5]) >>> rebin(arange(5, dtype=float), (10,)) # doctest: +NORMALIZE_WHITESPACE array([ 0. , 0.5, 1. , 1.5, 2. , 2.5, 3. , 3.5, 4. , 4. ]) """ from numpy import floor, zeros d0 = x.shape if len(d0) != len(d): raise ValueError(("The new shape is incompatible with the " + "original array.!")) for k in range(len(d0)): if d[k] > d0[k]: if d[k] % d0[k] != 0: raise ValueError(("{0:d} is not a multiple " + "of {1:d}!").format(d[k], d0[k])) elif d[k] == d0[k]: pass else: if d0[k] % d[k] != 0: raise ValueError(("{0:d} is not a multiple " + "of {1:d}!").format(d0[k], d[k])) xx = x.copy() new_shape = list(d0) for k in range(len(d0)): new_shape[k] = d[k] r = zeros(new_shape, dtype=xx.dtype) sliceobj0 = [slice(None)]*len(d0) sliceobj1 = [slice(None)]*len(d0) sliceobj = [slice(None)]*len(d) f = d0[k]/d[k] if d[k] > d0[k]: for i in range(d[k]): p = f*i fp = int(floor(p)) sliceobj0[k] = slice(fp, fp + 1) sliceobj[k] = slice(i, i + 1) if sample: r[sliceobj] = xx[sliceobj0] else: if p < d0[k] - 1: sliceobj1[k] = slice(fp + 1, fp + 2) rshape = r[sliceobj].shape r[sliceobj] = (xx[sliceobj0].reshape(rshape) + (p - fp)*(xx[sliceobj1] - xx[sliceobj0]).reshape(rshape) ) else: r[sliceobj] = xx[sliceobj0] elif d[k] == d0[k]: for i in range(d[k]): sliceobj0[k] = slice(i, i + 1) sliceobj[k] = slice(i, i + 1) r[sliceobj] = xx[sliceobj0] else: for i in range(d[k]): sliceobj[k] = slice(i, i + 1) if sample: fp = int(floor(f*i)) sliceobj0[k] = slice(fp, fp + 1) r[sliceobj] = xx[sliceobj0] else: sliceobj0[k] = slice(int(f*i), int(f*(i+1))) rshape = r[sliceobj].shape r[sliceobj] = xx[sliceobj0].sum(k).reshape(rshape)/f xx = r return r pydl-0.7.0/pydl/setup_package.py0000644000076500000240000000075612672535402017255 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license # -*- coding: utf-8 -*- def get_package_data(): # Installs the testing data files. Unable to get package_data # to deal with a directory hierarchy of files, so just explicitly list. return { 'pydl.tests': ['coveragerc', 't/*'], 'pydl.pydlspec2d.tests': ['t/*'], 'pydl.pydlutils': ['data/cooling/*', 'data/filters/*'], 'pydl.pydlutils.tests': ['t/*'], } # def requires_2to3(): # return False pydl-0.7.0/licenses/0000755000076500000240000000000013434104632014707 5ustar weaverstaff00000000000000pydl-0.7.0/licenses/LICENSE.rst0000644000076500000240000000275413434104050016525 0ustar weaverstaff00000000000000Copyright (c) 2010-2018, Benjamin Alan Weaver All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 
* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.
* Neither the name of the Astropy Team nor the names of its contributors may
  be used to endorse or promote products derived from this software without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
pydl-0.7.0/licenses/README.rst0000644000076500000240000000024212135272325016376 0ustar weaverstaff00000000000000Licenses
========

This directory holds license and credit information for the affiliated
package, works the affiliated package is derived from, and/or datasets.
pydl-0.7.0/docs/0000755000076500000240000000000013434104632014032 5ustar weaverstaff00000000000000pydl-0.7.0/docs/templates.rst0000644000076500000240000000572413434104050016564 0ustar weaverstaff00000000000000======================================
SDSS Spectroscopic Templates with PyDL
======================================

One of the original motivations for creating the PyDL package was to reproduce
and, ultimately, improve the method for generating spectroscopic templates for
the Sloan Digital Sky Survey (SDSS_). The spectroscopic templates are used by
the "1D" portion of the SDSS spectroscopic pipeline (Bolton *et al.* 2012 [1]_;
see also :mod:`pydl.pydlspec2d`). Historically, those were generated by:

1. Selecting individual objects (galaxies, QSOs, stars, white dwarfs, etc.)
   with good SDSS spectra and reliable classifications.
2. For a given set of objects, resampling the spectra onto a common wavelength
   grid and shifting them to the rest frame.
3. Performing an iterative PCA analysis on the flux versus wavelength matrix.
4. Selecting the most significant eigenspectra as the templates. Typically
   the first four eigenspectra were chosen.

For SDSS-I/II, the input spectra were all high signal-to-noise, but as the
BOSS_ survey in SDSS-III started to target more distant galaxies, there was
interest in improving the robustness of the PCA fitting to handle low
signal-to-noise inputs. In addition, full alternatives to PCA, such as
Heteroscedastic Matrix Factorization (HMF) [2]_, [3]_, could be compared.

Here are the algorithmic steps taken in the code to implement the procedure
described above.

1. :func:`~pydl.pydlspec2d.spec1d.template_input` reads a configuration file
   that includes a list of individual spectra. There is one configuration
   file for each type of object (galaxy, QSO, etc.).
2. :func:`~pydl.pydlspec2d.spec1d.readspec` reads each spectrum. Masked areas
   are identified by setting the per-spectrum inverse variance to zero as
   needed.
3. A new wavelength grid is created. SDSS typically uses wavelength grids
   that are evenly spaced in :math:`\log_{10}` wavelength.
4.
:func:`~pydl.pydlspec2d.spec1d.preprocess_spectra` calls :func:`~pydl.pydlspec2d.spec2d.combine1fiber` to perform the resampling and rest frame shift. 5. :func:`~pydl.pydlspec2d.spec1d.pca_solve` or :class:`~pydl.pydlspec2d.spec1d.HMF` are used to obtain the eigenspectra. 6. The eigenspectra are written to a FITS file. At least historically, there was no test data set on which to (unit) test these algorithms. The best one could do was to take the inputs and compare them, function-by-function, to the equivalent `IDL®`_ code, ensuring that the results were the same, to some numerical precision. .. _SDSS: https://www.sdss.org .. _BOSS: https://www.sdss.org/surveys/boss/ .. _`IDL®`: http://www.harrisgeospatial.com/SoftwareTechnology/IDL.aspx .. [1] `Bolton, Adam, et al. 2012 AJ 144, 144 `_. .. [2] `Tsalmantza, P., Decarli, R., Dotti, M., Hogg, D. W., 2011 ApJ 738, 20 `_ .. [3] `Tsalmantza, P., Hogg, D. W., 2012 ApJ 753, 122 `_ pydl-0.7.0/docs/index.rst0000644000076500000240000000405113434104050015665 0ustar weaverstaff00000000000000==== PyDL ==== Introduction ++++++++++++ This package consists of Python_ replacements for functions that are part of the `IDL®`_ built-in library or part of astronomical `IDL®`_ libraries. The emphasis is on reproducing results of the astronomical library functions. Only the bare minimum of `IDL®`_ built-in functions are implemented to support this. There are four astronomical libraries targeted: * idlutils_ : a general suite of tools heavily used by SDSS_. * `Goddard utilities`_ : The `IDL®`_ Astronomy User's Libary, maintained by Wayne Landsman and distributed with idlutils_. * idlspec2d_ : tools for working with SDSS_, BOSS_ and eBOSS_ spectroscopic data. * photoop_ : tools for working with SDSS_ imaging data. This package affiliated with the astropy_ project and is registered with PyPI_. IDL® is a registered trademark of `Harris Geospatial Solutions`_. Components ++++++++++ Most of the functionality of PyDL is in sub-packages. .. toctree:: :maxdepth: 1 pydlutils.rst goddard.rst pydlspec2d.rst photoop.rst Other Notes +++++++++++ .. toctree:: :maxdepth: 1 templates.rst changes.rst todo.rst credits.rst licenses.rst Base API ++++++++ .. automodapi:: pydl :no-inheritance-diagram: .. _Python: http://python.org .. _`IDL®`: http://www.harrisgeospatial.com/SoftwareTechnology/IDL.aspx .. _idlutils: https://www.sdss.org/dr14/software/idlutils/ .. _SDSS: https://www.sdss.org .. _`Goddard utilities`: http://idlastro.gsfc.nasa.gov/ .. _idlspec2d: https://svn.sdss.org/public/repo/eboss/idlspec2d/trunk/ .. _BOSS: https://www.sdss.org/surveys/boss/ .. _eBOSS: https://www.sdss.org/surveys/eboss/ .. _photoop: https://svn.sdss.org/public/repo/sdss/photoop/trunk/ .. .. _astropy: http://www.astropy.org .. _PyPI: https://pypi.python.org/pypi/pydl/ .. _`PyDL on Read the Docs`: https://pydl.readthedocs.io/en/latest/ .. _SDSS-III: http://www.sdss3.org .. _`svn repository`: https://www.sdss.org/dr14/software/products/ .. _GitHub: https://github.com .. _`Harris Geospatial Solutions`: http://www.harrisgeospatial.com/ pydl-0.7.0/docs/pydlspec2d.rst0000644000076500000240000001002613434104050016626 0ustar weaverstaff00000000000000.. _pydl.pydlspec2d: ============================================ SDSS Spectroscopic Data (`pydl.pydlspec2d`) ============================================ Introduction ++++++++++++ This package provides functionality in the SDSS idlspec2d_ package. This package is used for processing and analyzing data from the SDSS optical spectrographs. 
The code is thus relevant to the `SDSS Legacy`_, BOSS_ and eBOSS_ surveys. This package does *not* work with any infrared spectrograph data associated with the APOGEE-2_ survey. The primary *technical* focus of this particular implementation is the function :func:`~pydl.pydlspec2d.spec2d.combine1fiber`. This function is responsible for resampling 1D spectra onto a new wavelength solution. This allows for: 1. Shifting a spectrum from observed redshift to rest frame. 2. Coaddition of spectra of the same object, after resampling all spectra onto the same wavelength solution. The primary *scientific* motivation of implementing :func:`~pydl.pydlspec2d.spec2d.combine1fiber` is to create template spectra based on curated spectra of, *e.g.*, luminous red galaxies (LRGs). Principal Component Analysis (PCA) or other techniques may be used to construct template spectra, but putting all spectra on the same rest-frame wavelength solution is the first step. The idlspec2d package is itself divided into a number of subpackages. Below we list the subpackages and the usability of the PyDL equivalent. The readiness levels are defined as: Obsolete No point in implementing because the purpose of the code lapsed many years ago. Not Applicable (NA) No point in implementing because another built-in or numpy/scipy/astropy package completely replaces this. None Not (yet) implemented at all. Rudimentary Only a few functions are implemented. Fair Enough functions are implemented to be useful, but some are missing. Good Pretty much anything you could do with the idlspec2d code you can do with the equivalent here. ========== =============== =================================================== Subpackage Readiness Level Comments ========== =============== =================================================== apo2d None Quick extraction code for quality assurance at observation time. config None Extraction pipeline configuration parameters in object-oriented IDL. fluxfix None Flux calibration guider None Interface to guider camera. inspect None Tools for manual inspection of spectra. photoz Obsolete Photometric redshifts for SDSS objects using spectral templates. plan None Tools for planning exposures and recordings summaries of exposures. plate Obsolete Tools for designing SDSS spectroscopic plates, especially for star clusters. science None Code for science analysis of sets of 1D spectra. spec1d Fair Tools for processing 1D spectra, including redshift fitting. spec2d Fair Tools for extracting spectra from 2D images. specdb Obsolete Tests on storing spectroscopic results in SQL databases. specflat None Flat-fielding of spectroscopic 2D images. templates None Tools for constructing spectroscopic templates. testsuite None Tools for high-level quality assurance, *e.g.* comparing two reductions of the same data. ========== =============== =================================================== .. _idlspec2d: https://svn.sdss.org/public/repo/eboss/idlspec2d/trunk/ .. _`SDSS Legacy`: https://classic.sdss.org/legacy/index.html .. _BOSS: https://www.sdss.org/surveys/boss/ .. _eBOSS: https://www.sdss.org/surveys/eboss/ .. _APOGEE-2: https://www.sdss.org/surveys/apogee-2/ API +++ .. automodapi:: pydl.pydlspec2d .. automodapi:: pydl.pydlspec2d.spec1d :skip: warn, solve, FontProperties, Pydlspec2dException, Pydlspec2dUserWarning .. 
automodapi:: pydl.pydlspec2d.spec2d :skip: warn, erf, get_pkg_data_filename, smooth, iterfit, djs_maskinterp, djs_median, sdss_flagval, traceset2xy, xy2traceset, vactoair pydl-0.7.0/docs/pydlutils.rst0000644000076500000240000001304313434104050016610 0ustar weaverstaff00000000000000.. _pydl.pydlutils: ================================= SDSS Utilities (`pydl.pydlutils`) ================================= Introduction ++++++++++++ This package provides functionality similar to idlutils_, a general suite of tools heavily used by SDSS_. idlutils_ is itself divided into a number of subpackages. Below we list the subpackages and the usability of the PyDL equivalent. The readiness levels are defined as: Obsolete No point in implementing because the purpose of the code lapsed many years ago. Not Applicable (NA) No point in implementing because another built-in or numpy/scipy/astropy package completely replaces this. None Not (yet) implemented at all. Rudimentary Only a few functions are implemented. Fair Enough functions are implemented to be useful, but some are missing. Good Pretty much anything you could do with the idlutils code you can do with the equivalent here. =========== =============== =================================================== Subpackage Readiness Level Comments =========== =============== =================================================== 2mass None For use with matching 2MASS catalogs to SDSS data. astrom None For use with SDSS astrometric data structures. Largely superseded by WCS. bspline Good Fitting B-splines to data, especially for resampling. cooling Good See :func:`pydl.pydlutils.cooling.read_ds_cooling`. coord Fair Some functionality already provided by :mod:`astropy.coordinates`. cosmography NA Tools for computing lookback time, angular sizes at cosmological distances, etc. Use :mod:`astropy.cosmology`. dimage None Interface to C code used for sky subtraction. djsphot None A simple aperture photometry code. dust None For use with the SFD galactic dust map. first None For use with matching FIRST catalogs to SDSS data. fits NA Use :mod:`astropy.io.fits`. healpix NA Interact with HEALPix data. Use healpy_. image Rudimentary Image manipulation functions. json NA Use :mod:`json` or other packages. mangle Fair Some work still required on polygon area calculations. math Fair Generic mathematical functions. Many are implemented in numpy or scipy. mcmc None But there are plenty of good Python MCMC packages out there. mglib Obsolete An IDL object-oriented configuration file reader. misc Fair General purpose utility functions. mpeg None Wrapper for :command:`ppmtompeg`, makes movies from data. mpfit None Appears to be an out-of-sync copy of the "markwardt" package in the `The IDL® Astronomy User's Libary`_. physics None Implementation of physical formulas, *e.g.* free-free scattering. plot None Much functionality already exists in matplotlib. psf Obsolete Point-spread function fitting. rgbcolor Good Some functionality is duplicated in :mod:`astropy.visualization`, especially :func:`~astropy.visualization.make_lupton_rgb`. rosat None For use with matching ROSAT catalogs to SDSS data. sdss Good Most important functionalities are bitmasks_ and reading `sweep files`_. slatec None Fit B-splines using C code. spheregroup Good Used for matching arbitrary RA, Dec coordinates to other arbitrary RA, Dec coordinates. TeXtoIDL NA This package is for including TeX in IDL plots. Since matplotlib understands TeX natively, this is not needed. 
trace Fair Used for fitting orthogonal functions to spectroscopic wavelength solutions. ukidss None Used for matching UKIDSS catalogs to SDSS data. wise None Used for matching WISE catalogs to SDSS data. yanny Good Tools for manipulating `SDSS parameter files`_. =========== =============== =================================================== .. _idlutils: https://www.sdss.org/dr14/software/idlutils/ .. _SDSS: https://www.sdss.org .. _`The IDL® Astronomy User's Libary`: http://idlastro.gsfc.nasa.gov/ .. _healpy: https://healpy.readthedocs.io/en/latest/ .. _bitmasks: https://www.sdss.org/dr14/algorithms/bitmasks/ .. _`sweep files`: https://data.sdss.org/datamodel/files/PHOTO_SWEEP/RERUN/calibObj.html .. _`SDSS parameter files`: https://www.sdss.org/dr14/algorithms/software/par/ API +++ .. automodapi:: pydl.pydlutils .. automodapi:: pydl.pydlutils.bspline :skip: warn, PydlutilsUserWarning, djs_reject, fchebyshev, uniq, flegendre, cholesky_banded, LinAlgError, cho_solve_banded .. automodapi:: pydl.pydlutils.cooling :skip: interp, get_pkg_data_contents .. automodapi:: pydl.pydlutils.coord :skip: get_juldate .. automodapi:: pydl.pydlutils.image .. automodapi:: pydl.pydlutils.mangle :skip: PydlutilsException, PydlutilsUserWarning .. automodapi:: pydl.pydlutils.math :skip: svd, djs_laxisnum, median .. automodapi:: pydl.pydlutils.misc .. automodapi:: pydl.pydlutils.rgbcolor :skip: warn .. automodapi:: pydl.pydlutils.sdss :skip: download_file, spherematch, uniq .. automodapi:: pydl.pydlutils.spheregroup :skip: warn, PydlutilsException, PydlutilsUserWarning, gcirc .. automodapi:: pydl.pydlutils.trace :skip: chebyt, FITS_rec, PydlutilsException, djs_reject, djs_laxisgen, flegendre .. automodapi:: pydl.pydlutils.yanny :skip: OrderedDict, PydlutilsException, PydlutilsUserWarning, Table pydl-0.7.0/docs/_templates/0000755000076500000240000000000013434104632016167 5ustar weaverstaff00000000000000pydl-0.7.0/docs/_templates/autosummary/0000755000076500000240000000000013434104632020555 5ustar weaverstaff00000000000000pydl-0.7.0/docs/_templates/autosummary/class.rst0000644000076500000240000000037312135272325022421 0ustar weaverstaff00000000000000{% extends "autosummary_core/class.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}pydl-0.7.0/docs/_templates/autosummary/base.rst0000644000076500000240000000037212135272325022225 0ustar weaverstaff00000000000000{% extends "autosummary_core/base.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}pydl-0.7.0/docs/_templates/autosummary/module.rst0000644000076500000240000000037412135272325022602 0ustar weaverstaff00000000000000{% extends "autosummary_core/module.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}pydl-0.7.0/docs/todo.rst0000644000076500000240000000135113434104050015523 0ustar weaverstaff00000000000000==== TODO ==== * Increase test coverage. * :mod:`pydl.pydlutils.mangle` needs more work. - Area (solid angle) calculation. - Area calculation needs to account for the ``use_caps`` attribute. - Intersection of caps with other caps. 
* Use numpy/scipy Cholesky tools - https://trac.sdss.org/browser/repo/sdss/idlutils/trunk/pro/bspline/cholesky_band.pro - https://docs.scipy.org/doc/scipy/reference/generated/scipy.linalg.cholesky.html - https://docs.scipy.org/doc/numpy/reference/generated/numpy.linalg.cholesky.html * Update ``astropy_helpers`` to v2.0.8. * Check ``groupdim``, ``groupsize`` in :func:`~pydl.pydlutils.math.djs_reject`. Make sure integer division works. * Document :class:`~pydl.pydlutils.bspline.bspline`. pydl-0.7.0/docs/Makefile0000644000076500000240000001074513064055440015502 0ustar weaverstaff00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest #This is needed with git because git doesn't create a dir if it's empty $(shell [ -d "_static" ] || mkdir -p _static) help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" clean: -rm -rf $(BUILDDIR) -rm -rf api -rm -rf generated html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Astropy.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Astropy.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." 
@echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/Astropy" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Astropy" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." make -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: @echo "Run 'python setup.py test' in the root directory to run doctests " \ @echo "in the documentation." pydl-0.7.0/docs/conf.py0000644000076500000240000001676213273057371015355 0ustar weaverstaff00000000000000# -*- coding: utf-8 -*- # Licensed under a 3-clause BSD style license - see LICENSE.rst # # Astropy documentation build configuration file. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this file. # # All configuration values have a default. Some values are defined in # the global Astropy configuration which is loaded here before anything else. # See astropy.sphinx.conf for which values are set there. # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. # sys.path.insert(0, os.path.abspath('..')) # IMPORTANT: the above commented section was generated by sphinx-quickstart, but # is *NOT* appropriate for astropy or Astropy affiliated packages. It is left # commented out with this explanation to make it clear why this should not be # done. If the sys.path entry above is added, when the astropy.sphinx.conf # import occurs, it will import the *source* version of astropy instead of the # version installed (if invoked as "make html" or directly with sphinx), or the # version in the build directory (if "python setup.py build_sphinx" is used). # Thus, any C-extensions that are needed to build the documentation will *not* # be accessible, and the documentation will not build correctly. import datetime import os import sys try: import astropy_helpers except ImportError: # Building from inside the docs/ directory? 
if os.path.basename(os.getcwd()) == 'docs': a_h_path = os.path.abspath(os.path.join('..', 'astropy_helpers')) if os.path.isdir(a_h_path): sys.path.insert(1, a_h_path) # Load all of the global Astropy configuration from astropy_helpers.sphinx.conf import * # Get configuration information from setup.cfg try: from ConfigParser import ConfigParser except ImportError: from configparser import ConfigParser conf = ConfigParser() conf.read([os.path.join(os.path.dirname(__file__), '..', 'setup.cfg')]) setup_cfg = dict(conf.items('metadata')) # -- General configuration ---------------------------------------------------- # By default, highlight as Python 3. highlight_language = 'python3' # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.2' # To perform a Sphinx version check that needs to be more specific than # major.minor, call `check_sphinx_version("x.y.z")` here. # check_sphinx_version("1.2.1") # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns.append('_templates') # This is added to the end of RST files - a good place to put substitutions to # be used globally. rst_epilog += """ """ # -- Project information ------------------------------------------------------ # This does not *have* to match the package name, but typically does project = setup_cfg['package_name'] author = setup_cfg['author'] copyright = '{0}, {1}'.format( datetime.datetime.now().year, setup_cfg['author']) # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. __import__(setup_cfg['package_name']) package = sys.modules[setup_cfg['package_name']] # The short X.Y version. version = package.__version__.split('-', 1)[0] # The full version, including alpha/beta/rc tags. release = package.__version__ # -- Options for HTML output -------------------------------------------------- # A NOTE ON HTML THEMES # The global astropy configuration uses a custom theme, 'bootstrap-astropy', # which is installed along with astropy. A different theme can be used or # the options for this theme can be modified by overriding some of the # variables set in the global configuration. The variables set in the # global configuration are listed below, commented out. # Add any paths that contain custom themes here, relative to this directory. # To use a different custom theme, add the directory containing the theme. #html_theme_path = [] # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. To override the custom theme, set this to the # name of a builtin theme or the name of a custom theme in html_theme_path. #html_theme = None # Please update these texts to match the name of your package. html_theme_options = { 'logotext1': '', # white, semi-bold 'logotext2': 'PyDL', # orange, light 'logotext3': ':docs' # white, light } # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = '' # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = '' # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. 
#html_last_updated_fmt = '' # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". html_title = '{0} v{1}'.format(project, release) # Output file base name for HTML help builder. htmlhelp_basename = project + 'doc' # -- Options for LaTeX output ------------------------------------------------- # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [('index', project + '.tex', project + u' Documentation', author, 'manual')] # -- Options for manual page output ------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [('index', project.lower(), project + u' Documentation', [author], 1)] # -- Options for the edit_on_github extension --------------------------------- if eval(setup_cfg.get('edit_on_github')): extensions += ['astropy_helpers.sphinx.ext.edit_on_github'] versionmod = __import__(setup_cfg['package_name'] + '.version') edit_on_github_project = setup_cfg['github_project'] if versionmod.version.release: edit_on_github_branch = "v" + versionmod.version.version else: edit_on_github_branch = "master" edit_on_github_source_root = "" edit_on_github_doc_root = "docs" # -- Resolving issue number to links in changelog ----------------------------- github_issues_url = 'https://github.com/{0}/issues/'.format(setup_cfg['github_project']) # -- Turn on nitpicky mode for sphinx (to warn about references not found) ---- # # nitpicky = True # nitpick_ignore = [] # # Some warnings are impossible to suppress, and you can list specific references # that should be ignored in a nitpick-exceptions file which should be inside # the docs/ directory. The format of the file should be: # # # # for example: # # py:class astropy.io.votable.tree.Element # py:class astropy.io.votable.tree.SimpleElement # py:class astropy.io.votable.tree.SimpleElementWithContent # # Uncomment the following lines to enable the exceptions: # # for line in open('nitpick-exceptions'): # if line.strip() == "" or line.startswith("#"): # continue # dtype, target = line.split(None, 1) # target = target.strip() # nitpick_ignore.append((dtype, six.u(target))) pydl-0.7.0/docs/goddard.rst0000644000076500000240000000730713434104050016171 0ustar weaverstaff00000000000000.. _pydl.goddard: ================================== Goddard Utilities (`pydl.goddard`) ================================== Introduction ++++++++++++ This package provides functionality similar to the `The IDL® Astronomy User's Libary`_, sometimes called the "Goddard Utilities", maintained by Wayne Landsman and distributed with idlutils_. In general, functions that are needed by :mod:`pydl.pydlutils` or :mod:`pydl.pydlspec2d` are implemented, while functions that are *not* needed have much lower priority. The Goddard package is itself divided into a number of subpackages. Below we list the subpackages and the usability of the PyDL equivalent. The readiness levels are defined as: Obsolete No point in implementing because the purpose of the code lapsed many years ago. Not Applicable (NA) No point in implementing because another built-in or numpy/scipy/astropy package completely replaces this. None Not (yet) implemented at all. Rudimentary Only a few functions are implemented. Fair Enough functions are implemented to be useful, but some are missing. 
Good Pretty much anything you could do with the Goddard code you can do with the equivalent here. ============= =============== =================================================== Subpackage Readiness Level Comments ============= =============== =================================================== astro Rudimentary General astronomical utility functions. astrom NA Tools for manipulating WCS data in FITS headers. Use :mod:`astropy.io.fits` and :mod:`astropy.wcs`. coyote NA The `Coyote library`_ for plotting and graphics developed by David Fanning. database None Allows access to IDL-specific databases. disk_io None Provides access to IRAF image (``.imh``) files and AJ/ApJ-style tables. fits NA Use :mod:`astropy.io.fits`. fits_bintable NA Use :mod:`astropy.io.fits`. fits_table NA Use :mod:`astropy.io.fits`. idlphot None Adapted from an early version of DAOPHOT. image None Generic image processing functions, including convolution/deconvolution. jhuapl None `Functions from the JHU Applied Physics Lab`_. markwardt None Levenberg-Marquardt least-squares minimization. math Rudimentary Generic mathematical functions. Many are implemented in numpy or scipy. misc Rudimentary General utility functions that do not involve astronomy specifically. plot NA Functions that supplement the built-in IDL plotting capabilities. robust None Robust statistical fitting procedures. sdas None Provides access to `STDAS/GEIS`_ image files. sockets NA Functions for performing web queries in IDL. Use astroquery_. structure NA Tools for manipulating IDL data structures. Use :class:`numpy.recarray`. tv NA Functions for manipulating IDL image displays. ============= =============== =================================================== .. _`The IDL® Astronomy User's Libary`: http://idlastro.gsfc.nasa.gov/ .. _idlutils: https://www.sdss.org/dr14/software/idlutils/ .. _`Coyote library`: http://www.idlcoyote.com/ .. _`Functions from the JHU Applied Physics Lab`: http://fermi.jhuapl.edu/s1r/idl/idl.html .. _`STDAS/GEIS`: http://www.stsci.edu/instruments/wfpc2/Wfpc2_dhb/intro_ch24.html#1905747 .. _astroquery: https://astroquery.readthedocs.io/en/latest/ API +++ .. automodapi:: pydl.goddard .. automodapi:: pydl.goddard.astro :skip: time .. automodapi:: pydl.goddard.math :skip: legendre .. automodapi:: pydl.goddard.misc pydl-0.7.0/docs/licenses.rst0000644000076500000240000000055013434104050016363 0ustar weaverstaff00000000000000******** Licenses ******** PyDL License ============ PyDL is licensed under a 3-clause BSD style license: .. include:: ../licenses/LICENSE.rst .. Other Licenses .. ============== .. Full licenses for third-party software astropy is derived from or included .. with Astropy can be found in the ``'licenses/'`` directory of the source .. code distribution. pydl-0.7.0/docs/make.bat0000644000076500000240000001064112135272325015443 0ustar weaverstaff00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. 
qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Astropy.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Astropy.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. 
echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end pydl-0.7.0/docs/changes.rst0000644000076500000240000002011113434104174016170 0ustar weaverstaff00000000000000============== PyDL Changelog ============== 1.0.0 (unreleased) ------------------ *This version will only support Python 3 and Astropy 3.* This release is planned for early 2019. 0.7.0 (2019-02-22) ------------------ *This version is planned to be the last version with Python 2 support.* * Support the ``firstField`` bit in ObjIDs from DR7 and earlier (Issue `#37`_). * Change tests of Astropy development version from Python 2 to Python 3. * Update to `astropy_helpers`_/v2.0.6 (PR `#40`_). * Add :mod:`astropy.units` support to :func:`~pydl.goddard.astro.airtovac` and :func:`~pydl.goddard.astro.vactoair` (PR `#41`_). * Change Exelis to Harris Geospatial (PR `#42`_). * Fix ``FutureWarning`` in ``re`` in Python 3.7 due to nested sets (PR `#44`_). * Use ``six`` instead of ``astropy.extern.six`` (PR `#48`_). * Update ``_astropy_init.py`` (PR `#47`_ via PR `#48`_). * Update `astropy_helpers`_/v2.0.8 (PR `#45`_ via PR `#48`_). .. _`#37`: https://github.com/weaverba137/pydl/issues/37. .. _`#40`: https://github.com/weaverba137/pydl/pull/40 .. _`#41`: https://github.com/weaverba137/pydl/pull/41 .. _`#42`: https://github.com/weaverba137/pydl/pull/42 .. _`#44`: https://github.com/weaverba137/pydl/pull/44 .. _`#45`: https://github.com/weaverba137/pydl/pull/45 .. _`#47`: https://github.com/weaverba137/pydl/pull/47 .. _`#48`: https://github.com/weaverba137/pydl/pull/48 0.6.0 (2017-09-19) ------------------ * This release is compatible with Astropy 2.0, and may be backwards incompatible with astropy v1.x. * Update to `astropy_helpers`_/v2.0.1. * Use standard library :mod:`argparse` (Issue `#31`_). * Use the new :class:`astropy.coordinates.Attribute` class. * Fix typo (PR `#26`_). .. _`#31`: https://github.com/weaverba137/pydl/issues/31. .. _`#26`: https://github.com/weaverba137/pydl/pull/26 0.5.4 (2017-05-04) ------------------ * Added :func:`~pydl.pydlutils.sdss.sdss_specobjid` to compute SDSS specObjIDs, and its inverse function :func:`~pydl.pydlutils.sdss.unwrap_specobjid`. * Update to `astropy_helpers`_/v1.3.1. * Refactor HMF code into an object to contain the data and methods. * Use functions from :mod:`astropy.utils.data` where possible. * Fix an integer division error encountered when using Numpy 1.12 (Issue `#19`_). * Fixed tests that were failing on 32-bit platforms *and* Python 3.5 (Issue `#20`_). .. _`#19`: https://github.com/weaverba137/pydl/issues/19 .. _`#20`: https://github.com/weaverba137/pydl/issues/20 0.5.3 (2016-12-03) ------------------ * Fixed formatting of TODO document. * Fixed tests that were failing on 32-bit platforms (Issue `#14`_). * Use temporary files so that tests can run when astropy is installed read-only (*e.g.*, with :command:`pip`; Issue `#16`_) .. _`#14`: https://github.com/weaverba137/pydl/issues/14 .. _`#16`: https://github.com/weaverba137/pydl/issues/16 0.5.2 (2016-08-04) ------------------ * Changes in how Mangle-polygon containing FITS files are handled, related to Issue `#11`_. * Fixed memory leak in :func:`~pydl.pydlspec2d.spec2d.combine1fiber`, see Issue `#12`_. * Added :func:`~pydl.pydlutils.mangle.is_in_window`. * Allow polygon area functions to deal with negative caps and ``use_caps``. * Update ``docs/conf.py`` for Python 3.5 compatibility (PR `#13`_). .. _`#13`: https://github.com/weaverba137/pydl/pull/13 .. 
_`#11`: https://github.com/weaverba137/pydl/issues/11 .. _`#12`: https://github.com/weaverba137/pydl/issues/12 0.5.1 (2016-06-22) ------------------ * Removed unnecessary ``from __future__`` import in :mod:`pydl.pydlspec2d.spec1d`. * Ongoing documentation upgrades. * Update some links that needed to be transitioned from SDSS-III to SDSS-IV. * Upgrade to `astropy_helpers`_/v1.2. * Update to latest version of package-template_. * Disabled tests on Python 3.3; enabled tests on Python 3.5 * Fix Issue `#8`_; Issue `#9`_. * Add warnings about incomplete Mangle functions. .. _`#8`: https://github.com/weaverba137/pydl/issues/8 .. _`#9`: https://github.com/weaverba137/pydl/issues/9 0.5.0 (2016-05-01) ------------------ * Dropped support for Python 2.6. Python 2.6 does not contain :class:`collections.OrderedDict`, which is needed to support :class:`~pydl.pydlutils.yanny.yanny` objects, and at this point it is not worth going to the trouble to support this with an external package. * Ongoing review and upgrade of docstrings. * Yanny files can now be converted into *genuine* NumPy :class:`record arrays `; previously, the conversion was only to :class:`numpy.ndarray` with named columns, which is a slightly different thing. * Added additional tests on :class:`~pydl.pydlutils.yanny.yanny` objects. * Experimental support for interconversion of :class:`~pydl.pydlutils.yanny.yanny` objects and :class:`~astropy.table.Table` objects. * Improving `PEP 8`_ compliance * Restructuing sub-packages to reduce the number of files. * Improvements to spectral template processing code, deduplicated some code. * Support platform-independent home directory (PR `#7`_). * Uppercase the package name (in documentation only). * Upgrade to `astropy_helpers`_/v1.1.1. * Add functions from the idlutils rgbcolor directory. * :func:`~pydl.pydlspec2d.spec1d.spec_path` can now find SDSS spectra, not just BOSS. .. _`PEP 8`: https://www.python.org/dev/peps/pep-0008/ .. _`#7`: https://github.com/weaverba137/pydl/pull/7 0.4.1 (2015-09-22) ------------------ * No changes at all from 0.4.0. This tag only exists because of a botched PyPI upload. 0.4.0 (2015-09-22) ------------------ * Use `astropy_helpers`_/v1.0.3, package-template_/v1.0. * Remove some old FITS code that :mod:`astropy.io.fits` makes moot. * Remove code for command-line scripts. These are now auto-generated by the "entry_point" method. * Remove Python/3.2 tests. * Improved test coverage. * Fixed problem with the :mod:`~pydl.pydlutils.spheregroup` code. * Removed some code that is 100% redundant with astropy (*e.g.* ``angles_to_xyz()``). * Fixed bug in :func:`~pydl.pydlutils.mangle.set_use_caps` that was discovered on the IDL side. * Updated documentation of :func:`~pydl.pydlutils.mangle.read_fits_polygons`. * Added cross-references to classes, functions, etc. 0.3.0 (2015-02-20) ------------------ * Use `astropy_helpers`_/v0.4.3, package-template_/v0.4.1. * Avoided (but did not fix) a bug in :class:`~pydl.pydlutils.spheregroup.chunks` that occurs when operating on a list of coordinates of length 1. * Fixed a typo in :class:`~pydl.pydlutils.bspline.bspline`, added documentation. * Simplify documentation files. * :func:`~pydl.pydlutils.sdss.sdss_flagname` now accepts more types of numeric input. * Added :doc:`credits` file. 0.2.3 (2014-07-22) ------------------ * Added :mod:`pydl.photoop.window`. * Added stub :func:`~pydl.photoop.sdssio.sdss_calib`, updated :func:`~pydl.photoop.window.sdss_score`. * Added :func:`~pydl.photoop.photoobj.unwrap_objid`. 
* Merged pull request #4, fixing some Python3 issues. 0.2.2 (2014-05-07) ------------------ * Updated to latest package-template_ version. * Added ability to `write multiple ndarray to yanny files`_. * Fixed :func:`~pydl.pydlutils.misc.struct_print` test for older Numpy versions. * Fixed failing yanny file test. * Improve test infrastructure, including Travis builds. * Allow comment characters inside quoted strings in yanny files. 0.2.1 (2013-10-06) ------------------ * Added :func:`~pydl.pydlutils.sdss.sdss_sweep_circle`. * Added first few :mod:`pydl.photoop` functions. * Clean up some import statements. 0.2.0 (2013-04-22) ------------------ * Using the astropy package-template_ to bring pydl into astropy-compatible form. * Some but not all tests are re-implemented. 0.1.1 (2013-03-06) ------------------ * Creating a tag representing the state immediately after creation of the `git repository`_. 0.1 (2010-11-10) ---------------- * Initial tag (made in svn, not visible in git). Visible at http://www.sdss3.org/svn/repo/pydl/tags/0.1 . .. _`astropy_helpers`: https://github.com/astropy/astropy-helpers .. _package-template: https://github.com/astropy/package-template .. _`git repository`: https://github.com/weaverba137/pydl .. _`write multiple ndarray to yanny files`: https://github.com/weaverba137/pydl/pull/3 pydl-0.7.0/docs/photoop.rst0000644000076500000240000001113613434104050016250 0ustar weaverstaff00000000000000.. _pydl.photoop: =================================== SDSS Imaging Data (`pydl.photoop`) =================================== Introduction ++++++++++++ The photoop_ package is used to process SDSS imaging data. This package is used to reduce_, resolve_ and flux-calibrate_ the SDSS raw data, resulting in both flux-calibrated images and catalogs. SDSS ceased taking imaging data in 2009, and there has only been one full processing of the imaging data since then, although adjustments have been made to the astrometry and flux calibration. The primary emphasis of this implementation is: 1. Functions related to the SDSS photometric ``objID``, which is a unique integer used in SDSS databases, constructed from quantities that specify a particular astronomical object on a particular image. 2. Functions related to finding SDSS photometric data on disk. 3. The SDSS "window" function, which defines what parts of the sky are covered by SDSS images. This can be used in conjunction with the :mod:`~pydl.pydlutils.mangle` module to find points and regions that have SDSS imaging. 4. In general, functions that work with existing imaging data, rather than functions to reduce the data. The photoop package is itself divided into a number of subpackages. Below we list the subpackages and the usability of the PyDL equivalent. The readiness levels are defined as: Obsolete No point in implementing because the purpose of the code lapsed many years ago. Not Applicable (NA) No point in implementing because another built-in or numpy/scipy/astropy package completely replaces this. None Not (yet) implemented at all. Rudimentary Only a few functions are implemented. Fair Enough functions are implemented to be useful, but some are missing. Good Pretty much anything you could do with the photoop code you can do with the equivalent here. =========== =============== =================================================== Subpackage Readiness Level Comments =========== =============== =================================================== apache Obsolete Processing of "Apache Wheel" images, used for calibration. 
astrom None Astrometry for SDSS images. atlas None Construction of "atlas" images, small cutouts of individual objects. bluetip None Tools for "blue-tip" photometry and extinction estimation. compare Obsolete Compare the same images in two different data reductions. database Obsolete Experimental database loading code. flats None Analysis of flat-field files. hoggpipe Obsolete Another version of "Apache Wheel" processing code. image None Tools for creating "`corrected frame`_" images. These are flux-calibrated and sky-subtracted images with physical flux units. ircam Obsolete Tools for processing all-sky "cloud camera" images, used to establish photometricity. match None Tools for matching SDSS spectra to corresponding photometric objects. misc None Code with no obvious home in any other category. pcalib None Tools related to "ubercalibration_". photoobj Rudimentary Tools for creating calibrated catalogs from images. plan Obsolete Tools for planning photometric reductions. plots None Plots for high-level quality assurance. psf None Analysis of point-spread functions. ptcalib Obsolete Processing of "photometric telescope" data, an obsolete technique for flux-calibration. resolve None Code for the resolve_ stage of image processing. sdss3_runqa Obsolete Quality assurance tests from the most recent photometric reduction. sdssio Rudimentary Tools for reading and writing various data files produced by the photometric reductions. window Rudimentary Tools for determining the sky coverage of the survey. =========== =============== =================================================== .. _photoop: https://svn.sdss.org/public/repo/sdss/photoop/trunk/ .. _reduce: https://www.sdss.org/dr14/imaging/pipeline/ .. _resolve: https://www.sdss.org/dr14/algorithms/resolve/ .. _flux-calibrate: https://www.sdss.org/dr14/algorithms/fluxcal/ .. _`corrected frame`: https://data.sdss.org/datamodel/files/BOSS_PHOTOOBJ/frames/RERUN/RUN/CAMCOL/frame.html .. _ubercalibration: https://www.sdss.org/dr14/algorithms/fluxcal/ API +++ .. automodapi:: pydl.photoop .. automodapi:: pydl.photoop.photoobj .. automodapi:: pydl.photoop.sdssio .. automodapi:: pydl.photoop.window :skip: warn, sdss_name, sdss_calib, sdss_flagval, set_use_caps pydl-0.7.0/docs/notebooks/0000755000076500000240000000000013434104632016035 5ustar weaverstaff00000000000000pydl-0.7.0/docs/notebooks/Vacuum Wavelength Conversions.ipynb0000644000076500000240000043513713301642157024735 0ustar weaverstaff00000000000000{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Vacuum Wavelength Conversions\n", "\n", "This notebook compares various air to vacuum conversion formulas.\n", "\n", "* [Greisen *et al.* (2006)](http://adsabs.harvard.edu/abs/2006A%26A...446..747G) cites [International Union of Geodesy and Geophysics (1999)](http://www.iugg.org/assemblies/1999birmingham/1999crendus.pdf)\n", " - This version is used by [specutils](https://github.com/astropy/specutils).\n", " - Specifically, this is based on the *phase* refractivity of air, there is a slightly different formula for the *group* refractivity of air.\n", "* [Ciddor (1996)](http://adsabs.harvard.edu/abs/1996ApOpt..35.1566C)\n", " - Used by [PyDL](https://github.com/weaverba137/pydl) via the [Goddard IDL library](http://idlastro.gsfc.nasa.gov/).\n", " - This is the standard used by SDSS, at least since 2011. 
Prior to 2011, the Goddard IDL library used the IAU formula (below) plus an approximation of its inverse for vacuum to air.\n", "* [The wcslib *code*](https://github.com/astropy/astropy/blob/master/cextern/wcslib/C/spx.c) uses the formula from Cox, *Allen’s Astrophysical Quantities* (2000), itself derived from [Edlén (1953)](http://adsabs.harvard.edu/abs/1953JOSA...43..339E), even though Greisen *et al.* (2006) says, \"The standard relation given by Cox (2000) is mathematically intractable and somewhat dated.\"\n", " - Interestingly, this is the **IAU** standard, adopted in 1957 and again in 1991. No more recent IAU resolution replaces this formula.\n", "\n", "This would be a bit of a deep dive, but it would be interesting to see if these functions are based on measurements of the refractive index of air, *or* on explicit comparison of measured wavelengths in air to measured wavelengths in vacuum.\n", "\n", "As shown below, the Greisen formula gives consistently larger values when converting air to vacuum. The Ciddor and wcslib values are almost, but not quite, indistinguishable. The wcslib formula has a singularity at a value less than 2000 Å. The Ciddor formula probably has a similar singularity, but it explicitly hides this by not converting values less than 2000 Å." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "```\n", "int waveawav(dummy, nwave, swave, sawav, wave, awav, stat)\n", "\n", "double dummy;\n", "int nwave, swave, sawav;\n", "const double wave[];\n", "double awav[];\n", "int stat[];\n", "\n", "{\n", " int status = 0;\n", " double n, s;\n", " register int iwave, k, *statp;\n", " register const double *wavep;\n", " register double *awavp;\n", "\n", " wavep = wave;\n", " awavp = awav;\n", " statp = stat;\n", " for (iwave = 0; iwave < nwave; iwave++) {\n", " if (*wavep != 0.0) {\n", " n = 1.0;\n", " for (k = 0; k < 4; k++) {\n", " s = n/(*wavep);\n", " s *= s;\n", " n = 2.554e8 / (0.41e14 - s);\n", " n += 294.981e8 / (1.46e14 - s);\n", " n += 1.000064328;\n", " }\n", "\n", " *awavp = (*wavep)/n;\n", " *(statp++) = 0;\n", " } else {\n", " *(statp++) = 1;\n", " status = SPXERR_BAD_INSPEC_COORD;\n", " }\n", "\n", " wavep += swave;\n", " awavp += sawav;\n", " }\n", "\n", " return status;\n", "}\n", "\n", "int awavwave(dummy, nawav, sawav, swave, awav, wave, stat)\n", "\n", "double dummy;\n", "int nawav, sawav, swave;\n", "const double awav[];\n", "double wave[];\n", "int stat[];\n", "\n", "{\n", " int status = 0;\n", " double n, s;\n", " register int iawav, *statp;\n", " register const double *awavp;\n", " register double *wavep;\n", "\n", " awavp = awav;\n", " wavep = wave;\n", " statp = stat;\n", " for (iawav = 0; iawav < nawav; iawav++) {\n", " if (*awavp != 0.0) {\n", " s = 1.0/(*awavp);\n", " s *= s;\n", " n = 2.554e8 / (0.41e14 - s);\n", " n += 294.981e8 / (1.46e14 - s);\n", " n += 1.000064328;\n", " *wavep = (*awavp)*n;\n", " *(statp++) = 0;\n", " } else {\n", " *(statp++) = 1;\n", " status = SPXERR_BAD_INSPEC_COORD;\n", " }\n", "\n", " awavp += sawav;\n", " wavep += swave;\n", " }\n", "\n", " return status;\n", "}\n", "```" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "%matplotlib inline\n", "import numpy as np\n", "import matplotlib\n", "import matplotlib.pyplot as plt\n", "from matplotlib.font_manager import fontManager, FontProperties\n", "import astropy.units as u\n", "from pydl.goddard.astro import airtovac, vactoair\n", "wavelength = np.logspace(3,5,200) * u.Angstrom" ] }, { "cell_type": "code", 
"execution_count": 2, "metadata": {}, "outputs": [], "source": [ "def waveawav(wavelength):\n", " \"\"\"Vacuum to air conversion as actually implemented by wcslib.\n", " \"\"\"\n", " wave = wavelength.to(u.m).value\n", " n = 1.0\n", " for k in range(4):\n", " s = (n/wave)**2\n", " n = 2.554e8 / (0.41e14 - s)\n", " n += 294.981e8 / (1.46e14 - s)\n", " n += 1.000064328\n", " return wavelength / n\n", "\n", "def awavwave(wavelength):\n", " \"\"\"Air to vacuum conversion as actually implemented by wcslib.\n", " \n", " Have to convert to meters(!) for this formula to work.\n", " \"\"\"\n", " awav = wavelength.to(u.m).value\n", " s = (1.0/awav)**2\n", " n = 2.554e8 / (0.41e14 - s)\n", " n += 294.981e8 / (1.46e14 - s)\n", " n += 1.000064328\n", " return wavelength * n" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "#\n", "# These functions aren't defined until specutils 0.3.x. However specviz needs 0.2.2.\n", "#\n", "def air_to_vac(wavelength):\n", " \"\"\"\n", " Implements the air to vacuum wavelength conversion described in eqn 65 of\n", " Griesen 2006\n", " \"\"\"\n", " wlum = wavelength.to(u.um).value\n", " return (1+1e-6*(287.6155+1.62887/wlum**2+0.01360/wlum**4)) * wavelength\n", "\n", "def vac_to_air(wavelength):\n", " \"\"\"\n", " Griesen 2006 reports that the error in naively inverting Eqn 65 is less\n", " than 10^-9 and therefore acceptable. This is therefore eqn 67\n", " \"\"\"\n", " wlum = wavelength.to(u.um).value\n", " nl = (1+1e-6*(287.6155+1.62887/wlum**2+0.01360/wlum**4))\n", " return wavelength/nl" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "greisen_a2v = air_to_vac(wavelength) / wavelength - 1.0\n", "ciddor_a2v = airtovac(wavelength) / wavelength - 1.0\n", "wcslib_a2v = awavwave(wavelength) / wavelength - 1.0\n", "good = (greisen_a2v > 0) & (ciddor_a2v > 0) & (wcslib_a2v > 0)" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAA3gAAANNCAYAAADF2dxQAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAIABJREFUeJzs3Xt8XFW5+P/Pk2m5BEiKlEthaCgmVEBuBbmICFS5HEVF4QcKnCPaYyvw9YKgHDxeUAQPKHhHG08AxRseRY4CxwtYtAhyR1EKJNBYB8pNIAFqCyTr98feCZNp0qbtpBOmn/frtV+TrP3svZ/ZM5PXPFlrrx0pJSRJkiRJL38NtU5AkiRJklQdFniSJEmSVCcs8CRJkiSpTljgSZIkSVKdsMCTJEmSpDphgSdJkiRJdcICT5IkSZLqhAWeJEmSJNUJCzxJkiRJqhMWeJLWCRGxa0RcEhELI2JpRDwbEXdExMci4hW1zm+sRcSlEdFd6zzWVETsERG/i4ieiEgR8eFRbDM5Ipbl8XuNELPG5yciTo6IE9dkHxX72zwino+IH60gpikilkTEz6t13JeziDgwIv4nIh7Kz11PRPwhIuZERGOt81tbIuJzEfFirfOQVBuRUqp1DpI0piLifcBFwH354z3ARGAv4H3An1JKb69dhmMvIl4JNKWU7qx1LmsiIu4ENgI+BDwFdKeUHlnJNqcCF+a/fiuldNIwMWt8fiLiL8ATKaWDVncfw+zzJ8ARwJSU0lPDrJ8NzAWOTCn9b7WO+3IUEecAHwduAC4BuoCNgdcCs4HvpJQ+WrsM156IKALbpJRurnUuktY+CzxJdS0i9gPmA78h+xK8rGL9esDhKaW67AGJiMaU0pJa51EtEfEC8O2U0smrsM3dwBbA34AdyIqlf67GsQPYYKRtx6jA+xfgGuADKaWvD7P+j8B2QDGltM722ETEu4AfAO3A+1PFl5uIaAL2SSn9phb5VUP+t6p/XX6dJY2OQzQl1buPAwmYXVncAaSUni8v7iKiIR+2eW8+rO+xiPhu/h9xyuKuj4i/RMR+EXFjRPwzIroj4j35+jfnQ0CXRMTdEXF4xfZn5UMG94iIKyKiNx9O9r2I2Lwi9tiI+HVELM6PsyAi/isiNqqIuzQferpLHv8McF3Zuu6K+P8vIm7Oj7skIh6MiIsrYqbmOT2Wn48FEXFaRDSUxWyXP5fTI+Ij+TDYZyPipojYdzQvUkS8OiL+NyKeyofQ3hUR7y5bf2JEJGACcFJ+vJX+hzIi9gFeDVwGfBtoBo4aJm6485Mi4usR8f6IWAAsA95duW0e2w3sDBw4kFv5/kZzHkfwK6AEvGeYY+4I7AN8d+BLf0TskD+Xrvw1fSgifh4ROw+z/aYR8aX8dV8WEY9GxNURsUO+/o3583hdxXatefsJZW03RMS1wxzjexHRNcy2p0bEmRHxtzzPefm69SLiC/l7/en8s7F55X6H8SngH8CHKos7gJRSb3lxFxEbRsR5+Wf2+YgoRcTXIqK5Iv9SRFwZEW+KiDvLPn/l78298ue03HsjIt6Sr3tTWdv0iPhR2Xvhnog4qWK7gXN/XP4aPQwsBbaLiI0i4sJ4abj5kxFxa0QcU7b9ckM0I6IQEf8REfeVvd7fiYitK+JuyD9/+0Q2vHVJRDwQ2d/FqNjfp/L9/TN/vf4cEf9vRS+UpLE3odYJSNJYiYgCMBO4PaX091Fu9k2y4VxfB64i6x05GzgoImaklJ4oi92KbCjY+WRfwj8AXBwR2wJHA+cCPWRfPq+MiO1TSg9XHO9nwI+Bb5EVCGcDO0XEPimlF/KYNrJenC8DzwGvAs4A9s6fX7n1gJ+TDdv7L0b4Ox9Zz+bl+XIW2ZfHlvL95V+sb8z3+Umgm2y44BeBVwKVvWinAPcCA9fFnQ1cExHTUko9w+WRH2d6fpzHgA+SfVE/Abg0IrZMKZ0PXA3sB9wE/AS4YKT9VZiVP14M/J3sHM4CvjfK7Y8EDgA+CzyS5zict+d59fDSeVkGq3UeB6WU+iPiUuATEbFbSulPZasHir7yonwb4HHgY8ATwGbAicAt+fZdeU7NeU5FsvfJLcAmwEFk7+v7R8qpSj4E3EX23DclG0L7c+BOYEme8/Zk52gu8I6RdpR/3l4FfD+ltHRlB86L6p8DB5J9Rv8A7E72Odg3IvZPKT1ftskMss/4f5G9/nPI3pudKaUbU0q3RdZL/B7gOxWHOxFYTFaoExGvzo+3EPgI8CjwJuDrEfGKlNI5Fdufn8fPzn//B/AV4J3Af5Kdw42BXche6xVpz3P8Gtnfk2lkn9HXR8SeKaUny2K3Br5L9jn7NNnfs/PI/s79II85k+z9fDbZsNj1yF6HTVeSh6SxllJycXFxqcsF2JKs9+6Ho4x/VR7/jYr2vfP2c8rars/b9ixrewXwItkX1K3L2nfLYz9Q1nZW3nZhxbGOy9uPHyHHICvaXp/H7Vq27tK87T3DbHcp2fVqA7+flsc2r+B8fD6P2bui/SKgH9gh/327PO7PQKEs7jV5+ztXct5/SFZgblvRfg1ZQdtc1paAr4/y9WwkK7huqjgP/cArV3R+yo71NLDpKI/3F+D61T2PK9jvtDzuK2VtE8gKhxtWsm2B7HrTB4Hzy9o/k+d08Aq2fWMe87qK9ta8/YSythuAa4fZx/eArmG2vZ38MpGK9+NPK7b/Wv7cG1eQ5/75tmeP8nV6cx5/6gifvfeUtZXy9+A2ZW0b5u+Lr5e1nVr5viIruJYB/1XWdi3ZUOFNKo79TbK/G80V5/66YfJfAPzPSp7j54AXy37fOd/fVyriXpu3f6bitewHZpS1Bdk/b64qa/s/4NbRnHMXF5e1uzhEU5JecnD+eGl5Y0rpFrIvVW+oiF+cUrq9LO5Jsv/w35WG9tQtyB9bhjnm9yt+/zFZkTiQCxGxfUT8ICIeAfqAF4Df5at3HGafPx2mrdKtA8eLiGMiYpthYmYC9+TPv9ylZF/4KnsPr04p9ZX9/uf8cbjnXXmc69LyvayXkhVp+61k+5EcAzQxtIfrYrLclxvyOILfpmEmN1lFq3oeh0gpLQTmAcdHdh0WwL+Q9bRVDqmdGBGfyIf9PU/2XnqerEgsf6/8S57TvNV7Smvs6pRS+VDKgc/I1RVxC8jO0dQqHnvgfF9a0f4jsn80VH7O70gpPTTwS8quwexk6Pv6e2TnunyY5vFkvVqXAEQ2pPogss/nPyNiwsBC9s+MDcn+mVRuuM/yLcAREXFuZLOGbjjyUx007HNOKd2YP5fK5/xQSumOsrgE3M3Q53wLsGdkw5gPjew6R0njgAWepHr2BNl/xaeNMn5giNPiYdY9zPJDoJ4cJu75yvb00nCvDYaJf6Qi9kWyYVibAUTExmSTxOwDfILsC+JreGnIWuWXuyUppd5hjjNESun3ZMMPJ5ANxSpFdk3hu8rCNmPkczGwvtw/Ko4xcM3jyr6ArupxRmsW2Rf2X0bEpIiYRFZ0dgMn5kN4V2a4vFZVNZ5fRx731vz39wDPkv
1DoNxXyHqHryAbBroP2fvlrwx9HTYn652qlcrPzvMraR/uszNgUf64Kp/zZZWFe0qpn2zI5Arf17lllJ3PlNLjZEO63x0vXVd5InBjSum+/PfJZD2qp5L9k6Z8+XlZTLnh3jenkA1dPYpsJMGTEfGzyGaCHcmq/m1b6XMm6yX8GPA64JfAPyLi2oiYsYI8JK0FFniS6lbem3Qd2X+ZiyuL56UvNVOGWbc1WcFYbVuV/5L/N3+zslxm5sd+b0rpv1NKv08p3QY8M8L+Rj01ckrpf1NKbyCbeOQg8utr8uvzyHMY6VxA9c5H1Y8T2UQhryMrDBaR3VJhYNmO7Fq1w0axq2pMNV2N53cFWe7vza/pOwK4PKX0bEXcCcAlKaVPpJR+nVK6JX+/VBYOj5Ndf7ciA9ezrV/RXrmvgdjKuJFiqyrv+V0AHBYRKyoEB/wDWD8ihlwrlhdmW7L67+tLyHoaZ0bErsAeeduAJ8mGPnaQFd3DLb+q2OdwE8Y8m1L6ZEppOtn76mSyYaoruk1G1f+2pZReSCl9MaW0O9nfrOPIPlu/GuXrIGmMWOBJqnefJxvi9e2y4W2D8iFtb8l//W3+eEJFzGvIhrddNwb5HV/x+zFkvWrX578PfMGrnAF0TrUSSCktSyn9jmziFsi+mEL2fHca5j/y/5bnVa3hfdeRfSneuqL938h6YP+4GvscmFzlfWTDXcuXN5H1mrx3tbIdWWUPx4A1Po8pmzzkB8ChZK/TRCqGZw6EUvFeiYi3kRUu5f4vz+nAFRy2O3/ctaL9rSyvG3hV+WcsL0RHNYtqFZxNVkx+uXymx7JcNomIN+a/DnyOT6gIO4bsHwKr+zn/P7Ie+ffkyxKySYwASCk9A/ye7PP1p5TSbcMsw40KGFFK6ZGU0iX5cXaOiOGKbBj5b9u+ZJM4rdHftpTSUyml/yG7lnAy1R1SK2kVOYumpLqWUropn4L8IuD2iPgm2XC1iWRftGaTTY7xi5TSfRHRDnwgIvrJvrBtR/bl8e/Al8YgxXfk05n/hpdm0fwTLw29u5Gs5+ZbEfEZssLkeLKJW1ZbRHyWrAfnOrKeu0lkMxuWX9/3JbIi5OqI+BTZ5BBvJusx+GZKqVozLX6GrEdqXp7Xk2TP8c3Ax9IKZuAcTt4L+m/AgpTSf48Q8wvgrRGxeT68rhruBt4ZEceSTWqyNKV0N9U7jx1kw/M+AtybXz9V6SpgVkR05vm8hmwCk4cq4i4kK2h+EREDs2huRDaz5M9SSvNTSqWIuJ5sBs9esp7QQ8iG9la6jKyoviwi/ptsCOgZZJORjLmU0g/zGSo/Tla4XgI8wEvXcM7Jc7yW7HN9LfDFfNjuTWSfp88At/HSLJGrmsOLEXEZ2Wy6zwI/yYu6ch8kG3L9+4j4Fllh3ERWZL0571FfoYi4DbiS7PV9iuzvxvHA79Mwt4LJc/trZLdAOTWvf3/FS7No/g346qo9W4iIa8hm8bydrEd4GtlzfzBfJNVKrWd5cXFxcVkbC9kXuEvJvswsI/sCdgfZl7rNy+IayK4ruY/s+p/Hyb4YFiv2dz3wl2GO003ZTHNl7UNmf+SlWTRnkF1/8wzQS/blcouKbfcjK/SeI5vE5dtkxWkCTiyLuxR4doTnfylDZ9F8M9nEDqX8fDxKNsFF5YyJU8kmgnkiPx/3AqcDDWUx2+W5nD7C8z5rFK/Pq/Pz8HSez13lz22k8zjCvt6Wx31oBTGH5TEfGe78jPZYFfEtZF+ce/Nty8/3Ss/jKI9xR77vj46wflOynr3H8vfL78lmSlxulkuyWV+/Sla4PZ+/B34OtJbFbE020cc/yIqJ7/DSrLInVOzvPWRDJf9JVnwcxcizaH64YtuBWSOPrGj/97x991Gen4PIblfxMNk/K3rIbjNwGrBxWVwj2S0I/pY/94fIbo3SXLG/EnDlMMcZadbQHfN8E3DQCDluTzZ0s5Qf+7F8f/+xsvORrzufrBB9Mj/XD5DdzuAVZTFDZtHM2wpktza4n5f+tn2Xshl/y57bXcMct/K1/Gh+bh8n+8z+jexWDNsO97xdXFzW3hIpVePyAknSqoiIs8juL7V5GnpvPUmSpNXmNXiSJEmSVCcs8CRJkiSpTjhEU5IkSZLqhD14kiRJklQnLPAkSZIkqU5Y4EmSJElSnfBG5+NIZHcf3ZrsfliSJEmS1m2bAA+nVZg4xQJvfNma7MankiRJkgRQBB4abbAF3vjyDMDf//53mpqaap2LJEmSpBrp7e1l2223hVUc3WeBNw41NTVZ4EmSJElaZU6yIkmSJEl1wgJPkiRJkuqEBZ4kSZIk1QmvwZMkSZLWEX19fbzwwgu1TkO59dZbj4aG6va5WeBJkiRJdS6lxCOPPMLTTz9d61RUpqGhgWnTprHeeutVbZ8WeJIkSVKdGyjutthiCxobG4mIWqe0zuvv7+fhhx9m8eLFTJ06tWqviQWeJEmSVMf6+voGi7vNNtus1umozOabb87DDz/Miy++yMSJE6uyTydZkSRJkurYwDV3jY2NNc5ElQaGZvb19VVtnxZ4kiRJ0jrAYZnjz1i8JhZ4kiRJklQnLPAkSZIkrZMOOuggPvzhD9c6jaqywJMkSZI0bj3yyCN86EMforW1lQ022IAtt9yS173udXzrW99iyZIla7TvK664grPPPrtKmY4PzqIpSZIkaVx68MEH2X///Zk0aRLnnnsuu+yyCy+++CL3338/F198MVtvvTVvfetbl9vuhRdeGNWslK94xSvGIu2asgdPkiRJ0qiVSiXmzZtHqVQa82OdfPLJTJgwgdtuu41jjjmGHXfckV122YWjjjqKq6++mre85S1ANlnJt771Ld72trex0UYb8bnPfQ6Ae+65hze96U1svPHGbLnllvzrv/4rTzzxxOD+K4doXnTRRbS1tQ32FB599NGD61JKnH/++Wy//fZsuOGG7LbbbvzkJz8ZXH/99dcTEVx33XXstddeNDY28trXvpb77rtvrE/TEBZ4kiRJkkalo6ODlpYWZs6cSUtLCx0dHWN2rH/84x/8+te/5pRTTmGjjTYaNqZ8FspPf/rTvO1tb+Puu+/mve99L4sXL+bAAw9k991357bbbuOXv/wljz76KMccc8yw+7rtttv44Ac/yGc/+1nuu+8+fvnLX/L6179+cP0nPvEJLrnkEr75zW/y17/+lVNPPZUTTjiB3/3ud0P285//+Z9ccMEF3HbbbUyYMIH3vve9VTgbo+cQTUmSJEkrVSqVmD17Nv39/QD09/czZ84cDjvsMIrFYtWP19XVRUqJ6dOnD2mfPHkyS5cuBeCUU07hvPPOA+C4444bUkx96lOfYsaMGZx77rmDbRdffDHbbrst999/PzvssMOQ/S5atIiNNtqII444gk022YSWlhb22GMPAJ577jkuvPBCfvvb37LffvsBsP3223PDDTcwd+5cDjzwwMH9nHPOOYO//8d//AdvfvObWbp0KRtssEG1Ts0KWeBJkiRJWqnOzs7B4m5AX18fXV1dY1LgDai8V9wtt9xCf38/xx9/PMuWL
Rts32uvvYbE3X777cybN4+NN954uX0+8MADyxV4hxxyCC0tLWy//fYcfvjhHH744bz97W+nsbGRe+65h6VLl3LIIYcM2eb5558fLAIH7LrrroM/T5kyBYDHHnuMqVOnrsKzXn0WeJIkSZJWqq2tjYaGhiFFXqFQoLW1dUyO19raSkRw7733DmnffvvtAdhwww2HtFcO4+zv7+ctb3nLYA9fuYHCq9wmm2zCHXfcwfXXX8+vf/1rPvWpT3HWWWdx6623Dj7nq6++mm222WbIduuvv/6Q38sndxkoTisL47HkNXiSJEmSVqpYLNLe3k6hUACy4m7u3Llj1nu32Wabccghh/D1r3+d5557bpW3nzFjBn/961/ZbrvtaG1tHbKMdE3fhAkTeOMb38j555/Pn//8Z7q7u/ntb3/LTjvtxPrrr8+iRYuW29e22267pk+1quzBkyRJkjQqs2bN4rDDDqOrq4vW1tYxHZoJ2ayW+++/P3vttRdnnXUWu+66Kw0NDdx6663ce++97LnnniNue8opp/Dtb3+bd73rXXz0ox9l8uTJdHV18aMf/Yhvf/vbg4XqgKuuuooHH3yQ17/+9Wy66aZcc8019Pf3M336dDbZZBNOP/10Tj31VPr7+3nd615Hb28vN954IxtvvDHvfve7x/Q8rAoLPEmSJEmjViwWx7ywG/DKV76SO++8k3PPPZczzzyTUqnE+uuvz0477cTpp5/OySefPOK2W2+9NX/4wx8444wzOOyww1i2bBktLS0cfvjhNDQsP5Bx0qRJXHHFFZx11lksXbqUtrY2fvjDH7LzzjsDcPbZZ7PFFlvw+c9/ngcffJBJkyYxY8YMPv7xj4/Z818dkVKqdQ7KRUQT0NPT00NTU1Ot05EkSVIdWLp0KQsXLmTatGlrbSZHjc6KXpve3l6am5sBmlNKvaPdp9fgSZIkSVKdsMCTJEmSpDphgSdJkiRJdcICT5IkSZLqhAWeJEmSJNUJCzxJkiRJqhMWeBpRqVRi3rx5lEqlWqciSZIkaRQs8DSsjo4OWlpamDlzJi0tLXR0dNQ6JUmSJEkrYYGn5ZRKJWbPnk1/fz8A/f39zJkzx548SZIkaZyzwNNyOjs7B4u7AX19fXR1ddUoI0mSJGl4EcGVV1454vru7m4igrvuumvEmOuvv56I4Omnnx6LFNeqCbVOQONPW1sbDQ0NQ4q8QqFAa2trDbOSJElSrSzqWcQTS54Ycf3kxslMbZ46Jsd+5JFHOOecc7j66qt56KGH2GKLLdh999358Ic/zBve8AYWL17MpptuOibHfjmywNNyisUi7e3tzJkzh76+PgqFAnPnzqVYLNY6NUmSJK1li3oWseM3dmTJC0tGjGmc2MiCUxZUvcjr7u5m//33Z9KkSZx//vnsuuuuvPDCC/zqV7/ilFNO4d5772Wrrbaq6jFX1/PPP896661X6zQcoqnhzZo1i+7ububNm0d3dzezZs2qdUqSJEmqgSeWPMGSF5bwvbd/j9tn377c8r23f48lLyxZYQ/f6jr55JOJCG655RaOPvpodthhB3beeWc+8pGP8Mc//hFYfojmLbfcwh577MEGG2zAXnvtxZ133rncfq+55hp22GEHNtxwQw4++GC6u7uXi/npT3/KzjvvzPrrr892223HBRdcMGT9dtttx+c+9zlOPPFEmpubed/73lfdJ7+a7MHTiIrFor12kiRJAmDHzXdkxpQZa+14Tz75JL/85S8555xz2GijjZZbP2nSpOXannvuOY444ghmzpzJ9773PRYuXMiHPvShITF///vfecc73sH73/9+TjrpJG677TZOO+20ITG33347xxxzDGeddRbHHnssN954IyeffDKbbbYZJ5544mDcF77wBT75yU/yiU98ojpPugos8CRJkiSNO11dXaSUeNWrXjXqbb7//e/T19fHxRdfTGNjIzvvvDOlUomTTjppMOab3/wm22+/PV/60peICKZPn87dd9/NeeedNxhz4YUX8oY3vIFPfvKTAOywww7cc889fOELXxhS4M2cOZPTTz99zZ9sFTlEU5IkSdK4k1ICsiGYo7VgwQJ22203GhsbB9v222+/5WL23XffIfsdLmb//fcf0rb//vvT2dlJX1/fYNtee+016tzWFgs8SZIkSeNOW1sbEcGCBQtGvc1AUViNmMrCcrjthhs6WmsWeJIkSZLGnVe84hUcdthhfOMb3+C5555bbv1w96zbaaed+NOf/sQ///nPwbaByVjKYyrbhou54YYbhrTdeOON7LDDDhQKhVV+LmuTBZ4kSZKklVrw+ALuWHzHcsuCx0ffw7aqLrroIvr6+th777356U9/SmdnJwsWLOCrX/3qcsMqAY477jgaGhqYNWsW99xzD9dccw1f/OIXh8S8//3v54EHHuAjH/kI9913Hz/4wQ+49NJLh8ScdtppXHfddZx99tncf//9fOc73+HrX//6uLvebjhOsiJJkiRpRJMbJ9M4sZETfnbCiDGNExuZ3Di56seeNm0ad9xxB+eccw6nnXYaixcvZvPNN2fPPffkm9/85nLxG2+8Mb/4xS94//vfzx577MFOO+3Eeeedx1FHHTUYM3XqVH76059y6qmnctFFF7H33ntz7rnn8t73vncwZsaMGfz4xz/mU5/6FGeffTZTpkzhs5/97JAJVsarGM0YVK0dEdEE9PT09NDU1FTrdCRJklQHli5dysKFC5k2bRobbLDBau1jUc+iFd7nbnLj5Krf5HxdsKLXpre3l+bmZoDmlFLvaPdpD54kSZKkFZraPNUC7mXCa/AkSZIkqU5Y4EmSJElSnbDAkyRJkqQ6YYEnSZIkSXXCAk+SJEmS6oQFniRJkiTVCQs8SZIkSaoTFniSJEmSVCcs8CRJkiSpTljgSZIkSRqdvj64/nr44Q+zx76+MT3ciSeeyJFHHjmk7cYbb6RQKHD44YcvF3/99dcTETz99NPLrdt9990566yzxirVccMCT5IkSdLKXXEFtLbCwQfDccdlj62tWftadPHFF/OBD3yAG264gUWLFq3VY78cWOBJkiRJWrErroCjj4ZddoGbboJnnsked9kla19LRd5zzz3Hj3/8Y0466SSOOOIILr300rVy3JcTCzyNWqlUYt68eZRKpVqnIkmSpLWlrw9OOw2OOAKuvBL23Rc23jh7vPLKrP3008d8uCbA5ZdfzvTp05k+fTonnHACl1xyCSmlMT/uy4kFnkalo6ODlpYWZs6cSUtLCx0dHbVOSZIkSWvD/PnQ3Q0f/zg0VJQPDQ1w5pmwcGEWN8Y6Ojo44YQTADj88MN59tlnue6668b8uC8nFnhaqVKpxOzZs+nv7wegv7+fOXPm2JMnSZK0Lli8OHt89auHXz/QPhA3Ru677z5uueUW3vnOdwIwYcIEjj32WC6++OIxPe7LzYRaJ6Dxr7Ozc7C4G9DX10dXVxfFYrFGWUmSJGmtmDIle/zLX7JhmZX+8pehcWOko6ODF198kW222WawLaXExIkTeeqpp9h0001pamoCoKenh0mTJg3Z/umn
n6a5uXlMcxwP7MHTSrW1tdFQ0R1fKBRobW2tUUaSJElaaw44ALbbDs49Fyr+6U9/P3z+8zBtWhY3Rl588UW++93vcsEFF3DXXXcNLn/6059oaWnh+9//PvDS99Zbb711yPaLFy/moYceYvr06WOW43hhgaeVKhaLtLe3UygUgKy4mzt3rr13kiRJ64JCAS64AK66Co48cugsmkcembV/8YtZ3Bi56qqreOqpp5g1axavfvWrhyxHH3304PwQm2yyCXPmzOG0007jyiuvZOHChfzhD3/gXe96FzvuuCOHHnromOU4XoSzzowfEdEE9PT09Ax2L48npVKJrq4uWltbLe4kSZJeJpYuXcrChQuZNm0aG2ywwerv6Iorstk0u7tfaps2LSvu3vGONc5zOCeeeCJPP/00fX199Pf3c/XVVy8Xc8cdd7Dnnnty++23M2PGDJYtW8Zqt5CKAAAgAElEQVT555/PD3/4Q7q7u9liiy04+OCD+fznP89WW201JnmurhW9Nr29vQNDSptTSr2j3acF3jgy3gs8SZIkvfxUrcCD7FYI8+dnE6pMmZINyxzDnrt6NxYFnpOsSJIkSRqdQgEOOqjWWWgFvAZPkiRJkuqEBZ4kSZIk1QkLPEmSJEmqExZ4kiRJ0jrAyRXHn7F4TSzwJEmSpDo2ceJEAJYsWVLjTFTp+eefBxi833Q1OIumJEmSVMcKhQKTJk3iscceA6CxsZGIqHFW6u/v5/HHH6exsZEJE6pXllngSZIkSXVu4AbfA0WexoeGhgamTp1a1YLbAk+SJEmqcxHBlClT2GKLLXjhhRdqnY5y6623Hg0N1b1qzgJPkiRJWkcUCoWqXu+l8cdJViRJkiSpTljgSZIkSVKdsMCTJEmSpDphgSdJkiRJdcICT5IkSZLqhAWeJEmSJNUJCzxJkiRJqhMWeJIkSZJUJyzwtNpKpRLz5s2jVCrVOhVJkiRJWOBpNXV0dNDS0sLMmTNpaWmho6Oj1ilJkiRJ67xIKdU6B+Uiogno6enpoampqdbpjKhUKtHS0kJ/f/9gW6FQoLu7m2KxWMPMJEmSpPrQ29tLc3MzQHNKqXe029mDp1XW2dk5pLgD6Ovro6urq0YZSZIkSQILPK2GtrY2GhqGvnUKhQKtra01ykiSJEkSWOBpNRSLRdrb2ykUCkBW3M2dO9fhmZIkSVKNeQ3eOPJyuQZvQKlUoquri9bWVos7SZIkqYpW9xq8CWOXkupdsVi0sJMkSZLGEYdoSpIkSVKdsMCTJEmSpDphgSdJkiRJdcICT5IkSZLqhAWeJEmSJNUJCzxJkiRJqhMWeJIkSZJUJyzwJEmSJKlOWOBJkiRJUp0YlwVeRJwcEQsjYmlE3B4RB6wk/qiIuCciluWPb69YHxFxVkQ8HBH/jIjrI2LniphNI+KyiOjJl8siYlLZ+ukRMS8iHs3zejAiPhcRE8ti3hERt0XE0xHxXETcFRH/Wq3zIkmSJEkrMu4KvIg4FvgycA6wBzAf+L+ImDpC/H7A5cBlwG75448jYp+ysI8BHwH+H/Aa4BHgNxGxSVnMD4DdgcPzZfd8XwNeAL4LHApMBz4MvA/4TFnMk3ne+wG7ApcAl0TEYat0EiRJkiRpNURKqdY5DBERNwN3pJROKmtbAFyZUjpzmPjLgaaU0r+Utf0SeCql9K6ICOBh4MsppfPy9esDjwJnpJTmRsSOwD3Avimlm/OYfYGbgFellO4bIdcLgdeklEbsYYyIO4CrU0qfHMVzbwJ6enp6aGpqWlm4JEmSpDrV29tLc3MzQHNKqXe0242rHryIWA/YE/h1xapfA68dYbP9hon/VVn8NGCr8piU0jLgd2Ux+wE9A8VdHvNHoGek40ZEK1lP3+9GWB8R8Qay3r7fjxCzfkQ0DSzAJsPFSZIkSdJojKsCD5gMFMh618o9SlakDWerlcRvVda2opjHhtn3Y5XHjYgbI2Ip0Ek2fPRTFeubI+JZ4HngauADKaXfjJD7mWRF5MBSGiFOkiRJklZqvBV4AyrHjcYwbasav7KY4fY/3H6OBWYAxwFvBk6vWP8M2fV7rwH+E7gwIg4aIe/PA81lS3GEOEmSJElaqQm1TqDCE0Afy/fWbcHyPXADHllJ/CP541bA4hXEbDnMvjevPG5K6e/5j/dERAFoj4gLUkp9+fp+oCuPuSu/vu9M4PrKnedDRZcN/J5dLihJkiRJq2dc9eCllJ4HbgcOqVh1CHDjCJvdNEz8oWXxC8kKuMGY/Fq/A8tibgKaI2Lvsph9yHrVRjouZD18E/PHFcWsv4L1kiRJklQV460HD+BC4LKIuI2s8JoNTAW+BRAR3wUeKptR8yvA7yPiDOB/gbcBbwReB5BSShHxZeDjEdFJdu3cx4ElZLdGIKW0IJ9589sRMSffbztw1cAMmhFxPNmtEu4m63Xbk2yI5eUppRfzmDOB24AHgPWANwH/BgzOCFrPSqUSnZ2dtLW1USw62lSSJEla28ZdgZdSujwiNiObvGQK8BfgTSmlv+UhU4H+svgbI+KdwOeAs8mKq2PLZ8QEzgc2BC4CNgVuBg5NKT1TFnM88FVemm3z52T3zRvwInAGsANZr9zfgG8AXyqL2Sg/RhH4J3AvcEJK6fJVPxMvLx0dHcyePZv+/n4aGhpob29n1qxZtU5LkiRJWqeMu/vgrctervfBK5VKtLS00N8/WHdTKBTo7u62J0+SJElaDXVxHzy9PHV2dg4p7gD6+vro6uoaYQtJkiRJY8ECT2usra2Nhoahb6VCoUBra2uNMpIkSZLWTRZ4WmPFYpH29nYKhQKQFXdz5851eKYkSZK0lnkN3jjycr0Gb0CpVKKrq4vW1laLO0mSJGkNrO41eONuFk29fBWLRQs7SZIkqYYcoilJkiRJdcICT5IkSZLqhAWeJEmSJNUJCzxJkiRJqhMWeJIkSZJUJyzwJEmSJKlOWOBJkiRJUp2wwJMkSZKkOmGBJ0mSJEl1wgJPkiRJkuqEBZ4kSZIk1QkLPEmSJEmqExZ4kiRJklQnLPAkSZIkqU5Y4EmSJElSnbDA05gplUrMmzePUqlU61QkSZKkdYIFnsZER0cHLS0tzJw5k5aWFjo6OmqdkiRJklT3IqVU6xyUi4gmoKenp4empqZap7PaSqUSLS0t9Pf3D7YVCgW6u7spFos1zEySJEl6eejt7aW5uRmgOaXUO9rt7MFT1XV2dg4p7gD6+vro6uqqUUaSJEnSusECT1XX1tZGQ8PQt1ahUKC1tbVGGUmSJEnrBgs8VV2xWKS9vZ1CoQBkxd3cuXMdnilJkiSNMa/BG0fq5Rq8AaVSia6uLlpbWy3uJEmSpFWwutfgTRi7lLSuKxaLFnaSJEnSWuQQTUmSJEmqExZ4kiRJklQnLPAkSZIkqU5Y4EmSJElSnbDAkyRJkqQ6YYEnSZIkSXXCAk+SJEmS6oQFniRJkiTVCQs8SZIkSaoTFniSJEmSVCcs8CRJkiSpTljgSZIkSVKdsMCTJEmSpDphgSdJkiRJdcICT2tNqVRi3rx5lEqlWqciSZIk1SU
LPK0VHR0dtLS0MHPmTFpaWujo6Kh1SpIkSVLdiZRSrXNQLiKagJ6enh6amppqnU7VlEolWlpa6O/vH2wrFAp0d3dTLBZrmJkkSZI0PvX29tLc3AzQnFLqHe129uBpzHV2dg4p7gD6+vro6uqqUUaSJElSfbLA05hra2ujoWHoW61QKNDa2lqjjCRJkqT6ZIGnMVcsFmlvb6dQKABZcTd37lyHZ0qSJElV5jV440i9XoM3oFQq0dXVRWtrq8WdJEmStAKrew3ehLFLSRqqWCxa2EmSJEljyCGakiRJklQnLPAkSZIkqU5Y4EmSJElSnbDAkyRJkqQ64SQrGl5fH8yfD4sXw5QpcMABkN/mQJIkSdL4ZA+elnfFFdDaCgcfDMcdlz22tmbtkiRJksYtCzwNdcUVcPTRsMsucNNN8Mwz2eMuu2TtFnmSJEnSuOWNzseRmt/ovK+PF7ffjmenb8+Dl1wADWX1f38/27/nNDa+fyETHljocE1JkiRpDHmjc62xR//vJ2y5qMThh5S4+b9fs9z6fafCTb/J4444tgYZSpIkSVoRCzwNWrLoAQBO/ff/pq1lj+XWd3bfAR3vG4yTJEmSNL5Y4GnQC1tOBmC3Jybwqn1nLLd+o9v+PCROkiRJ0vjiJCsa9Ow+e7BwEmz11Yuhv3/oyv5+tvzaJTw4KYuTJEmSNP5Y4OklhQKnHQrN186HI48cOovmkUfSfO18Tj8UJ1iRJEmSxikLPA3xs51g4bfPh7vvhte+Fpqasse//IWF3z6fn+1U3eOVSiXmzZtHqVSq7o4lSZKkdZAFnpbz9JtmQlcXzJsHP/hB9tjZmbVXUUdHBy0tLcycOZOWlhY6Ojqqun9JkiRpXeMkK1rOgscXZD9Mb8oWgMf+9FJ7FZRKJWbPnk1/fq1ff38/c+bM4bDDDqNYLFbtOJIkSdK6xAJPgyY3TqZxYiMn/OyEEWMaJzYyuXHNZ9Hs7OwcLO4G9PX10dXVZYEnSZIkrSYLPA2a2jyVBacs4IklT4wYM7lxMlObp67xsdra2mhoaBhS5BUKBVpbW9d435IkSdK6ygJPQ0xtnlqVAm5lisUi7e3tzJkzh76+PgqFAnPnzrX3TpIkSVoDkVKqdQ7KRUQT0NPT00NTU1Ot01krSqUSXV1dtLa2WtxJkiRJud7eXpqbmwGaU0q9o93OHjzVVLFYtLCTJEmSqsTbJEiSJElSnbDAkyRJkqQ6YYEnSZIkSXXCAk+SJEmS6oQFniRJkiTVCQs8SZIkSaoTFniSJEmSVCcs8CRJkiSpTljgSZIkSVKdsMCTJEmSpDphgSdJkiRJdcICT+NKqVRi3rx5lEqlWqciSZIkvexY4Gnc6OjooKWlhZkzZ9LS0kJHR0etU5IkSZJeViKlVOsclIuIJqCnp6eHpqamWqezVpVKJVpaWujv7x9sKxQKdHd3UywWa5iZJEmStPb19vbS3NwM0JxS6h3tdvbgaVzo7OwcUtwB9PX10dXVVaOMJEmSpJcfCzyNC21tbTQ0DH07FgoFWltba5SRJEmS9PJjgadxoVgs0t7eTqFQALLibu7cuQ7PlCRJklaB1+CNI+vyNXgDSqUSXV1dtLa2WtxJkiRpnbW61+BNGLuUVFf6+mD+fFi8GKZMgQMOgLy3rZqKxaKFnSRJkrSaHKKplbviCmhthYMPhuOOyx5bW7N2SZIkSeOGBZ5W6PHL5pKOPpqn26Zy7y8u5a7O+dz7i0t5um0q6eijefyyubVOUZIkSVLOa/DGkfF2Dd6iJxfS3/pK/rx54sh3Qir7d0D0w5U/gl0eDwpdDzD1FdNql6gkSZJUZ9bKNXgR8dZVTQz4TUrpn6uxnWps6bzfsMNTiXu/eha3veEty61/vPXnTPu3z3D/vN/AUbNrkKEkSZKkcqs6ycqVqxifgDbgwVXcTuPAxEefAGCrfd/A7lNmLLf+rn2fAz4zGCdJkiSptlbnGrytUkoNo1mAJdVOWGvPC1tOBmCDex8Ydv2GeftAnCRJkqTaWtUC7zvAqgy3/B4w6vGiAyLi5IhYGBFLI+L2iDhgJfFHRcQ9EbEsf3x7xfqIiLMi4uGI+GdEXB8RO1fEbBoRl0VET75cFhGTytZPj4h5EfFonteDEfG5iJhYFvO+iJgfEU/ly7URsfeqPv/x4tl99mDhJNjqqxdDf//Qlf39bPm1S3hwUhYnSZIkqfZWqcBLKb0npfTMKsSflFJapfF7EXEs8GXgHGAPYD7wfxExdYT4/YDLgcuA3fLHH0fEPmVhHwM+Avw/4DXAI8BvImKTspgfALsDh+fL7vm+BrwAfBc4FJgOfBh4H/CZspiDgB8CBwP7AYuAX0fENqtyDsaNQoHTDoXma+fDkUfCTTfBM89kj0ceSfO18zn9UMbkfniSJEmSVt24m0UzIm4G7kgpnVTWtgC4MqV05jDxlwNNKaV/KWv7JfBUSuldERHAw8CXU0rn5evXBx4FzkgpzY2IHYF7gH1TSjfnMfsCNwGvSindN0KuFwKvSSkN28MYEQXgKeD/pZS+O4rnPq5m0bxj8R3s2b4nDxS/wPaf+wZ0d7+0cto0HvzPk3ll6aPcPvt2ZgxzjZ4kSZKk1bNWZtEcjYjYFvhMSum9q7HtesCewH9VrPo18NoRNtsP+FJF26/IetgApgFb5fsAIKW0LCJ+l+9zbr6PnoHiLo/5Y0T05DHLFXgR0UrW07eiu303AhOBJ4dbmRea65c1bTJcXK3d9JopPD3/x2x8851MfPQJXthyMs/uswcLnrwfSrXOTpIkSdKAqhd4wCuAdwOrXOABk4ECWe9auUfJirThbLWS+K3K2ipjWspiHhtm349VHjcibgRmkBVm7cCnRsgLskL1IeDaEdafCXx6BdvX1OTGyTRObOSEn50wdMXjwF+yHxsnNjK50UlWJEmSpPFglQu8UdwLb/vVzKVc5bjRGKZtVeNXFjPc/ofbz7FkPW27AV8ATgfOX27DiI8B7wIOSiktHSHvzwMXlv2+CeOoT2xq81QWnLKAJ5aMfBnl5MbJTG0e9vLIqiiVSnR2dtLW1kaxWByz40iSJEn1YHV68K4kK3piBTGre2HfE0Afy/fWbcHyPXADHllJ/CP541bA4hXEbDnMvjevPG5K6e/5j/fk19i1R8QFKaW+gZiIOB34OPDGlNKfR8iblNIyYFnZdiOF1szU5qljWsCtSEdHB7Nnz6a/v5+Ghgba29uZNWtWTXKRJEmSXg5W5z54i4GjVnDvu9WebSOl9DxwO3BIxapDgBtH2OymYeIPLYtfSFbADcbk1/odWBZzE9BcfkuDfBbO5hUcF7IidyJlxW5EfBT4JHB4Sum2FWyrFSiVSoPFHUB/fz9z5syhVBo3HZySJEnSuLM6PXi3kxVxV46wfmW9eytzIXBZRNxGVnjNBqYC3wKIiO8CD5XNqPkV4PcRcQbwv8DbgDcCrwNIKaWI+DLw8YjoBDrJeteWkN0agZTSgnzmzW9HxJx8v+3AVQMzaEbE8WS3SribrNdtT7IhlpenlF7MYz4GnA0cB3RHxEDP4r
MppWfX4Jysczo7OweLuwF9fX10dXU5VFOSJEkaweoUeF8ANlrB+i6y+8CtlpTS5RGxGdnkJVPIpvN4U0rpb3nIVKC/LP7GiHgn8Dmy4uoB4NjyGTHJrpHbELgI2BS4GTi04p5+xwNf5aXZNn9Odt+8AS8CZwA7kBWwfwO+wdAZPE8G1gN+UvG0PgOcNbozIIC2tjYaGhqGFHmFQoHW1tYaZiVJkiSNb+PuPnjrsvF2H7xa6+joYM6cOfT19VEoFJg7d67X4EmSJGmdsLr3wbPAG0cs8JZXKpXo6uqitbXVoZmSJElaZ4ybG51L1VQsFi3sJEmSpFFanVk0JUmSJEnjUFUKvIiYkd96QJIkSZJUI9XqwbsV2K5K+5IkSZIkrYZqFXhrct87SZIkSVIVeA2eJEmSJNUJCzxJkiRJqhMWeJIkSZJUJyzwJEmSJKlOWOBJkiRJUp2wwNPLSqlUYt68eZRKpVqnIkmSJI071SrwPgM8UaV9ScPq6OigpaWFmTNn0tLSQkdHR61TkiRJksaVSCnVOgflIqIJ6Onp6aGpqanW6YwrpVKJlpYW+vv7B9sKhQLd3d0Ui8UaZiZJkiRVX29vL83NzQDNKaXe0W43YexSUl3r64P582HxYpgyBQ44AAqFMTtcZ2fnkOIuS6GPrq4uCzxJkiQpZ4GnVbKoZxEv/uTHbPOZL7H+3x8ebF+27dY89OlTmXD0MUxtnlr147a1tdHQ0LBcD15ra2vVjyVJkiS9XFVtkpWImFitfWl8WtSziDNObmO7f/8ov9rwYfadBRufCfvOgl9t+DDb/ftHOePkNhb1LKr6sYvFIu3t7RTyXsJCocDcuXPtvZMkSZLKVOUavIhoAO5IKe2+5imtu8b7NXh3lG5l0132ZuJue/DY99uhoez/A/39bHH8bJ7/0508ffctzCi+ZkxyKJVKdHV10draanEnSZKkulXTa/BSSv0RcUtE7JxS+ms19qnxZ+Ob72Ta03Dv6R9ixjZ7Lbf+vtM+yPS3vof7b74TxqjAKxaLFnaSJEnSCKp5Dd7ewJ0RcT+wBAggpZT2ruIxVEMTH83uhLH0Va8cdv0/8/aBOEmSJElrVzULvLdVcV8ah17YcjIAG9z7ALS+brn1G977wJA4SZIkSWtX1Qq8lNLfqrUvjU/P7rMHCyfBVl+9GN70r8tdg7fl1y7hwUlZnCRJkqS1b41n0YyIy/LHW/Pr8AaWWyPiljVPUeNGocBph0LztfPhyCPhppvgmWeyxyOPpPna+Zx+KGN6PzxJkiRJI6tGD97H8sejq7AvjXM/2wnmf/GD7PPl/2H91752sH3Z1G24+Ysf5GfPfIVP1DA/SZIkaV22xgVeSmlx/vg3gIjYElh/Tfer8Wdy42QaJzZy4DNfoeE9cMDfYMqzsHhjmN/yEP3PfIXGiY1MbvQaPEmSJKkWqnYNXkQcCXwe2Bb4O7AD8GfAC7LqxNTmqSw4ZQFPLBl5lszJjZOZ2jx1LWYlSZIkaUA1Z9H8LLAP8PuU0u4RsTdwUhX3r3FgavNUCzhJkiRpnFrjSVbKLBu4w3pErJdSugXYrYr7l5ZTKpWYN28epVKp1qlIkiRJNVfNAm9xREwCfgFcExGXA49Xcf/SEB0dHbS0tDBz5kxaWlro6OiodUqSJElSTUVKqfo7jTgIaAJ+lVJaVvUD1KmIaAJ6enp6aGpqqnU641qpVKKlpYX+/v7BtkKhQHd3N8VisYaZSZIkSWuut7eX5uZmgOaBkZKjUc1r8AallK4fi/1KAzo7O4cUdwB9fX10dXVZ4EmSJGmdVZUCLyImANOBVw8sKaW3V2Pf0nDa2tpoaGhYrgevtbW1hllJkiRJtbXK1+BFxPYR8baI+M+I+GFE3A08R3ZLhEuBNwOj7kKUVkexWKS9vZ1CoQBkxd3cuXPtvZMkSdI6bZWuwYuI7wHvAhKwBNgIuBq4DLgb6Ewp9Y1BnusEr8FbdaVSia6uLlpbWy3uJEmSVDfW1jV4RwMfAC7Otz0HmAPcC1xlcae1rVgsWthJkiRJuVUdovkF4LsppaUppWdTSh8C9gcOBu6JiMOrnqEkSZIkaVRWqcBLKX0ypfRsRdvtwN7Al4HLI+IHEbF5FXOUJEmSJI1CVW50njJfAXYC1icbsilJkiRJWotWZxbNcyNi7+HWpZQeSikdBfzbGmcmSZIkSVolq9ODNwW4KiIWR0R7RLw5ItYvD0gpXV2d9CRJkiRJo7XKBV5K6T3AlsAxwNPABcATEXFFRJwYEZOrnKMkSZIkaRRW6xq8/Jq7+Smlj6WUXkU2ycofgfcBD0XE7yPi9IjYpprJSpIkSZJGtqr3wRtWSmkBsAA4PyK2AN4CvDVf/cVqHEOSJEmStGJVKfDKpZQeAzryReuKvj6YPx8WL4YpU+CAA6BQWOtplEolOjs7aWtr8wbokiRJWudUrcCLiFcCHwBagMFv9imlt464kV72FvUs4sWf/JhtPvMl1v/7w4Pty7bdmoc+fSoTjj6Gqc1T10ouHR0dzJ49m/7+fhoaGmhvb2fWrFlr5diSJEnSeBAppersKOJu4OvAn4H+gfaU0s1VOcA6ICKagJ6enh6amppqnc5KLepZxBknt/H9HzzPVTvAuQfAX7aAVz8GH58PR9wPxx+3Hudd1DnmRV6pVKKlpYX+/sG3HoVCge7ubnvyJEmS9LLT29tLc3MzQHNKqXe021VziOZzKaW5VdyfxrknnnmUc695nocP3IPi99u5qKFszp7+fh4+fjbnXHMnTzzz6JgXeJ2dnUOKO4C+vj66uros8CRJkrTOqGaB9/mIOA+4Flg20JhS+n0Vj6FxZOOb72Ta03Dv6R9ixjZ7Lbf+vtM+yPS3vof7b74Tiq8Z01za2tpoaGhYrgevtbV1TI8rSZIkjSfVLPAOAw4CWnlpiGYCLPDq1MRHnwBg6ateOez6f+btA3FjqVgs0t7ezpw5c+jr66NQKDB37lx77yRJkrROqWaBdyCwc6rWRX0a917YMrun/Qb3PgCtr1tu/Yb3PjAkbqzNmjWLww47jK6uLlpbWy3uJEmStM5ZrRudj+AWYPiuHNWlZ/fZg4WTYKuvXgwV17/R38+WX7uEBydlcWtLsVjkoIMOsriTJEn/P3v3Hx31dd/5//meqWyQCRLNJML2LFOMFMCJ7WAvBqdAkEwQ9X7rxoe4btP4uwElKI3XbGto3LhO0jZpc5KCY3DcLUrG5ht328aUJin9YVi2qlFrKuMYk2SNsATI6sfBcmbjkU0HHHnmfv/4zMBoEEKaGWlGmtfjnDkafe7VnTcc7MObe+/7LVKRipngLQJ+ZGZHzOxZMztkZs8WcX0pN8Egm1ZDzf4O+PCH4eBBePNN/+uHP0zN/g42r6Yk/fBERERERCpRMY9o/koR15JJ4jvXQseWjSx5eBeXf+AD556/NedqOrds5DtvbuPBEsYnIiIiIlJJin0HbzjfKuJnSBkJVYeorqrmg29uI7AOlr8MV56GUzOgI/IKqTe3UV1VTah6Yu7giYiIiIhUumImeNdlvb8c+BB+0
3MleFPUnJo5HL3nKLHExatkhqpD494DT0REREREfEVL8Jxzv5P9vZnNAP6qWOtLeZpTM0cJnIiIiIhImShmkZVcDnjPOK4vIiIiIiIiWYq2g2dmh/CTOoAgcCXwJ8VaX0REREREREZWzDt4H8l6/zbwmnNusIjri+TF8zy6u7tpaGhQfzwRERERmdLGtYqmmeGcU5EVKZloNMqGDRtIpVIEAgHa2tpoaWkpdVgiIiIiIuPCnHOXnjWahcyyj2Oeq6LpnLurKB9QAcxsJjAwMDDAzJkzSx3OpOd5HpFIhFQqde5ZMBikt7dXO3kiIiIiUtbeeOMNampqAGqcc2+M9udURVOmrO7u7iHJHUAymaSnp0cJnoiIiIhMSaqiKVNWQ0MDgcDQP+LBYJD6+voSRSQiIiIiMr6KluCZ2SEzezb9+j7QDfxpsdYXGatwOExbWxvBYBDwk7sdO3Zo905EREREpqxi3sGLZH2rKpp50B288eF5Hj09PdTX1yu5ExEREZFJoRzu4L1crLVEiikcDiuxExEREZGKUMwjmn9sZrVZ388ysz8q1voiIiIiIiIysmIWWfkl51w8841z7nXgl4q4voiIiIiIiIygmAleMN0aATh3n6yqiOuLiIiIiIjICIp2Bw94BPhXM/s2YMCvAl8r4voiIiIiIiIygmIWWfmGmXUCH0w/+qhz7v8Ua32ZHPoG+oglYpBMMqPzMFX9MQbrQvQn51AAACAASURBVJxesgiCQULVIebUzCl1mCIiIiIiU1Ixd/Bwzv3AzPqBywHMbI5zrq+YnyHlq2+gj4WPLqT5SIKt+2Bu/PzYyVrYtBr23lDN0XuOKskTERERERkHxayieYeZHQWOA3uBk8D3irW+lL9YIkbzkQS7dxmzFq+ga89OXujuoGvPTmYtXsHuXUbzkYS/w1dCnufR3t6O53kljUNEREREpNiKuYP3B8AS4IBz7v1mdjPwm0VcX8pdMsnWfTCwajm1T7VTG0j/+0H9MrjtbuJrGtmy7wDxZLJkIUajUTZs2EAqlSIQCNDW1kZLS0vJ4hERERERKaZiVtF8K9Nh3cwuc849C9xQxPWlzM3oPMzcOLy6cT0Ecv5oBQL037uOa+L+vFLwPO9ccgeQSqVobW3VTp6IiIiITBnFTPBOpRud7wH+IV1N8ydFXF/KXFW/f/Ty7IJ5w46fST/PzJto3d3d55K7jGQySU9PT0niEREREREptmIe0fwz4LRz7nNmthKYiX8XTyrEYF0IgGldx/1jmTmmdx0fMm+iNTQ0EAgEhiR5wWCQ+vr6ksQjIiIiIlJsxdzB+whwzMx2AtXAPzjn3iri+lLmTi9ZxMlamL39McjZKSOVou6RxzlR688rhXA4TFtbG8FgEPCTux07dhAOh0sSj4iIiIhIsRUtwXPOrQfmA38FrAW6zOzxYq0vk0AwyKbVULO/g/iaRo6lq2ge27OT+JpGavZ3sHm1P69UWlpa6O3tpb29nd7eXhVYEREREZEppdh98N42s2eAdwFXASuLub6Ut1B1iL03VLOWBFv3HWD+7QfOjZ2ohfV3+n3wHq4uzRHNjHA4rF07EREREZmSzDlXnIXMPg7cCcwD/hb463QlTRklM5sJDAwMDDBz5sxSh5OXvoE+v89dMsmMzsNU9ccYrAv5xzKDQULVITU5FxERERG5hDfeeIOamhqAmky3gtEo5g7eQuALzrnnirimTDJzauacT+DCi0sbjIiIiIhIhSlaguecu79Ya4mIiIiIiMjYFSXBM7Ofwy+w8r7Myzl3RzHWFhERERERkdEZc4JnZtcA15GVzAHvAaqAt4CjwA+LGKOIiIiIiIiMwpgSPDP7c+DXAQckgCuAvwf+ED+p63bOJYsdpMh48jyP7u5uGhoaVF1TRERERCa1sfbB+whwLzADvw3C14HVwGLgZSV3MtlEo1EikQhNTU1EIhGi0WipQxIRERERyduY2iSY2ReBrzjnTmc9uwn4MyAE/KZz7qmiR1khpkKbhMnE8zwikQipVOrcs2AwSG9vr3byRERERKSk8m2TMKYdPOfc57KTu/Sz7wM3Aw8D3zazvzCzd41lXZFS6O7uHpLcASSTSXp6ekoUkYiIiIhIYcZ6RHNYzrcNuBa4HOgqxroi46mhoYFAYOh/AsFgkPr6+hJFJCIiIiJSmKIkeBnOuVecc2uB/7eY64qMh3A4TFtbG8FgEPCTux07duh4poiIiIhMWmO9g3c98CPnXOqSk/357wWOOefezjO+iqI7eKXheR49PT3U19cruRMRERGRspDvHbyxJnhJYLZz7iejnP8G8H7n3IlRf0gFU4InIiIiIiKQf4I31kbnBnzRzBKjnH/ZGNcXERERERGRPI01wTsAzB/D/IPAmTF+hkxFySR0dMCpU3DllbB8OaTvvomIiIiISHGMKcFzzq0cpzhkiuob6OPtv36Sq//ga1z+7z8+9/yt/3QVr3zht/m5j/wqc2rmlDBCEREREZGpo6hVNIvBzD5tZifN7KyZfd/Mll9i/loze9HM3kp/vSNn3Mzs983sx2Z2xsz+OV38JXvOLDN7wswG0q8nzKw2a3y+mbWbWX86rhNm9iUzq8qa814z221mvWbmzOy3ivV7Mln1DfRx/6cb+IVP/A57p/+YpS0w47OwtAX2Tv8xv/CJ3+H+TzfQN9BX6lBFRERERKaEsR7RHFdmdhd+w/RPA/8KtAL/aGbXOucuyALM7Bbg28DngO8AdwBPmtky51xnetpngPuAjwMvAQ8C/8vM5jvn3kzP+QsgDKxJf98GPAH8cvr7QeBbwPNAHLgB+AZ+gvxAek41cALYBXytoN+IKSL2Zj9//A8/48cfXET4f7bxp9k951IpfvwbG/ijfzhM7M3+strF8zyP7u5uGhoaVFVTRERERCaVMVXRHG9m1gk875z7zaxnR4HvOuc+O8z8bwMznXO/lPXsKeB159yvm5kBPwYeds59JT1+OdAP3O+c22FmC4EXgaWZpNDMluLfH1zgnDt2kVgfAhY75y7YYTSz3vRnPjzGX/+UqqL50u423vORVrr27GTB//NfLxg/tmcn829fx0t/vYP3rN1QgggvFI1G2bBhA6lUikAgQFtbGy0tLaUOS0REREQqTL5VNMvmiKaZXQbcBOzLGdoHfOAiP3bLMPP3Zs2fC8zOnuOcewt4OmvOLcBA1o4fzrl/AwYu9rlmVo+/2/f0iL+oClfVHwPg7IJ5w46fST/PzCs1z/POJXcAqVSK1tZWPM8rcWQiIiIiIqNTNgkeEAKC+Ltr2frxk7ThzL7E/NlZz0aa89owa7+W+7lm9oyZnQW6gQ7g8xeJa1TM7HIzm5l5Ae8oZL1yM1gXAmBa1/Fhx6enn2fmlVp3d/e55C4jmUzS09NToohERERERMamoDt4ZnYrcCvwbnKSRefc+jyXzT0zasM8G+v8S80Zbv3h1rkLPwm7AfgTYDPw1RFiu5TPAl8o4OfL2uklizhZC7O3Pwa33Q05d/DqHnmcE7X+vHLQ0NBAIBAYkuQFg0Hq6+tLGJWIiIiI
yOjlvYNnZl/AP/p4K/7u26yc11jFgCQX7ta9mwt34DJevcT8V9NfLzWnbpi135X7uc65f3fOveic+0vgd4HfN7NCmrl9GajJek2tih7BIJtWQ83+Dvjwh+HgQXjzTf/rhz9Mzf4ONq+mbPrhhcNh2traCKbjCQaD7NixQ4VWRERERGTSKGQH71PAx51zTxQjEOfcz8zs+8CH8CtiZnwI+N5Ffuxgejy7auVq4Jn0+5P4CdyHgMNw7q7fB4H7s9aoMbObnXPPpucswU+4nuHiDKhKf81L+j7gW+cWtLyXKlvfuRY6tmxkycO7uPwD5680vjXnajq3bOQ7b27jwRLGl6ulpYXm5mZ6enqor69XciciIiIik0ohCd5ljJwA5eMh4Akzew4/8doAzAH+DMDMvgW8klVRcxtwwMzux08CfwVYBSwDcM45M3sYeMDMuvHvzj0AJPBbI+CcO5quvPkNM2tNr9sG/F2mgqaZ/QZ+q4Qf4idkN+Hvvn3bOfd2es5lwLVZvzdXm9n7gdPOuYq8xBWqDlFdVc0H39xGYB0sfxmuPA2nZkBH5BVSb26juqqaUHV53MHLCIfDSuxEREREZFIqJMH7JvBR4ItFigXn3LfN7J34xUuuBH4E3Oacezk9ZQ6Qypr/jJn9GvCldBzHgbuyK2Li35GbDvwp/tHRTmB1Vg88gN8AtnO+2ubfAv8ta/xt/B2/9+Dv2L0MPMrQncOrSO8Spm1Ov54GVo76N2EKmVMzh6P3HCWWuHiVzFB1qKx64ImIiIiITGZj6oOX7v2WEQD+K/CD9Gswe65z7r5iBFhJplofPBERERERyU++ffDGuoOXW+7whfTX9+U8L5/u6SIiIiIiIhViTAmec65xvAIRERERERGRwhTSJmGOXaTso5npUpVMGZ7n0d7ejud5pQ5FRERERGREeSd4+C0I3pX7MF0k5WQB64qUjWg0SiQSoampiUgkQjQaLXVIIiIiIiIXNaYiK0N+0CwF1DnnfpLzPAK86Jy7ogjxVRQVWSkvnucRiURIpc4VbiUYDNLb26s2CiIiIiIyriaqyEp2JU0HfNHMElnDQWAJ54uviExa3d3dQ5I7gGQySU9PjxI8ERERESlL+fTBy1TSNOA64GdZYz8DjgBbCoxLpOQaGhoIBAIX7ODV19eXMCoRERERkYsbc4KXqaRpZo8DG3MahotMGeFwmLa2NlpbW0kmkwSDQXbs2KHdOxEREREpW4Xcwfsaw/e7c8BZoAf4nnPup/mHV1l0B688eZ5HT08P9fX1Su5EREREZELkewevkASvHbgR/97dMfwjmw1AEugC5uMne8uccy/m9SEVRgmeiIiIiIjABBZZyfI94KfAuswHphOUKPAvwDeAvwC+BjQX8DkyxfQN9BFLxCCZZEbnYar6YwzWhTi9ZBEEg4SqQ8ypUStFEREREZGxKmQH7xXgQ7m7c2b2XmCfc+5qM7sx/T5UeKhTXyXs4PUN9LHw0YU0H0mwdR/MjZ8fO1kLm1bD3huqOXrPUSV5IiIiIlKx8t3BK6TReQ3w7mGevwvIZCdx4LICPkOmmFgiRvORBLt3GbMWr6Brz05e6O6ga89OZi1ewe5dRvORhL/DJyIiIiIiY1LoEc3HzGwTcAj/vt3N+C0SvpueczPwUkERytSSTLJ1HwysWk7tU+3UBtL/xlC/DG67m/iaRrbsO0A8mSxtnCPwPI/u7m4aGhpUdEVEREREykohO3itwP8G/gp4GehLv//fwKfSc7qATxQSoEwtMzoPMzcOr25cD4GcP36BAP33ruOauD+vHEWjUSKRCE1NTUQiEaLRaKlDEhERERE5J+8Ezzl32jn3SeCd+M3PbwTe6Zzb4Jz7j/ScF5xzLxQnVJkKqvr9o5dnF8wbdvxM+nlmXjnxPI8NGzaca3yeSqVobW3F87wSRyYiIiIi4itkBw84l+j9wDl3xDl3uhhBydQ1WOfX25nWdXzY8enp55l55aS7u/tccpeRTCbp6ekpUUQiIiIiIkMVcgcPM7sVuBW/2MqQZNE5t76QtWVqOr1kESdrYfb2x+C2u4ce00ylqHvkcU7U+vPKTUNDA4FAYEiSFwwGqa+vL2FUIiIiIiLn5b2DZ2ZfAPbhJ3ghYFbOS+RCwSCbVkPN/g7iaxo5lq6ieWzPTuJrGqnZ38Hm1f68chMOh2lrayOYji0YDLJjxw4VWhERERGRslFIH7xTwGecc08UN6TKVel98E7UwuZJ0AfP8zx6enqor69XciciIiIi4yLfPniFJHj/F7jZOTf8ZSoZs0pI8MBP8mKJGCSTzOg8TFV/jMG6kH8sMxgkVB0q2+RORERERGQilCLB+wpw2jn3xbwWkAtUSoInIiIiIiIjyzfBK6TIyjRgg5mtAn4ADGYPOufuK2BtERERERERGaNCErzrgUyPu/fljOW3LSgiIiIiIiJ5yzvBc841FjMQkcnM8zy6u7tpaGhQ4RURERERKZmCG52LVLpoNEokEqGpqYlIJEI0Gi11SCIiIiJSofIusgJgZsuBVmAe8BHn3Ctmdjdw0jn3L0WKsWKoyMrk43kekUjkgubnvb292skTERERkbzlW2SlkEbna4G9wBlgEXB5eugdwAP5risymXR3dw9J7gCSySQ9PT0likhEREREKlkhRzQfBD7lnPskQytoPgPcWFBUIpNEQ0MDgcDQ/4yCwSD19fUlikhEREREKlkhCd584MAwz98AagtYV2TSCIfDtLW1EQwGAT+527Fjh45nioiIiEhJFNIm4RRQD/TmPF8GnChgXZFJpaWlhebmZnp6eqivr1dyJyIiIiIlU0iCtwPYZmbr8fveXWVmtwBbgD8sRnAik0U4HFZiJyIiIiIlV0gfvK+aWQ3QDkzDP675FrDFOff1IsUnFaBvoI9YIgbJJDM6D1PVH2OwLsTpJYsgGCRUHWJOzZxShykiIiIiUvYK2cHDOfd7ZvZHwLX49/ledM6dLkpkUhH6BvpY+OhCmo8k2LoP5sbPj52shU2rYe8N1Ry956iSPBERERGRSyi40blzLuGce84596ySOxmrWCJG85EEu3cZsxavoGvPTl7o7qBrz05mLV7B7l1G85GEv8M3iXieR3t7O57nlToUEREREakgY9rBM7OHRjvXOXff2MORipNMsnUfDKxaTu1T7dRmWg7UL4Pb7ia+ppEt+w4QTyZLG+cYRKNRNmzYQCqVIhAI0NbWRktLS6nDEhEREZEKMNYjmotGOc+NNRCpTDM6DzM3Dl0b159P7jICAfrvXcf82w/wUudhCC8uTZBj4HneueQOIJVK0draSnNzs4qwiIiIiMi4G1OC55xrHK9ApDJV9ftHL88umDfs+Jn088y8ctfd3X0uuctIJpP09PQowRMRERGRcVfQHTwzW25mf25mz5jZ1elnd5vZsuKEJ1PdYF0IgGldx4cdn55+nplX7hoaGgjk7EQGg0Hq6+tLFJGIiIiIVJK8EzwzWwvsBc4ANwKXp4feATxQeGhSCU4vWcTJWpi9/THI2fkilaLukcc5UevPmwzC4TBtbW0
Eg0HAT+527Nih3TsRERERmRCF7OA9CHzKOfdJYDDr+TP4CZ/IpQWDbFoNNfs7iK9p5Fi6iuaxPTuJr2mkZn8Hm1f78yaLlpYWent7aW9vp7e3VwVWRERERGTCFNIHbz5+c/NcbwC1BawrFSRUHWLvDdWsJcHWfQeYf/v5P1InamH9nX4fvIerJ8cRzYxwOKxdOxERERGZcIUkeKeAeqA35/ky4EQB60oFmVMzh6P3HCWWiPF6Mslg52Gq+mMM1oU4vWQRDwaDPFwdUpNzEREREZFRKCTB2wFsM7P1+G0RrjKzW4AtwB8WIzipDHNq5pxP4CZBKwQRERERkXKVd4LnnPuqmdUA7cA0/OOabwFbnHNfL1J8IlOC53l0d3fT0NCgo5siIiIiMm4KapPgnPs9IATcDCwF3uWc+1wxAhOZKqLRKJFIhKamJiKRCNFotNQhiYiIiMgUZc65/H7Q7HHgz4F/cvkuIkOY2UxgYGBggJkzZ5Y6HCkCz/OIRCJDmp8Hg0F6e3u1kyciIiIiF/XGG29QU1MDUOOce2O0P1fIDt47gb8HPDPbambvL2AtkSmpu7t7SHIHkEwm6enpKVFEIiIiIjKV5Z3gOeduB2YDfwDcBHzfzF40swfM7BeKE57I5NbQ0EAgMPQ/s2AwSH19fYkiEhEREZGprNA7eHHnXJtzbiUQAR4H7ga0PSGC3w+vra2NYLpRezAYZMeOHTqeKSIiIiLjIu87eEMWMasC/gvwsfTXnzrnri544QqjO3hTl+d59PT0UF9fr+RORERERC4p3zt4hfTBw8wagY8Ca4Eg8DfALwP/VMi6Utn6BvqIJWKQTDIjp/E5wSChSdj4PBwOK7ETERERkXGXd4JnZh5+oZW9QCuwxzl3tliBSWXqG+hj4aMLaT6SYOs+mBs/P3ayFjathr03VHP0nqOTLskTERERERlvhezg/SGwyzn3erGCEYklYjQfSbB7lzGwajldG9dzdsE8pnUdZ/b2x9i9q4O1JIglYpM6wVPjcxEREREZD3kneM65tmIGIgJAMsnWfTCwajm1T7VTm6lAWb8Mbrub+JpGtuw7QDyZLG2cBYhGo2zYsIFUKkUgEKCtrY2WlpZShyUiIiIiU0BBVTQBzOxaM1tjZrdnv4oRnFSeGZ2HmRuHVzeuh5z2AgQC9N+7jmvi/rzJyPO8c8kdQCqVorW1Fc/zShyZiIiIiEwFhdzBuwb4DnAd4ABLD2XKcgYLC00qUVV/DICzC+YNO34m/Twzb7IZqfG5jmqKiIiISKEK2cHbBpwE6oAE8F5gBfAcsLLgyKQiDdaFAJjWdXzY8enp55l5k40an4uIiIjIeCokwbsF+Lxz7idACkg55/4F+CywvRjBSeU5vWQRJ2th9vbHIGeni1SKukce50StP28yUuNzERERERlPhVTRDAKn0+9jwFXAMeBlYH6BcUmlCgbZtBp27+ogvqaR/nvXcWbBPKZ3Hafukcep2d/B+jvhweDkPQHc0tJCc3OzGp+LiIiISNEVkuD9CLgeOAF0Ap8xs58BG9LPRMYsVB1i7w3VrCXB1n0HmH/7gXNjJ2ph/Z1+H7yHqyfnEc0MNT4XERERkfFgzrlLzxruB82agSucc3+TLrjyd8AC4P8Cdznn/ql4YVYGM5sJDAwMDDBz5sxSh1MyfQN9xBIxSCaZ0XmYqv4Yg3Uh/1hmMEioOjSpe+ANR33xRERERCTbG2+8QU1NDUCNc+6N0f5c3gnesIuZ/TzwuivmohVECV5lUl88EREREck1YQmemdU753rGGJ+MghK8yuN5HpFIZEjrhGAwSG9vr3byRERERCpYvglePlU0XzKzfzezb5nZOjP7hTzWEBFG7osnIiIiIjJW+RRZ+WD6tRL4OjDNzPqAfwLagXbn3CtFi1BkCsv0xcvdwVNfPBERERHJx5h38JxzHc65LznnVgG1QCPwODAXaAP6zOxYccMUmZrUF09EREREiqkoRVbMbDqwDGgGPgnMcM5N3kZlJaI7eJXL8zz1xRMRERGRc/K9g5dXHzwzmwZ8AH/3biWwGDgJPA38ZvqriIyS+uKJiIiISDGMOcEzs6fxE7rjwAHgEeBp51x/kWMTqVjqiyciIiIi+chnB+8DwCn8gir/DBxwzsWKGZRItkprfK6+eCIiIiKSr3z64F0BLMc/mtkIvB94Cf9Y5j/j7+b9pKhRVgjdwbtQ30AfCx9dSPORBFv3wdz4+bGTtbBpNey9oZqj9xydEkme+uKJiIiICExgHzzn3H84555yzv2uc24JEAI+AyTSXz0z+9FY1xUZTiwRo/lIgt27jFmLV9C1ZycvdHfQtWcnsxavYPcuo/lIwt/hmwLUF09ERERECpFXkZUc/wH8NP16HXgbWFiEdUUgmWTrPhhYtZzap9qpDaT/TaJ+Gdx2N/E1jWzZd4B4MlnaOItEffFEREREpBBj3sEzs4CZ3WxmnzGzfwTiwDPAp4FXgXuAa4obplSqGZ2HmRuHVzeuh0DOH9dAgP5713FN3J83FagvnoiIiIgUIp8dvDhwBX6hlX8G7gPanXPHixiXCABV/f7Ry7ML5g07fib9PDNvKmhpaaG5uVl98URERERkzPJJ8H4HP6F7qdjBiOQarAsBMK3ruH8sM8f0ruND5k0V6osnIiIiIvnIp8jKDiV3MlFOL1nEyVqYvf0xyCk+QipF3SOPc6LWnzfVeZ5He3s7nueVOhQRERERKVNjTvBEJlQwyKbVULO/g/iaRo6lq2ge27OT+JpGavZ3sHm1P28qi0ajRCIRmpqaiEQiRKPRUockIiIiImVozH3wZPyoD96FRuqDd6IWNk+xPnjDUW88ERERkcqTbx+8YrRJEBk3c2rmcPSeo8QSMV5PJhnsPExVf4zBuhCnlyziwWCQh6tDUza5g5F74ynBExEREZFsSvCk7M2pmXM+gQsvLm0wJaDeeCIiIiIyWgUleGZ2K3Ar8G5y7vM559YXsraI+DK98VpbW0kmk+qNJyIiIiIXlfcdPDP7AvB54Dn8nnhDFnLO3VFwdBVGd/BkJJ7nqTeeiIiISIUoxR28TwEfd849UcAaIjJKub3xPM+ju7ubhoYGJXwiIiIiAhTWJuEy4JliBSIio6e2CSIiIiIynEKOaH4FOO2c+2JxQ6pcOqIpo6G2CSIiIiJTXymOaE4DNpjZKuAHwGD2oHPuvgLWFrmovoE+YokYJJPMyGmbQDBISG0TRERERKRCFZLgXQ+8kH7/vpwxdU+XcTFS4/OTtbCpAhqfq22CiIiIiFxM3gmec66xmIGIjEYsEaP5SILdu4yBVcvp2rieswvmMa3rOLO3P8buXR2sJUEsEZuyCZ7aJoiIiIjIxeR9B0+KT3fwLu157xCzrruZWYtXUPtUOwSy6gSlUsTXNPLTQweI//BZbpziTdHVNkFERERk6pqQO3hm9hDwOefcf6TfX5Tu4Ml4mNF5mLlx6Nq4ntpAThHYQID+e9cx//YDvNR5GKZ4gqe2CSIiIiKSa6xHNBcBVVnvL0bbgjIuqvpjAJxdMG/Y8TPp55
l5lSIajbJhwwZSqRSBQIC2tjZaWlpKHZaIiIiITLAxJXjZ9+50B09KYbAuBMC0ruNQv+yC8eldx4fMqwSe551L7gBSqRStra00NzdrJ09ERESkwhTS6Fxkwp1esoiTtTB7+2OQ0yqAVIq6Rx7nRK0/r1KM1DZBRERERCqLEjyZXIJBNq2Gmv0dxNc0cmzPTl7o7uDYnp3E1zRSs7+Dzav9eZUi0zYhm9omiIiIiFSmskzwzOzTZnbSzM6a2ffNbPkl5q81sxfN7K301ztyxs3Mft/MfmxmZ8zsn83svTlzZpnZE2Y2kH49YWa1WePzzazdzPrTcZ0wsy+ZWVXOOiPGIoUJVYfYe0M1a+90vH7oAPNvX8f737OC+bev46eHDrD2TsfeG6oJVVfOEc1M24RgOqlV2wQRERGRylV2bRLM7C7gCeDTwL8CrcAngGudc33DzL8F6AA+B3wHuAP4Q2CZc64zPed+4PeAjwMvAQ8CK4D5zrk303P+EQgDG9JLtwG9zrlfTo9fA3wQeB6IAzcA3wCizrkHRhvLJX7tapMwCn0DfcQSMUgmmdF5mKr+GIN1If9YZjBIqDo0ZXvgjURtE0RERESmjnzbJJRjgtcJPO+c+82sZ0eB7zrnPjvM/G8DM51zv5T17Cngdefcr5uZAT8GHnbOfSU9fjnQD9zvnNthZguBF4GlWUnhUuAgsMA5d+wisT4ELHbOLR9NLKP4tSvBk6JR2wQRERGRySvfBK+gI5pmttzM/tzMDprZ1elnd5vZheUNR7feZcBNwL6coX3ABy7yY7cMM39v1vy5wOzsOc65t4Cns+bcAgxk77I55/4NGLjY55pZPbAmvc5oYxGZENFolEgkQlNTE5FIhGg0WuqQRERERGQC5J3gmdla/OTlDH5PvMvTQ+8AHshz2RAQxN9dy9aPn6QNZ/Yl5s/OejbSnNeGWfu13M81s2fM7CzQjX8c8/NjiGUIM7vczGZmXvi/dyIFuVjbBM/z5TvpKwAAIABJREFUShyZiIiIiIy3QnbwHgQ+5Zz7JDCY9fwZ4MaCorqwUboN82ys8y81Z7j1h1vnLvxf30eB/wJsziOWjM/i7xJmXvobuBRMbRNEREREKteYGp3nmA8cGOb5G0DtMM9HIwYkuXDH691cuDOW8eol5r+a/jobODXCnLph1n5X7uc65/49/fZFMwsCbWa21TmXHEUsub4MPJT1/TtQkicFyrRNyE7y1DZBREREpDIUsoN3Chjub4zLgBP5LOic+xnwfeBDOUMfwt8ZHM7BYeavzpp/Ej/xOjcnfdfvg1lzDgI1ZnZz1pwlQM0Inwv+7lxV+utoYhnCOfeWc+6NzAt4c4TPkovoG+jj+VPP87x3iJd2t3HyT/+Yl3a38bx3iOdPPU/fwAXFV6c0tU0QERERqVyF7ODtALaZ2Xr8I4hXpdsEbMFvDZCvh4AnzOw5/IRpAzAH+DMAM/sW8EpWRc1twIF0K4TvAb8CrMJPNHHOOTN7GHjAzLrx7849ACSAv0jPOZqudvkNM2tNr9sG/F2mgqaZ/Qb+UdQfAm/hF4P5MvBt59zbo4lFiq9voI+Fjy6k+UiCrftgbvz82Mla2LQa9t5QzdF7jlZU64SWlhaam5uHtE1QVU0RERGRqS/vBM8591UzqwHagWn4xzXfArY4575ewLrfNrN34hcvuRL4EXCbc+7l9JQ5QCpr/jNm9mvAl4AvAseBu3L6zn0VmA78KTAL6ARWZ3rgpf0GsJ3zVTD/FvhvWeNvA/cD78HfsXsZeBT42hhjkSKKJWI0H0mwe5cxsGo5XRvXc3bBPKZ1HWf29sfYvauDtSSIJWIVleCBv5OXSeSi0ei5wiuBQIC2tjZaWlpKHKGIiIiIFFvBffDMrBq4Fv+454vOudPFCKwSqQ/e2D3vHWLWdTcza/EKap9qh0DWqeNUiviaRn566ADxHz7LjeHFpQu0hDzPIxKJXHAnr7e3Vzt5IiIiImUq3z54ee/gpZt8D/fcAWeBHuB7zrmf5vsZIpcyo/Mwc+PQtXE9tYGcK6WBAP33rmP+7Qd4qfMwVGiCN1JVTSV4IiIiIlNLIXfwFuG3CwgCx/CPLTbgV8HsAj4NbDWzZc65FwsNVGQ4Vf0xAM4umDfs+Jn088y8SqSqmiIiIiKVo5Aqmt8D9gNXOeducs7dCFwN/C/gL9PvD5B1R02k2AbrQgBM6zo+7Pj09PPMvEqkqpoiIiIilSPvO3hm9grwodzdOTN7L7DPOXe1md2Yfl+5f7seA93BGzvdwRs9z/NUVVNERERkksj3Dl4hO3g1+E28c70LyGQnceCyAj5DZGTBIJtWQ83+DuJrGjm2ZycvdHdwbM9O4msaqdnfwebV/rxKFw6HWblyJeFwmGg0SiQSoampiUgkQjQaLXV4IiIiIlIEhezg/U/gFmATcAi/F97N+H3wnnHO3Z1uGbDZOfefixTvlKYdvLEbqQ/eiVrYXKF98EaiqpoiIiIi5W/Cq2gCrfj36/4qa523gf8P+O30913AJwr4DJERzamZw9F7jhJLxHg9mWSw8zBV/TEG60KcXrKIB4NBHq4OKbnLoqqaIiIiIlNXMfrgzQCuwa+ieVx98PKnHTyZCNrBExERESl/pbiDB4Bz7rRz7gfOuSNK7kTKn6pqioiIiExdBe3gmdmtwK34xVaGJIvOufWFhVZ5tIMnEym3qmbmmSprioiIiJTehO/gmdkXgH34CV4ImJXzEpEyll1VE1BlTREREZEpoJAqmqeAzzjnnihuSJVLO3jF0zfQRywRg2SSGTmFVwgGCanwyhC6lyciIiJSXkpRRfMy4JkCfl5kXIzUOuFkLWxS64QLqLKmiIiIyNRQSJGVbwIfLVYgIsUSS8RoPpJg9y5j1uIVdKWbn3ft2cmsxSvYvctoPpLwd/gEgIaGBgKBof87CAaD1NfXlygiEREREclHITt404ANZrYK+AEwmD3onLuvkMBE8pZMsnUfDKxaTu1T7dRmEpf6ZXDb3cTXNLJl3wHiyWRp4ywjmcqara2tJJPJc5U1Adrb21V0RURERGSSKGQH73rgBSAFvA9YlPV6f+GhieRnRudh5sbh1Y3rIWdXikCA/nvXcU3cnyfntbS00NvbS3t7O729vQAquiIiIiIyyeS9g+ecayxmICLFUtXvH708u2DesONn0s8z8+S8cDhMOBzG8zw2bNhw7l5eKpWitbWV5uZm7eSJiIiIlLGCG52LlJvBuhAA07qODzs+Pf08M08uNFLRFREREREpX4XcwQPAzK4F5uBX1TzHOfe3ha4tko/TSxZxshZmb38Mbrt76DHNVIq6Rx7nRK0/T4aXKbqS2zZBRVdEREREylshjc6vMbMjwI+Avwe+m359J/0SKY1gkE2roWZ/B/E1jRxLV9E8tmcn8TWN1OzvYPNqf54ML1N0JZj+PcoUXdHxTBEREZHyVkij8z1AEvgkcAK4GXgnsBXY7JzrKFaQlUKNzotjpD54J2phs/rgjZrnefT09FBfX3/ubl53d7eqaoqIiIiMs3wbnReS4MWAJ
ufcD8xsALjZOXfMzJqArc45nX8bIyV4xdM30Of3uUsmmdF5mKr+GIN1If9YZjBIqDqk5G6MotHoucIrgUCAtrY2WlpaSh2WiIiIyJRUigTvdeAm59wJMzsOfMI5125m84AfOueq81q4ginBk3LleR6RSOSCO3m9vb3ayRMREREZB/kmeIVU0fwRfi88gE7gM2b2i8Dn8Y9sisgUoaqaIiIiIpNDIQnel7J+/kEgAnQAtwEbC4xLRMpIpqpmtmAwyBVXXEF7ezue55UoMhERERHJVkij871Z708A15rZzwOvu3zPfYqME93JK0ymqmZrayvJZJJgMMjHPvYxli5dqjt5IiIiImUk7zt4Uny6gzc+RqqqebIWNqmq5qhlqmpeccUV55K7DN3JExERESmefO/gFdTo3MxuBW4F3k3OcU/n3PpC1hYpllgiRvORBLt3GQOrltO1cT1nF8xjWtdxZm9/jN27OlhLglgipgTvEsLhMOFwmPb29oveyVOCJyIiIlI6eSd4ZvYF/IIqzwGnAG0FSnlKJtm6DwZWLaf2qXZqM3fJ6pfBbXcTX9PIln0HiCeTpY1zEsncycvdwauvry9hVCIiIiJSSJGVTwEfd84tcc592Dl3R/arWAGKFGpG52HmxuHVjeshp1AIgQD9967jmrg/T0YncycvGAwCfnK3Y8cOABVdERERESmhQhK8y4BnihWIyHip6o8BcHbBvGHHz6SfZ+bJ6LS0tNDb20t7ezu9vb0ARCIRmpqaiEQiRKPR0gYoIiIiUoEKSfC+CXy0WIGIjJfBuhAA07qODzs+Pf08M09GLxwOs3LlSgA2bNhw7shmKpWitbVVO3kiIiIiE2xMd/DM7KGsbwPABjNbBfwAGMye65y7r/DwRAp3eskiTtbC7O2PwW13Dz2mmUpR98jjnKj150l+RmqErqIrIiIiIhNnrEVWcv8G/EL66/tynqvgipSPYJBNq2H3rg7iaxrpv3cdZxbMY3rXceoeeZya/R2svxMeTN8nk7FT0RURERGR8jCmBM851zhegYiMl1B1iL03VLOWBFv3HWD+7QfOjZ2ohfV3+n3wHq7WEc18DdcIPbvoSkNDg3byRERERCbAmBudm1kT8HVgaW7DPTOrwS+88innXEfRoqwQanQ+fvoG+oglYpBMMqPzMFX9MQbrQv6xzGCQUHVIPfCKINMIvb6+nr179567lxcIBGhra6OlpaXUIYqIiIhMCvk2Os8nwftboN0597WLjG8EGtUqYeyU4MlU4XkekUjkgiObvb292skTERERGYV8E7x8qmjeADw1wvg+4KY81hWRKWKkoisiIiIiMn7GWmQFoI6cipk53gbelV84IhPj3JHNi9CRzcKMVHTF8zy6u7t1L09ERERkHOST4L0CXAdc7J/irwdO5R2RyDjrG+hj4aMLSQwmCKRg+ctw5Wk4NQM6IpAKQHVVNUfvOaokL08XK7qie3kiIiIi4yufO3iPACuBxc65szlj04Fn8e/obSxWkJVCd/AmxvOnnuemtpt4+h3/nSUP7+Lyf//xubG3/tNVdP7WnXzwzW18f8P3ufHKG0sY6eSXXXQF0L08ERERkVGayDt4XwJ+HnjJzD5jZr9iZreb2f3AsfTYH+WxrsiEueNFWL55O5e//yY4eBDefBMOHuTy99/E8s3buePFUkc4NYTDYVauXEk4HNa9PBEREZEJMOYjms65fjP7APA/gC8DlhkC9gKfds71Fy9EkSJLJtm6DwZWLaf2u9+FQPrfOZYuhe9+l4E1jWzZd4B4MlnaOKeYi93Lu+KKK9QrT0RERKRI8tnBwzn3snPuNiAELAGWAiHn3G3Oud4ixidSdDM6DzM3Dq9uXH8+ucsIBOi/dx3XxP15UjyZe3nBYBDwk7uPfexjLF26lKamJiKRCNFotMRRioiIiExu+RRZOcc59zpwqEixiEyIqn6/eubZBfOGHT+Tfp6ZJ8XT0tJCc3MzPT09XHHFFSxduvTcjl4qlaK1tZXm5mbt5ImIiIjkKa8dPJHJbLAuBMC0ruPDjk9PP8/Mk+LK3Ms7ffq07uSJiIiIFFlBO3gik9HpJYs4WQsztmzj+UXvHXpMM5Xi3Vu3c6LWnyfjR3fyRERERIpPO3hScULvqOOB2y7jqqcP4zUt5p7P3cQHH7mJez53E17TYq56+jC/d9tlhN5RV+pQpzTdyRMREREpvjH3wZPxoz54E6dvoI+3//pJrv6Drw3tgzfnal75/G/xcx/5VTU5nyCZXnm5d/JAffJERESkcuXbB08JXhlRglcCySR0dMCpU3DllbB8OaR3lGRitbe309TUdMHzJ598klAopCObIiIiUlEmJMEzs4dGO9c5d9+oFxZACZ5UNs/ziEQiQ3bwzAwzI5VKEQgEaGtro6WlpYRRioiIiEyMiUrw2kc51TnnLvyneBmRErzS6RvoI5aIQTLJjM7DVPXHGKwL+YVWgkFC1SEd2ZwA0WiU1tZWkskkgUAA5xzZ/4/SkU0RERGpFDqiOQUowSuNvoE+Fj66kOYjCbbug7nx82Mna2HTath7QzVH7zmqJG8CZO7kvfbaa9x1110XjLe3t7Ny5cqJD0xERERkAuWb4BXcJsHMrgXmAJdlPXbOuT2Fri0yEWKJGM1HEuzeZQysWk7XxvWcXTCPaV3Hmb39MXbv6mAtCWKJmBK8CRAOhwmHw3iepzYKIiIiImOUd4JnZtcA3wGuAxxg6aHMlqAqVcjkkEyydR8MrFpO7VPt1Gb64tUvg9vuJr6mkS37DhBPJksbZ4XJtFHIHNnMbqOgO3kiIiIiwyukD9424CRQBySA9wIrgOeAlQVHJjJBZnQeZm4cXt24fmjTc4BAgP5713FN3J8nE6ulpYXe3l7a29s5ePAgTzzxxLkdvVQqRWtrK57nlThKERERkfJRSIJ3C/B559xPgBSQcs79C/BZYHsxghOZCFX9MQDOLpg37PiZ9PPMPJlY4XCYlStXcvr06SHHNQGSySQHDx6kvb1diZ6IiIgIhSV4QeB0+n0MuCr9/mVgfiFBiUykwboQANO6jg87Pj39PDNPSqOhoYFAzg6rmfFrv/ZrNDU1EYlEiEajJYpOREREpDwUkuD9CLg+/b4T+IyZ/SLweeBEoYGJTJTTSxZxshZmb38McnaISKWoe+RxTtT686R0MnfygulG9JlkT0c2RURERM4rJMH7UtbPPwhEgA7gNmBjgXGJTJxgkE2roWZ/B/E1jRzbs5MXujs4tmcn8TWN1OzvYPNqf56UVvadvL/8y78kt81LMpmkp6cHz/N0bFNEREQqUt5VNJ1ze7PenwCuNbOfB153aq4nk0ioOsTeG6pZS4Kt+w4w//YD58ZO1ML6O/0+eA9X64hmObhUG4XnnnuOW2+9VZU2RUREpCKp0XkZUaPz0ukb6COWiEEyyYzOw1T1xxisC/nHMoNBQtUh9cArQ9FodEgbhS9/+cv87u/+7gVJX29vr3rmiYiIyKSSb6PzghI8M7sVuBV4NznHPZ1z6/NeuEIpwRMZO8/z6Onpob6+nu7u
bpqami6Y8+STTxIKhdQcXURERCaNfBO8QhqdfwG/oMpzwCnONzgXmfS0ozd5ZI5sZuQe28xU2tSRTREREakEee/gmdkp4DPOuSeKG1Ll0g5eeegb6GPhowtpPpJg6z6YGz8/drIWNq327+QdveeokrwylH1sMxAI4JwbUoxFRzZFRERkMsh3B6+QKpqXAc8U8PMiZSmWiNF8JMHuXcasxSvoSlfV7Nqzk1mLV7B7l9F8JOHv8EnZGU2lTTVHFxERkamqkB28rwCnnXNfLG5IlUs7eOXhee8Qs667mVmLV1D7VDtkN9dOpYivaeSnhw4Q/+Gz3BheXLpA5ZI8zyMSiVxwZNPMdGRTREREylopdvCmAfeZ2dNm9oiZPZT9KmBdkZKa0XmYuXF4deP6ockdQCBA/73ruCbuz5PypuboIiIiUmnyLrICXA+8kH7/vpwxFVyRSauq3z96eXbBvGHHz6SfZ+ZJeWtpaaG5uZmenh5ee+017rrrriHjmSObqrIpIiIiU0Ehjc4bixmISLkYrPMbmk/rOg71yy4Yn951fMg8KX8jNUdXlU0RERGZSgo5oomZ1ZrZJjP7ppl9w8x+28xqihWcSCmcXrKIk7Uwe/tjkJUIAJBKUffI45yo9efJ5DLaI5uHDh1SERYRERGZlPJO8MzsPwPHgd8Gfh4IAfcBx83sxuKEJ1ICwSCbVkPN/g7iaxo5lq6ieWzPTuJrGqnZ38Hm1f48mXxGU2Vz6dKlNDU1EYlEiEajJYpUREREZOwKqaLZAfQAn3TOvZ1+9nPAN4FrnHMrihZlhVAVzfIwUh+8E7WwWX3wpozhqmzmUt88ERERKYV8q2gWkuCdARY557pynl8LPOecq85r4QqmBK989A30+X3ukklmdB6mqj/GYF3IP5YZDBKqDim5myJyG6MPl+w9+eSTKsIiIiIiE6oUCV4/cLdzbl/O82bgW865urwWrmBK8MqXEr6pzfM8enp6uOKKK1i6dKn65omIiEjJlSLB2w7cAWwGnsFvjbAM+BNgt3Put/JauIIpwStPIx3ZPFkLm3Rkc0rJ3dFzzg25p5c5sgnQ3d2tXT0REREZF6VodL4Z+BvgW0Av8DKwE/hr4P4C1hUpK7FEjOYjCXbvMmYtXkFXuuhK156dzFq8gt27jOYjCX+HTya90RRh2bZtG5FIRIVYREREpOzkvYN3bgGzamAeYECPcy5RjMAqkXbwytPz3iFmXXczsxavoPapdghk/btIKkV8TSM/PXSA+A+f5cbw4tIFKkU3XBGW3NYK4O/qHTx4kNOnT2tHT0RERIqiFDt4ADjnEs65HzrnfqDkTqaiGZ2HmRuHVzeuH5rcAQQC9N+7jmvi/jyZWnL75gWDQe67774LCrGotYKIiIiUi58by2Qzewj4nHPuP9LvL8o5d19BkYmUiap+/+jl2QXzhh0/k36emSdTS0tLC83NzfT09FBfXw/AQw89dEGSl9ss/frrr9eOnoiIiEy4MSV4wCKgKuv9xRR27lOkjAzWhQCY1nUc6pddMD696/iQeTL1hMPhIUlaW1vbiK0VMjt6qrwpIiIiE63gO3hSPLqDV550B0+GM1JrhVy6oyciIiJjNeF38MxsjpnZxcbyXVek7ASDbFoNNfs7iK9p5Fi6iuaxPTuJr2mkZn8Hm1f786RyhMNhVq5cyeLFi4fc0wvk3tNEd/RERERk4hTSBy8JXOmcey3n+TuB15xz+tvuGGkHrzyN1AfvRC1sVh88QTt6IiIiUlylaHSeAuqccz/JeR4BXnTOXZHXwhVMCV756hvo8/vcJZPM6DxMVX+MV2caJ9/n/8V87o88Zr/hGKwLcXrJIggGCVWHlPBVqNxm6cMle5nnuqMnIiIiw5mwBC+reuZ/B74BZLdGCAJLgKRz7hfHtLAowZtERtrVO1kLm7SrV/G0oyciIiKFyDfBG2sVTThfPdOA64CfZY39DDgCbMljXZFJI5aI0Xwkwe5dxsCq5XRtXM/ZBfOY1nWc2dsfY/euDtaSIJaIKcGrUNmVN1V1U0RERCZKIUc0Hwc2OufeLG5IlUs7eJOHKmvKWI11R6+3txeA7u5u7eqJiIhUoAmvogl0A3fmPjSz9WZ2fwHripS9GZ2HmRuHVzeuH5rcAQQC9N+7jmvi/jwRGHvVzW3bthGJRFR5U0RERMakkARvA9A1zPP/A3yqgHVFyl5VfwyAswvmDTt+Jv08M08kW0tLC729vbS3t/Nv//ZvFyR5gUCAhx566NwuXyqVorW1lUOHDtHe3o7neaUIW0RERCaBQhK82cCpYZ7/BLiygHVFyt5gXQiAaV3Hhx2fnn6emSeS62I7esFgkPvuu++i9/Syd/Q8z1PCJyIiIkMUcgevG/gD59yf5zy/O/38miLEV1F0B2/y0B08KbbMHb36+noAIpHIiPf0zAwzU2EWERGRKWoiq2hmfBN42MyqgH9KP7sV+CqwtYB1RcpfMMim/5+9O4+Pqrr/P/46CREMSIKE1TQKKksVCVBAVBCUrRZRRLStWlREWm2tC24o7gutG9pqK4siWL8/WSpIVTYNJC6sSXAjBCESEAhEyCAEIYTz++POJJPJJJMJIZmZvJ+PxzzinXvm3puowJtzzuczGObNSaNg6ADy/nIThzqdyclZm2n1jzeJW5bGzaPgYfesjEgg3lU3IXDlTWstnr+g8yzhPO+889RqQUREpJ47nhk8A0wC7gBOcr/9M/A3a+0TNfN49Ytm8MJHZX3wtsTDePXBkxoQTOVNKN88fciQIarCKSIiEqZqrdF5uQsY0wToDBwCNllrDx/XBesxBbzwkuvKJb8wH4qLabIqg4O535EfF0N+1w4krM8mwVVE46SzONC7G0RHkxCboLAn1TZ9+vQyM3reM3j++FvCqcAnIiISPuos4EnNUcALX5XN6OXEwz2a0ZMa4L1Pb/HixZUu4fSlPXsiIiLhpS764AFgjPmlMWaoMWa49+s4rnebMSbHGPOzMWadMaZvgPEjjTHfGmMOu7+O8DlvjDGPGWN2GGMOGWOWG2PO8RnTzBgzyxjjcr9mGWPivc73N8YsMMbsNMYcNMZkGmOu87lGjDHmEWPMZvezrzfGDK3uz0HCS35hPkPWFzJvjqFZz35kLZxB5qY0shbOoFnPfsybYxiyvtCZ8ROpJk/lzcTExICtFnxZa8u1Xdi+fbsqcYqIiESYahdZMca0B94DugAWMO5TninBoKtLGGOuBSYDtwGfAeOAj4wxv7TW5voZ3wd4F5jofpYRwGxjzEXW2lXuYfcBdwM3AtnAw8BSY0xHa+1P7jHvAImAJ5BNAWYBl7uPLwC+BP4G5AG/AWYaY/Zbaxe6xzwFXA+MxekPOAR4zxhzgbVW3a4jXXExLywB18C+xC9KId7zh+2zLoLLbqBg6ACeX5JKQXFx3T6nRBTvwiy+RVkCLeH0NFP39NvTMk4REZHIcDxFVhYCxTiBZgvQC2iOU0FzvLU2rRrXXAWkW2v/5PXeBmC+tfZBP+PfBZpaa3/t9d4iYJ+19nfuQjA7gMnW2r+5zzf
ECWn3W2tfN8Z0Br4FzveEQmPM+cAXQCdr7cYKnvUDIM9ae7P7eAfwtLX2Va8x84ED1trrq/j9a4lmmMqeN4UOV48ja+EMOg0bXe78xoUz6Dj8JrLnvk6HkbfWwRNKfVDZEk7fwOeZ8fNe2ql9eyIiIqGjLpZo9gEesdbuAY4Bx6y1nwIPAq8EezFjzElAD2CJz6klODNoFT2D7/jFXuPb4TRkLxnjLgKzwmtMH8DlNeOHtXYl4KrkvgBxwF6v44Y4VUS9HQIuquQaEiFi8pyllz93OtPv+UPu9z3jRE6EipZwbt26lalTpwZspu67jHPs2LGcfvrpaq4uIiISRo6nD140cMD9z/lAW2AjsBXoWI3rJbivmefzfh5OSPOndYDxrb3e8x1zuteY3X6uvbui+xpjrgZ64iwh9VgM3G2MSQU24/QEvIJKlqq6ZxMber11SkVjJbQVtUoAoFHWZmdZpo+TszaXGSdSG7yXcI4ZM4YhQ4aUaabuWZ5ZEd9ee2PHjtUMn4iISIg7nhm8r4Hz3P+8CrjPGHMh8AjOks3q8l0zavy8F+z4QGP8Xd/vfY0x/YEZwFhr7Tdep/4KbMLZf3cE+CfwJs4y1oo8iDNT6Hnpr8TD1IHe3ciJh9avvAG+f2A+doxW/3iTLfHOOJG64j3Dl5iYyJQpU0pm9aKionBWtVesKjN8IiIiUreOJ+A95fX5h3FmxNKAy3CanwcrHycM+c6ataT8DJzHrgDjd7m/BhrTys+1W/je1xhzMbAQuNtaO9P7nLV2j7X2SqAxzs+iE84MZ04Fzw7wLM5ST89Lf/0drqKjuWcwxC1Lo2DoADa6q2huXDiDgqEDiFuWxvjBzjiRUFHZMs7qBL5x48axZs0aLeEUERGpQzXaB88YcypOgZNqXdRdZGWdtfY2r/e+BRZUUmTlFGvtZV7vfQQU+BRZecla+3f3+ZNwll/6Flnpba1d7R7TG1iJV5EV98zd/9yfKymkUsn3EgNsAGZbaydU8ftXkZUwVVkfvC3xMF598CRMBFOoxR9PTz7vXnvbt28vs4zT91hERETKq9VG5+7wsgQYZ63NDvoCFV/3Wpz2BH/EqWJ5K06VznOstVuNMTOBHzxhzxhzAZAKPAQswNnz9hRQ0ibBGHM/zlLIm3CWUE4A+gMlbRLcobAtpXvqpgBbrbWXu8/3Bz4AXqZsAZkj1tq97jG9gdOATPfXx3CKvHS31nr9cb/S718BL4zlunKdPnfFxTRZlcHB3O/Ij4shv2sHEtZnk+Djqoo1AAAgAElEQVQqonHSWc4yzehoEmITFPYk5B1P4IuOjmbSpEncf//9JaHvhhtuYNasWdrHJyIiEkCtBjwAY8we4AJr7aZqXaDi696G07uuDc4+v7ustanuc8uB7621N3qNvxon1LXHKW7ykLX2v17nDfAoTnhrhrNf8HZr7ddeY07FCW6eBu3vA3/2BDNjzAygfO17WGGt7e8eczHwL/dzHAA+BB6w1u4I4ntXwIsQlc3o5cTDPZrRkzBVWeDzV7Clovc91JpBRETEv7oIeC8ARdbaB6p1ASlHAS9ypO9M56k7ezBvjsE1sC+77riZnzudSaOszbR+5Q3ilqUxcpTl4cnr6N6me10/rki1eQJf48aNOf/888uEuUDhzh8FPhEREUddBLx/AH8AvgPWAge9z1tr767WhesxBbzIkb59Dc269KJZz37EL0qBKK96RseOUTB0AHvXpFLw1Wq6J/asuwcVqUHTp08vmdGLjo7m2Wef5YEHHgg65HlT4BMRkfqqLhqdnwukA/uBDkA3r1fycVxXJOw1WZVBuwLYdcfNZcMdQFQUeX+5ifYFzjiRSOFdlfP777/n3nvvLdOKITo6mtGjR5+Q1gy+DdjVkF1EROqroBudG2PaAznW2gEn4HlEIkJMXj4AP3c60+/5Q+73PeNEIoV3c3Uo32A9MTGRp556qtqFW3ybr48bN459+/apkIuIiIhb0Es0jTHFQBtr7W738bvAHdbainrVSRVpiWbkyJ43hQ5XjyNr4Qw6DStfn2fjwhl0HH4T2XNfp8PIW+vgCUVCR021ZqhIVZZ5qnWDiIiEmlrbg2eMOQa09gp4PwFdrbVbgrqQlKOAFzm0B0+k+oIJfDVRyMXfjJ/694mISF2riz14IlKR6GjuGQxxy9IoGDqAjQtnkLkpjY0LZ1AwdABxy9IYP9gZJyJlJSYm0r9/fxITE8vs69u6dStTp04ts6dv0qRJRPnucw3Ad1/fW2+9VeZ43LhxPP/882X2+t14443l9v5p35+IiISi6i7RbG2t3eM+/gk4z1qbcwKer17RDF7kqKwP3pY4eLMb5LaI4Y/DH6dh/4FqfC4SBO8ZvsTExHLVO6+//nrefvvtoJZ5+gp22WdV9v1pBlBERIJR20s0PwIOu9+6HPiE8m0SrgrqwqKAF2FyXbnkF+ZDcTGHly/j3+8/StKeIm7OVONzkZrmG/qOZ19fdZZ9+tIyUBEROV61GfDerMo4a+1NQV1YFPAimBqfi9StigKf74xfTfXvC8SzvDTY6p8KhCIi9UetNzqXmqeAF7lUdEUktFQ243eiln36Ot5loFVtB6FQKCISnhTwIoACXuRS2wSR8FPVZZ/VCYA1sQzUl792EAC33nqrZglFRMKQAl4EUMCLXDmvPUO72x8ic1MayWddVO585qY0kjv0I+fVp2l324Q6eEIRCVZ1A2BtLQP1VBcNhVlChUQRkeAp4EUABbzIpRk8kfonFJaB1rTqzBLW1KyhQqOI1DcKeBFAAS9yaQ+eiPhzPLOAgQJhVWbwjld17lGdWUOo+aWm1QmRCpUiUpsU8CKAAl7k8q2imfeXm8hJbIz55hvOmbqA01Izeel86DJ2As0HX6m+eCICBJ4FrCwQvv766wBhN0vo60QsNa1OiKytmcjjPfZ3DxEJT9UNeCW/uOtV9y+gKWBdLpeVyLK1YKuNfTrWjrgGuyUeayl9HYkqe7wlHjviGmzs07F2a8HWun50EQkj27ZtsykpKXbbtm0Vvud9PG3aNBsdHW0BGx0dbUePHl3hcVRUlDXGWKDkFRUVZaOiosq8F46v6nwfxpiSz0RFRdnRo0dXejxt2jQ7bdq0oD4T7LG/e0ybNs1u27bNfvLJJ2X+Gzie45q4RrjcQ6QuuVwuz685TW0wmSKYwXop4En1bS3YatftWGfXbVttl0yfYF84H3vMYLf162Y/mvWY/Shjjt2wcIbdN6ifPWaMHXENdt2OdXX92CIS4SoLgL7HvoHQEyiOJySeiPAViq/a+D783SPYIBqqQbUu7lET4bg6nwmX8Ftf7lGXFPAi4KWAV3+s27babonH7hvUz9ri4rIni4vtvkH97OZ47Lptq+vmAUVEKhDsLKHvcTCBsCohsjqhMVJDZG286iqo1sU9jjccn4igGirht77cY9q0aXXy66yHAl4EvFDAqzc2zn3dWrAbFs7wez7r/TetBWeciEiECSYQVuUzwYbGYENkuMxEKqiG1utE/PuIlP+OwuUe0dHRdTqTp4AXAS8U8O
qNLa8+bS3YjE1pfs9nZKdaC844EREJKNjQGGozkTVx7HuPmpjdDNc/mIfCPfSKjFdKSkot/kpWlgJeBLxQwKs3NIMnIhL+TnSIrIl71HSorIugWhf3CNWlv5ESfsPlHprB00sBT6pMe/BERKS2nOgQGan3qO2lv+ESfuvTPcJ1D5764IUQ9cGrP9QXT0REJPQF04vS33FNXKM2rql7VH6NuqJG5xFAAa/+yHXl0vnVzgxZX8gLS6BdQem5oiiI8ernmxMP9wyGxV1j2XD7BoU8ERERkXqgugGvwYl7JBGpSFJcEhtu30B+YT77iov5bsl8vpr6DHetgryLuvH12Cvg3HM4Y/tBWr/yBvPmpDGSQvIL8xXwRERERKRCmsELIZrBq7/St6+hWZdeNOvZj/hFKRAVVXry2DEKhg5g75pUCr5aTffEnnX3oCIiIiJSK6o7gxcVeIiInGhNVmXQrgB23XFz2XAHEBVF3l9uon2BM05EREREpCIKeCIhICYvH4CfO53p9/wh9/uecSIiIiIi/ijgiYSAolYJADTK2uz3/Mnu9z3jRERERET8UZEVkRBwoHc3cuKh9StvwGU3lCzTzHXlkn9gNy2ff5kdjWFPdibMm8KB3t3UOkFEREREylHAEwkF0dHcMxjmzUmjYOiAkr54k/59Pc8tPMJpO8EAbSf8C1DrBBERERHxTwFPJAQkxCawuGssIynkhSWpdByeSkdgKGCBtW1h/OUn8eAf/6PWCSIiIiJSIbVJCCFqk1C/5bpyyS/Mh+JimqzKYHd2Bmc+82+Ku3dj9ztTSGjSsjTIqXWCiIiISERTo3ORMJcUl1Qa4BJ7wrwptDkAWff+le6n/arsYHfrhI7DU8leleGMFxEREZF6T1U0RUKUWieIiIiISLAU8ERClFoniIiIiEiwtERTJET5a51Qsk+vqIhfTHyavFjI3fc9B7avUdsEEREREVHAEwlZPq0Tvrnlcq78+mHGfH6YiSug8VFnWKuxz5IT/6zaJoiIiIiIAp5IqPJtnXDh0lT2uM8dbAAPXArTLmjIgi5Pc87U99U2QURERETUJiGUqE2C+PJunfDjov/S9a+TMGefzbb3Z0FMTOmSTLVNEBEREYkoapMgEoG8WydkN8ugZSFkPfUQ3ZN6lx2otgkiIiIigqpoioQNtU0QERERkUAU8ETChNomiIiIiEggWqIpEibUNkFEREREAlHAEwkXapsgIiIiIgEo4ImECbVNEBEREZFA1CYhhKhNggSitgkiIiIi9YPaJIjUA2qbICIiIiKVURVNkTCltgkiIiIi4ksBTyRMqW2CiIiIiPjSEk2RMKW2CSIiIiLiSwFPJFypbYKIiIiI+FDAEwlTapsgIiIiIr7UJiGEqE2CBEttE0REREQik9okiNRDapsgIiIiIt5URVMkQqhtgoiIiIhoBk8kQpRpm3DWRUDZJZxHZ0wH4Mfvv2WfqmqKiIiIRCTtwQsh2oMnxyN9+xqadelFs579iF+UQu5P2+n8ameGrC/khSXQrqB0bE48qqopIiIiEsKquwdPSzRFIoW7bULcMqdtwoaFb3DlukLmzYHW0U2xwO9HwKJZj9GsZz/mzTEMWe9U1RQRERGRyKCAJxIhStomjLLsW5PKkBse5z//BWNh59H9jLwGFvwqll9efhPxi1JwDezL80uA4uK6fnQRERERqSEKeCIRIikuiQ23b+DhyevY99Vq1t57HQCrJ4ym4OvVPDx5XelyTHdVzfYF0GRVRh0/uYiIiIjUFBVZEYkg3m0Tcs74JQAn3XQLye62CLmuXNJ3pkNxMQXbvqEjcHjxh6T37qaiKyIiIiIRQAFPJEL5VtXMdeWWKbrS3V10pcvUBeTMWaCiKyIiIiIRQEs0RSLUgd7dyImH1q+8AceOkV+Yz5D1hcybY2jWsy+7u3UgJw4Wv/WIiq6IiIiIRAgFPJFI5VNVc8/S+by4GPYknw0WWmRu4p4h0GLQFSq6IiIiIhIhFPBEIlS5qpqjn+QMF7TMyGbv2jRGjrIs7hpLQmyCiq6IiIiIRAjtwROJUJ6qmvmF+ewrLubAk0/TZcoCPnl1PPHDr+Hh6GgmexVVOdTpTABi8rREU0RERCRcKeCJRDDvqprZgy+DKQtom3QundxVNcGprJlfmM+eL5YCcGD9arLnTeGAKmuKiIiIhB1jra3rZxA3Y0xTwOVyuWjatGldP45EmPTta2jWpRfNevYjflEKREWVVNYcmlnIzPeg8dHS8TnxqLKmiIiISB3Zv38/cXFxAHHW2v1V/Zz24InUFz5FVzYunMEXG5bwl08KmTsHYo/CA5fCuyunkbVwhiprioiIiIQhzeCFEM3gyYnk2wevXUHpuYMN4A8jYFGy12zdsWMUDB3A3jWpFHy1mu5eyzpFRERE5MSq7gye9uCJ1BO+RVeKVmVwePGHdJm6gE+nP8JDg67gJe/9du7Kmh2Hp5K9KgMU8ERERERCngKeSD3iXXSFxJ7k5OUDC2h1wSCS23QHSouuUFxMwbZv6AgcXvwh6Sq6IiIiIhLyFPBE6rGiVgkANMraDGddVG4ZZ3f3Ms4uUxeQM2eBiq6IiIiIhDgVWRGpxw707kZOPLR+5Q04doz8wnyGrC9k3hxDs5592d2tAzlxsPitR1R0RURERCQMKOCJ1Gc+lTX3LJ3Pi4thT/LZYKFF5ibuGQItBl1B/KIUXAP78vwSoLi4rp9cRERERPxQwBOpxxJiE1jcNZaRoyz71qQyZPSTnOGClhnZ7F2bxshRlsVdY0mITSgputK+AJqsyqjrRxcRERERP7QHT6Qe862seeDJp+kyZQGfvDqe+OHX8HB0NPcePUx+YT75P+Wp6IqIiIhIiFMfvBCiPnhS17LnTaHD1ePIWjiDTsNGV9o7LyceFV0REREROUGq2wdPSzRFpISKroiIiIiENwU8ESmloisiIiIiYU178ESkREnRFQp5YUkqQ5amOicystkSn83No+Cj8xox5sBO0oE9fxjAkKWpLF0yn/RfR5dcQ8s1RUREROqG9uCFEO3Bk1CQ68p1llwWFxPjU3Rl56HdjJo9ikNHDxF1DAZ/Bx+9A0/0g8f7w7EoiI3RnjwRERGR46U9eCJSI5LikujepjvdE3vScPBlALRNOpfuiT1p06QNh44eYsUpf6XwzbZ89I7zmUdSofDNts77RdqTJyIiIlJXFPBEpEK+RVcARnwLfce/QsOu3fmpZzI5cZC1YDoNk3vQd/wrjPi2jh9aREREpB5TwBORilVSdKXgZxdN1q7nniFQ2DMZ5s9X0RURERGROqaAJyIVKim6Msqyb00qQ0Y/yRkuaJmRzd61aYwcZfnovEbsPLCT9B3ryO7ZjvYFcOyVl0nfvob0nenkunLr+tsQERERqTdUZCWEqMiKhKKqFF0Z+uUhNUIXERERqUEqsiIiJ0SgoitDvzzkboTej5WPjgFg9YTRaoQuIiIiUgcU8ESkyjxFV5o8/zLpP6wlK+8bXlgCP/RLZsv050hcns6WeGjwp9vVCF1ERESkDijgiUiVJZzSi
gmXnUTbFRlsv6Qnnz70B9oVwGNNM9g+sDdtV2Qw4dcx7Dy0W3vyREREROqA9uCFEO3Bk3CQ68rl6NzZnPb4SzTctqPk/cNJp/H1vaPp63pJe/JEREREjpP24IlIrUiKS6L9mPE0zMll22N3A5D7zP003LIVM3Kk9uSJiIiI1CEFPBGpnuho9oz5LTnx0DTlCzAGiot5YQm4BvYl/sOP6fT5Ju3JExEREalFCngiUn0+jdCP/utV2hVA1gVnU3DZpcQtS2P8YJw9eXmZrLr+YtoXQMyTT5M9b4r25YmIiIjUMO3BCyHagyfhJteVS+dXOzNkfWG5PXdb4mH8YPjovEYYDEO/PMSLi+EMV+kY7csTERER8S9i9uAZY24zxuQYY342xqwzxvQNMH6kMeZbY8xh99cRPueNMeYxY8wOY8whY8xyY8w5PmOaGWNmGWNc7tcsY0y81/n+xpgFxpidxpiDxphMY8x1fp7lTmPMRvd9thljXjLGNDren4lIqEqKS2LD7Rt4ePI69n21mrX3Ov9brJ4wmoKvVvPw5HXMvWZuyb682PYdAPjk1fFkLZyhfXkiIiIiNSykAp4x5lpgMvA00A1IAz4yxvj9a31jTB/gXWAW0NX9dbYxprfXsPuAu4E/Az2BXcBSY8wpXmPeAZKBoe5XsvtaHhcAXwIjgfOAN4CZxpjLvZ7lOmAS8DjQGRgDXAs8G+zPQSSceDdCj7rjr+TEQ4c1OXRv24PubbrT5uSW7n15F3FS81ZsiYf44dfQadho7csTERERqWEhFfBwgth0a+00a+0Ga+2dwDbgTxWMvxNYaq191lqbZa19FvjY/T7GGOP+56ettf+11n4NjAZigd+7x3TGCXW3WGu/sNZ+AYwFhhljOgJYa5+x1k601n5urd1srX0FWAR4zxb2AT6z1r5jrf3eWrsE+D/gVzX34xEJcT578jYunEHB+7NpVwBH8vOI+/jT0j15O9PVK09ERESkhoVMwDPGnAT0AJb4nFqCM4PmTx8/4xd7jW8HtPYeY609DKzwGtMHcFlrV3mNWQm4KrkvQByw1+v4U6CHMaaX+/tpD1wGfFDJNUQiSkJsAou7xjJylGXfmlQ6Dr+JS25/HoCDW7IZOcry0XmNGDV7FE/d2YNmXXrR65m3APjVc/+hWZdePHVnDzq/2lkhT0RERKQaGtT1A3hJAKKBPJ/383BCmj+tA4xv7fWe75jTvcbs9nPt3RXd1xhzNc5yz3Ge96y1/88Y0wL41D1z2AD4l7V2UgXPjjGmIdDQ661TKhorEg48e/LyC/PZV1xM0aoMDi/+kC5TF5D9yiM8POgKxhzYyfSHhzFvjsE1sC8rLzib8x+fzuoJo+mwJod5c9IYibMnT0VXRERERIITMjN4XnzLeho/7wU7PtAYf9f3e19jTH9gBjDWWvuNz/sPAbcB3YGrcJZ5Tqzk2R/EmSn0vLZXMlYkLHjvyesw8laKHnmInHjo/fZyurdK9tqTp155IiIiIjUtlAJePlBM+VmzlpSfgfPYFWD8LvfXQGNa+bl2C9/7GmMuBhYCd1trZ/qMfxKY5d4/+JW19j1gAvCgMaain/OzOEs9Pa/ECsaJhK8q9srbsC9be/JEREREjlPIBDxr7RFgHTDI59Qg4PMKPvaFn/GDvcbn4AS4kjHuvX4Xe435Aojz7J1zj+mNE7g+93qvP85+ugestVP8PEsscMznvWKcmUDj7+GttYettfs9L+CnCr5PkbDluy/Ps+fu/Mens3dNKiNHWd77Jcx76nrtyRMRERE5TiET8NxeBG4xxtxsjOlsjHkJSAL+DWCMmWmM8W478DIw2BhzvzGmkzHmfmAgTqsFrNPFfTIwwRgzwhhzLs7yykKc1ghYazfgVMScaow53xhzPjAV+J+1dqP7vv1xwt0rwDxjTGv361SvZ1kI/MkY81tjTDtjzCCcWb33rbVaayb1VqBeeSMffpsR38K8OYZmPfux8tExJefVJ09EREQkOMbJQKHDGHMbTu+6NsDXwF3W2lT3ueXA99baG73GXw08BbQHNgMPWWv/63XeAI/iFERpBqwCbne3TPCMORUnvA13v/U+8GdrbYH7/Ayc9gq+Vlhr+7vHNMDZg3cDcBqwByf0PeS5ThW+96aAy+Vy0bRp06p8RCTspG9fQ7MuvWjWsx/xi1JI37Gu9PjDjym47FL2rkllZcpMOp3agV8Mv55jm77jq5cf5NShIyA6moTYBBVgERERkYi2f/9+4uLiAOLcq/2qJOQCXn2mgCf1QfrOdJ66s0dJFc3snu3o9cxbrHx0DJ0+30TcsjRGjnJ+XXphCbTz+uuRnHi4ZzAs7hrLhts3KOSJiIhIxKpuwAu1JZoiEuGqsicPSpdsLpv6AADrb7tKSzZFREREAlDAE5FaFWhP3tUPzixto7AohV+06gjA7h6d2DLt7+Sfdyb/+h/s/WCuqmyKiIiI+NASzRCiJZpSH/nuyct+bxodrh5H1sIZdLrsBvIG9eFg+mruGwTPLdWSTREREakftERTRMKTT5+8Q58sBmD7ro0UDB1Ay5Q1zPklzJmrJZsiIiIigSjgiUid8t2T1/U1pwjuwLHPsndNKqNGWq75FnYP6KklmyIiIiIBaIlmCNESTamvcl25zgxccTE/LvovXf86CXP22Wx7fxY/LnufQWOe0ZJNERERqVe0RFNEwlZSXBLd23Sne2JPmv9mFH8cBglffkf7W+6j5bosQEs2RURERKqiQV0/gIiIt5IlmxTywpJUui513h849lm2xMMtI+G5Zc6SzVaLUvjFBzMBKOzzK7ZMGETL34/lnx9kkpM8jewOyRzo3U3N0UVERKTe0BLNEKIlmiKOYJZsFgwdwN41qXS5uyFDvj7MPz+EtgdKr6VlmyIiIhKOtERTRCJGMEs245alMX4wDPn6MPPmGGxyVwA+f3wsWQtnaNmmiIiI1CsKeCIS0gJV2Rw5yrL43IYlzdEPjL8DANO+HYU9uqjSpoiIiNQrWqIZQrREU8S/ypZsEhPD4Y8X0+eGCWQteIPW/5xRZsnmC0tUaVNERETCj5ZoikjEqmzJZuO1X8KWLQCc8tzkcks2VWlTRERE6hNV0RSRsOJbZbPj8NTSk5lfMnIULD63IV+/eBjXwL5Oc3R3pc3dPTpRfP9wfjH8ev71v+/4atBc0ocWq8qmiIiIRAwt0QwhWqIpUjXeSzabrMpgd3YGZz7zb4q7d2P3O1M4nLLUWbKp5ugiIiISpqq7RFMzeCISdpLikkqDWGJPDuxM5/bMfzNvTiZNbrqHrWcnAE6lzdbu5uh/7+M0R3cN7Muyay5g4NhJLL/pEs7avJd5szN5KbeQja1fJ3/wlZrRExERkbClGbwQohk8kerJdeXS+dXODFlfWK6oypZ4uG+g0xw9tnsvWi39gs/mvsiF197LxP5wc6Zm9ERERCT0qMiKiNRbSXFJbLh9Aw9PXse+r1azdOoD7I6FPV3PpmD9Ssb9egLtCmDfXbcBcPpr77ArFp5Y4RRhWfnoGABWTxitIiwiIiIS1hTwRCQi
[... base64 PNG data omitted: figure output of the "Comparison of Air to Vacuum Conversions" cell below — fractional change in wavelength (lambda_vacuum/lambda_air - 1) versus air wavelength [Å], semilog x-axis, with Greisen, Ciddor, and IAU curves ...]
"text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "fig = plt.figure(figsize=(10, 10), dpi=100)\n", "ax = fig.add_subplot(111)\n", "p1 = ax.semilogx(wavelength[good], greisen_a2v[good], 'k.', label='Greisen')\n", "p2 = ax.semilogx(wavelength[good], ciddor_a2v[good], 'gs', label='Ciddor')\n", "p3 = ax.semilogx(wavelength[good], wcslib_a2v[good], 'ro', label='IAU')\n", "foo = p2[0].set_markeredgecolor('g')\n", "foo = p2[0].set_markerfacecolor('none')\n", "foo = p3[0].set_markeredgecolor('r')\n", "foo = p3[0].set_markerfacecolor('none')\n", "# foo = ax.set_xlim([-5, 10])\n", "# foo = ax.set_ylim([24, 15])\n", "foo = ax.set_xlabel('Air Wavelength [Å]')\n", "foo = ax.set_ylabel('Fractional Change in Wavelength [$\\\\lambda_{\\\\mathrm{vacuum}}/\\\\lambda_{\\mathrm{air}} - 1$]')\n", "foo = ax.set_title('Comparison of Air to Vacuum Conversions')\n", "l = ax.legend(numpoints=1)" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [], "source": [ "greisen_v2a = 1.0 - vac_to_air(wavelength) / wavelength\n", "ciddor_v2a = 1.0 - vactoair(wavelength) / wavelength\n", "wcslib_v2a = 1.0 - waveawav(wavelength) / wavelength\n", "good = (greisen_v2a > 0) & (ciddor_v2a > 0) & (wcslib_v2a > 0)" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAA3gAAANNCAYAAADF2dxQAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAIABJREFUeJzs3Xl8XGW9+PHPN9OyBEiKlKUYG4oJFZCt7IIKRZarqKhcEOQq0p+twvUqgnpdQJTFCwp6XdCGG0Bc4SpyFVARLFoEgbIJ0kICjWWgbEJToBZK8vz+OCdlMk3StEw66fTzfr3mNZ3nfM9znjlzks43z3IipYQkSZIkae1XV+0GSJIkSZIqwwRPkiRJkmqECZ4kSZIk1QgTPEmSJEmqESZ4kiRJklQjTPAkSZIkqUaY4EmSJElSjTDBkyRJkqQaYYInSZIkSTXCBE/SOiEido6ISyJifkQsjYjnI+LOiPhMRLym2u0baRFxaUR0Vbsdr1ZE7BYRf4yI7ohIEfHJAWLenW/76BD1HJzHfGpkWzy6RcS/R8QHR/gY9wz2WeXb/1++vakCx3prRPxvRDwaES/l18mfI2JGRNS/2vrXFhFxVkS8XO12SKqOSClVuw2SNKIi4iPAhcAD+fP9wFhgD+AjwD0ppfdUr4UjLyJeDzSklO6qdltejYi4C9gI+ATwLNCVUnq8LGYM8AjwSEppr0Hq+QlwJPDalNJTI9vq0Ssi5gHFlNLbRqj+PYHb8pf3pZR2GiBmc+D1wJ0ppZdexbHOBj4P3ARcAnQCGwNvAqYDP0gpfXp161+b5Mnya1NKt1a7LZLWPBM8STUtIvYFZgO/B45IKb1Ytn094LCU0q+q0b6RFhH1KaUl1W5HpUTEMuCilNKJK4k7F/gMsFNK6b6ybeOAhcA1KaUjR6yxa4E1kOB9H5gBXAu8Hdg7pXTb0HsNWteg13JEHAP8BGgDPprKvtxEREN+7N+vzrFHg/x3VW9KyZ45SUNyiKakWvd5IAHTy5M7gJTSS6XJXUTU5cM250XEixHxZERcVj58LCJujIj7ImLfiLg5Iv4ZEV0R8eF8+zvyIaBLIuLeiDisbP8z8mFpu0XElRGxOB9O9qO8R6M09uiIuC4iFubHmRsR/xURG5XFXZoPPd0pj38OuKFkW1dZ/L9GxK35cZdExMMRcXFZzMS8TU/m52NuRJwSEXUlMdvk7+XUiPhUPgz2+Yi4JSL2Gc6HFBFvjIj/i4hn8yG0d0fEh0q2Hx8RCRgDfCw/3lB/oWzPnz88wLZjgA2A5e81Ig6NiF9FRDE/xx0RcWFEbDZAW3eIiMsj4on8nPw9P79j8+0DDo8rH4oYEWPy118cILYYEf8zwL4HRER7RDyTf26XRkR9RGwdET/Pyx6LiHMj68kcVEQUgcnAQX3nMyI6S7Y3R8SPI+Kp/H3eHxEnR0QMVW/J/hsC7wduBU7Ni09Y2XnJy27Kr4ED8+toCVnyNpjTgX8AnyhP7gBSSotLk7uI2DA/R12RDeUsRsS3I6Kx/BxFxFUR8faIuKvk56/02twjb/+HKBMR78y3vb2kbHJE/KzkZ+r+iPhY2X5vy/c7NiK+ERGPAUuBbSJio4i4IF4Zbv5MRNweEUeV7L/CNRgRhYj4z4h4ID/uExHxg4jYuiyu79zvHdnw1iUR8VBkvxejrL7T8/r+GRGLIuKvEfHvQ3xOktaAIX/5S9LaLCIKwFTgjpTSI8Pc7Xtkw7m+A1wNbAOcCRwQEVNSSk+XxG5FNhTsPKAIfBy4OCJeRzb87xygm+zL51URsW1K6bGy4/0SuAL4PrBjfqwdImLvlNKyPKaVrAfkm8ALwBuAzwJ75e+v1HrAr4CZwH8xyO/5yHo2L88fZ5B9eWwurS+yRPPmvM7TgC7gcODrZEPqynvRTgLmAX1zrc4Ero2ISSml7oHakR9ncn6cJ4H/IPuifhxwaURsmVI6D7gG2Be4Bfg5cP5g9QGklB6MiJuA4yLiP0vOJWRJ36PA70rKWoA/AxeRfWaTgFOAP0XELn29JhExBfhT3tbTyIYBbg28m2zYb+lxRkI78AvgaGB34CygALyRV66jQ8l6Lx8FvjVEXe8ku/76zjtk1wERsSXZua4DvgD8new9XkB2bv6jvLIB/CvQCFycUpobEX8BjomITw2zV7kJ+AHZdfw5oGegoPzn7Q
3Aj1NKS1dWaWR/nPgV8Fayn9E/A7uS/RzsExH7lQ0VnUL2M/5fZOdqBtm12ZFSujmlNCci7iW7rn5QdrjjyXqLf5cf+4358eYDnwKeIOvZ/E5EvCaldHbZ/ufl8dPz1/8A/psscf4CcDfZMNSdgBX+GFGmLW/jt8l+n0wi+xl9S0TsnlJ6piR2a+Aysp+zL5H9PjuX7PfcT/KYz5H9DJxJNix2PbLPYdOVtEPSSEsp+fDhw0dNPoAtyXrvfjrM+Dfk8d8tK98rLz+7pOzGvGz3krLXAC8DS4CtS8p3yWM/XlJ2Rl52Qdmxjs3LPzBIG4MsaXtLHrdzybZL87IPD7DfpWTz1fpen5LHNg5xPr6ax+xVVn4h0Atsl7/eJo/7K1AoidszL3//Ss77T8kSi9eVlV9LltA2lpQl4DvD/DyPz+PfU1K2Y1521hD79Z3jbfPYt5ds+yPZl+zNhtj/LODlAcr/X15fU/56TP76iwPEFoH/GWDf8uvl1+XXVl5+L3DrMM7RPOD6Acq/ln/Gu5eVt+Xlrx9G3X/MP7+GsvfwwaHOS152U172lmEcZ7889sxhXhfvyONPHuRn78MlZcX8Pby2pGxDYFHpdQicXH5eyBKuF4H/Kim7nixZ3qTs2N8j+73RmL9+W96WGwZo/1zgf1fyHvtdgyXX/X+Xxb0pL/9y2bnvBaaU/UzMA64uKfsNcPtwzrkPHz7W7MMhmpL0igPz50tLC1M2Z2gucFBZ/MKU0h0lcc+Q/YX/7tS/p25u/tw8wDF/XPb6CrIksa8tRMS2EfGTiHicrBdjGdmXZ4DtB6jzFwOUlbu973gRcVREvHaAmKnA/WnFOVOXkn3hK+89vCalVNrL8tf8eaD3XX6cG9KKvayXAvVkPXer4wrgOfoPCzyB7AvtJaWBEbFlRLTlwxZfJjvHD+Wbt89jNgb2B36WUvrHarbp1bq67HXftXXNAOUrO+9DmQrcW3p95y4l++wPXGGPEhHRQvZHiJ+nlBbnxZeTJUsrDNMcxFMppT8Nu8XD13fdXlpW/jOyPzSU/5zfmVJ6tO9FSumfQAf9z++PyK6b0mGaHyDr1boEILIh1QeQ/Xz+M7IhumPyobTXkiWO5YsCDfSzfBtweEScE9mqoRsO/laXG/A9p5Ruzt9L+Xt+NKV0Z0lcIvujQel7vg3YPSK+ExGHRDbPUdIoYIInqZY9TfZX8UnDjO8b4rRwgG2PseIQqGcGiHupvDy9MtxrgwHiHy+LfZm8hwiWJxWzgb2BL5J9QdwTeG++S/mXuyUlX6gHlX9xPoKsF+kyoBjZnMJjSsI2Y/Bz0be9VL+kJ70y53FlX0BX9TjDkrJhgD8DDouIrfIv0scBf0wp9SVvfUN5rwfeRTYM7yCyL9r7l7V/M7L/N4ur054KKb/mXhqifKDrbbhe7WcyLX/+eUSMi2xhmwJZgvrWPAFcmYGOP5AF+fOq/Jy/mFJ6trQwpdRLNmRyyOs69yIl13XKVmK9GvhQvDI/9Xjg5pTSA/nr8WTn4GSyPyCUPn5VElNqoHNwEtkw6feRjSR4JiJ+GdlKuYNZ1d9tK33PZL2EnyH7Ofkt8I+IuD4fxiypikzwJNWsvDfpBrK/Mg/nHlt9X2omDLBta7KEsdK2Kn2RJyGblbRlan7sE1JK/5NS+lNKaQ5Zz9RAhr00ckrp/1JKB5HNkzqAfH5NPj+PvA2DnQuo3PkYyeO0kyWxHySbP7gFryzA0mcXsjlsp6SUvpNSujGldDsrfsn9B9nQtZVdS0uBulhxkZPyL+99vbHrlxbmC1lU+96Mq/2Z5AlzX0/Wr8huZ9H3ODovH04v3rCu5bzndy5waEQMJ6n9B7B+RPSbK5YnZluy+tfbJcBEYGpE7AzsRv+e4mfIrp92sj/SDPQonRcKA5yDlNLzKaXTUkqTyT6jE8mGqf7fEG2r+O+2lNKylNLXU0q7kv3OOpZsuPbvhvk5SBohJniSat1XyYaUXRTZMuP9RMTYiHhn/vIP+fNxZTF7kg3Tu2EE2veBstdHkSUkN+av+77gla8AOqNSDUgpvZhS+iPZwi2QfTGF7P3uMMBf5D+Yt2tWhZpwA9mX4q3Lyj9I1gP7l9WtOGX3AbuPbHGJD5MtoFI+7G1Y5zil9DzZ/KSjI2KoBKyL7Jorv+fbO0tf5MPeFgA7l8UdzMp7PSulvFemzw3ATnmiUuqDZEnKUJ/928kSiW+RDeUsf8wj6+kqvLqm93MmWQL9zdKVHvtExCYR0XcriL6f4+PKwo4i6/Vc3Z/z35D1yPdda0vIhqUCkFJ6jmyBnt3I7r05Z4DHQKMCBpVSejyldEl+nB0jYv1BQgf73bYP2SJOr+p3W0rp2ZTS/5LNJRxPluhKqhJX0ZRU01JKt+RLkF8I3BER3wP+Rrbi4W5kq9PdB/w6pfRARLQBH4+IXrIvbNuQfXl8BPjGCDTxvfly5r/nlVU07yGbPwbZ6pLPAt+PiC+T9fh8gKzXabVFxFfIeqJuIOu5G0d28/DS+X3fIPtCf01EnE62OMQ7yHoMvpdSevDVtKHEl8l612bl7XqG7D2+A/hMGmIFzmG6mGz1x8nAzHwOVam/kSVl5+W9bovIhmuWz0uCbHjdn4DbIrvXXidZr88RZL2sS8iG6i0CLomIL5ElRCcwcO/JD4HTI+KMvN43kg3BG6yHttLuBd6XL7E/H/hnyu4beD5ZMvCb/LN/hCxBnQF8K6X08BB1TiO7js5JKT1RvjH/GbsAOIwV5w6ulpTST/MVKj9P9keJS8jmUPbN4ZxBdq6vJ/u5vh74ej509Bayn6cvA3N4ZZXIVW3DyxHxQ7LVdJ8nm39Y/jn+B9mQ6z9Fdo/ALqCBLMl6R96jPqSImANcRfbZPUv2e+MDwJ/SALeCydv2t8hugXJynv/+jldW0fw7Q6+2Olg7riVbxfMO4Km8vo8DD+cPSdVS7VVefPjw4WNNPMi+wF1K9mXmRbIvYHeSfanbvCSujmxeyQNk85ieIvti2FRW343AfQMcp4uSleZKyvut/sgrq2hOIRvG9hywmOzL5RZl++5Llui9QLaIy0VkyWkCji+JuxR4fpD3fyn9V9F8B9nCDsX8fDxB9mV7/7L9JpItBPN0fj7mkd3TrK4kZpu8LacO8r7PGMbn88b8PCzK23N36Xsb7DwO87Mfn9eZgD0HidmRLMleTJZg/oxsQYkVVrnMY3+en5MX88/8YmBsScw+ZInD82TJ0Wlkf0woXy1yfbIVKx8h6/H5A1nP32CraO5a1paz8vJxZeU/AhYN49xMAq7Lr78EdJZ9rj8p++w/BcQQ9W1JltwNusoj2fDTpcCVZe+tfBXNu1fj5/yA/LN5LG9HN9ltBk4BNi6Jqye7BcHf8/f2KNmtURrL6isCVw1wnJsYePXR7fP3koADBmnjtmRDN4v5sZ/M6/vPkpi+VTSPGGD/88gS0WeAf5IlsucDrym7Ll4u269AdmuDB3nld9tllKz4O9S5z6+p0uvj0/m5fYrs5+DvZKusvm6g9+3Dh48194iUhj1dQ5JUIXmPzZfIk
suRmNsnSZLWQc7BkyRJkqQaYYInSZIkSTXCIZqSJEmSVCPswZMkSZKkGmGCJ0mSJEk1wgRPkiRJkmqENzofRSK7++jWrLkb3EqSJEkavTYBHkursHCKCd7osjXZjU8lSZIkCaAJeHS4wSZ4o8tzAI888ggNDQ3VboskSZKkKlm8eDGve93rYBVH95ngjUINDQ0meJIkSZJWmYusSJIkSVKNMMGTJEmSpBphgidJkiRJNcI5eJIkSdI6oqenh2XLllW7Gcqtt9561NVVts/NBE+SJEmqcSklHn/8cRYtWlTtpqhEXV0dkyZNYr311qtYnSZ4kiRJUo3rS+622GIL6uvriYhqN2md19vby2OPPcbChQuZOHFixT4TEzxJkiSphvX09CxP7jbbbLNqN0clNt98cx577DFefvllxo4dW5E6XWRFkiRJqmF9c+7q6+ur3BKV6xua2dPTU7E6TfAkSZKkdYDDMkefkfhMTPAkSZIkqUaY4EmSJElaJx1wwAF88pOfrHYzKsoET5IkSdKo9fjjj/OJT3yClpYWNthgA7bcckv2339/vv/977NkyZJXVfeVV17JmWeeWaGWjg6uoilJkiRpVHr44YfZb7/9GDduHOeccw477bQTL7/8Mg8++CAXX3wxW2+9Ne9617tW2G/ZsmXDWpXyNa95zUg0u6rswZMkSZI0bMVikVmzZlEsFkf8WCeeeCJjxoxhzpw5HHXUUWy//fbstNNOvO997+Oaa67hne98J5AtVvL973+fd7/73Wy00UacddZZANx///28/e1vZ+ONN2bLLbfk3/7t33j66aeX118+RPPCCy+ktbV1eU/hkUceuXxbSonzzjuPbbfdlg033JBddtmFn//858u333jjjUQEN9xwA3vssQf19fW86U1v4oEHHhjp09SPCZ4kSZKkYWlvb6e5uZmpU6fS3NxMe3v7iB3rH//4B9dddx0nnXQSG2200YAxpatQfulLX+Ld73439957LyeccAILFy7krW99K7vuuitz5szht7/9LU888QRHHXXUgHXNmTOH//iP/+ArX/kKDzzwAL/97W95y1vesnz7F7/4RS655BK+973v8be//Y2TTz6Z4447jj/+8Y/96vnCF77A+eefz5w5cxgzZgwnnHBCBc7G8DlEU5IkSdJKFYtFpk+fTm9vLwC9vb3MmDGDQw89lKampoofr7Ozk5QSkydP7lc+fvx4li5dCsBJJ53EueeeC8Cxxx7bL5k6/fTTmTJlCuecc87ysosvvpjXve51PPjgg2y33Xb96l2wYAEbbbQRhx9+OJtssgnNzc3stttuALzwwgtccMEF/OEPf2DfffcFYNttt+Wmm25i5syZvPWtb11ez9lnn7389X/+53/yjne8g6VLl7LBBhtU6tQMyQRPkiRJ0kp1dHQsT+769PT00NnZOSIJXp/ye8Xddttt9Pb28oEPfIAXX3xxefkee+zRL+6OO+5g1qxZbLzxxivU+dBDD62Q4B188ME0Nzez7bbbcthhh3HYYYfxnve8h/r6eu6//36WLl3KwQcf3G+fl156aXkS2GfnnXde/u8JEyYA8OSTTzJx4sRVeNerzwRPkiRJ0kq1trZSV1fXL8krFAq0tLSMyPFaWlqICObNm9evfNtttwVgww037FdePoyzt7eXd77znct7+Er1JV6lNtlkE+68805uvPFGrrvuOk4//XTOOOMMbr/99uXv+ZprruG1r31tv/3WX3/9fq9LF3fpS07LE+OR5Bw8SZIkSSvV1NREW1sbhUIByJK7mTNnjljv3WabbcbBBx/Md77zHV544YVV3n/KlCn87W9/Y5tttqGlpaXfY7A5fWPGjOFtb3sb5513Hn/961/p6uriD3/4AzvssAPrr78+CxYsWKGu173uda/2rVaUPXiSJEmShmXatGkceuihdHZ20tLSMqJDMyFb1XK//fZjjz324IwzzmDnnXemrq6O22+/nXnz5rH77rsPuu9JJ53ERRddxDHHHMOnP/1pxo8fT2dnJz/72c+46KKLlieqfa6++moefvhh3vKWt7Dpppty7bXX0tvby+TJk9lkk0049dRTOfnkk+nt7WX//fdn8eLF3HzzzWy88cZ86EMfGtHzsCpM8CRJkiQNW1NT04gndn1e//rXc9ddd3HOOefwuc99jmKxyPrrr88OO+zAqaeeyoknnjjovltvvTV//vOf+exnP8uhhx7Kiy++SHNzM4cddhh1dSsOZBw3bhxXXnklZ5xxBkuXLqW1tZWf/vSn7LjjjgCceeaZbLHFFnz1q1/l4YcfZty4cUyZMoXPf/7zI/b+V0eklKrdBuUiogHo7u7upqGhodrNkSRJUg1YunQp8+fPZ9KkSWtsJUcNz1CfzeLFi2lsbARoTCktHm6dzsGTJEmSpBphgidJkiRJNcIET5IkSZJqhAmeJEmSJNUIEzxJkiRJqhEmeJIkSZJUI0zwNKhiscisWbMoFovVbookSZKkYTDB04Da29tpbm5m6tSpNDc3097eXu0mSZIkSVoJEzytoFgsMn36dHp7ewHo7e1lxowZ9uRJkiRJo5wJnlbQ0dGxPLnr09PTQ2dnZ5VaJEmSJA0sIrjqqqsG3d7V1UVEcPfddw8ac+ONNxIRLFq0aCSauEaNqXYDNPq0trZSV1fXL8krFAq0tLRUsVWSJEmqlgXdC3h6ydODbh9fP56JjRNH5NiPP/44Z599Ntdccw2PPvooW2yxBbvuuiuf/OQnOeigg1i4cCGbbrrpiBx7bWSCpxU0NTXR1tbGjBkz6OnpoVAoMHPmTJqamqrdNEmSJK1hC7oXsP13t2fJsiWDxtSPrWfuSXMrnuR1dXWx3377MW7cOM477zx23nlnli1bxu9+9ztOOukk5s2bx1ZbbVXRY66ul156ifXWW6/azXCIpgY2bdo0urq6mDVrFl1dXUybNq3aTZIkSVIVPL3kaZYsW8KP3vMj7ph+xwqPH73nRyxZtmTIHr7VdeKJJxIR3HbbbRx55JFst9127LjjjnzqU5/iL3/5C7DiEM3bbruN3XbbjQ022IA99tiDu+66a4V6r732Wrbbbjs23HBDDjzwQLq6ulaI+cUvfsGOO+7I+uuvzzbbbMP555/fb/s222zDWWedxfHHH09jYyMf+chHKvvmV5M9eBpUU1OTvXaSJEkCYPvNt2fKhClr7HjPPPMMv/3tbzn77LPZaKONVtg+bty4FcpeeOEFDj/8cKZOncqPfvQj5s+fzyc+8Yl+MY888gjvfe97+ehHP8rHPvYx5syZwymnnNIv5o477uCoo47ijDPO4Oijj+bmm2/mxBNPZLPNNuP4449fHve1r32N0047jS9+8YuVedMVYIInSZIkadTp7OwkpcQb3vCGYe/z4x//mJ6eHi6++GLq6+vZcccdKRaLfOxjH1se873vfY9tt92Wb3zjG0QEkydP5t577+Xcc89dHnPBBRdw0EEHcdpppwGw3Xbbcf/99/O1r32tX4I3depUTj311Ff/ZivIIZqSJEmSRp2UEpANwRyuuXPnsssu
u1BfX7+8bN99910hZp999ulX70Ax++23X7+y/fbbj46ODnp6epaX7bHHHsNu25pigidJkiRp1GltbSUimDt37rD36UsKKxFTnlgOtN9AQ0erzQRPkiRJ0qjzmte8hkMPPZTvfve7vPDCCytsH+iedTvssAP33HMP//znP5eX9S3GUhpTXjZQzE033dSv7Oabb2a77bajUCis8ntZk0zwJEmSJK3U3KfmcufCO1d4zH1q+D1sq+rCCy+kp6eHvfbai1/84hd0dHQwd+5cvvWtb60wrBLg2GOPpa6ujmnTpnH//fdz7bXX8vWvf71fzEc/+lEeeughPvWpT/HAAw/wk5/8hEsvvbRfzCmnnMINN9zAmWeeyYMPPsgPfvADvvOd74y6+XYDcZEVSZIkSYMaXz+e+rH1HPfL4waNqR9bz/j68RU/9qRJk7jzzjs5++yzOeWUU1i4cCGbb745u+++O9/73vdWiN9444359a9/zUc/+lF22203dthhB84991ze9773LY+ZOHEiv/jFLzj55JO58MIL2WuvvTjnnHM44YQTlsdMmTKFK664gtNPP50zzzyTCRMm8JWvfKXfAiujVQxnDKrWjIhoALq7u7tpaGiodnMkSZJUA5YuXcr8+fOZNGkSG2ywwWrVsaB7wZD3uRtfP77iNzlfFwz12SxevJjGxkaAxpTS4uHWaQ+eJEmSpCFNbJxoAreWcA6eJEmSJNUIEzxJkiRJqhEmeJIkSZJUI0zwJEmSJKlGmOBJkiRJUo0wwZMkSZKkGmGCJ0mSJEk1wgRPkiRJkmqECZ4kSZIk1QgTPEmSJEnD09MDN94IP/1p9tzTM6KHO/744zniiCP6ld18880UCgUOO+ywFeJvvPFGIoJFixatsG3XXXfljDPOGKmmjhomeJIkSZJW7soroaUFDjwQjj02e25pycrXoIsvvpiPf/zj3HTTTSxYsGCNHnttYIInSZIkaWhXXglHHgk77QS33ALPPZc977RTVr6GkrwXXniBK664go997GMcfvjhXHrppWvkuGsTEzwNW7FYZNasWRSLxWo3RZIkSWtKTw+ccgocfjhcdRXssw9svHH2fNVVWfmpp474cE2Ayy+/nMmTJzN58mSOO+44LrnkElJKI37ctYkJnoalvb2d5uZmpk6dSnNzM+3t7dVukiRJktaE2bOhqws+/3moK0sf6urgc5+D+fOzuBHW3t7OcccdB8Bhhx3G888/zw033DDix12bmOBppYrFItOnT6e3txeA3t5eZsyYYU+eJEnSumDhwuz5jW8ceHtfeV/cCHnggQe47bbbeP/73w/AmDFjOProo7n44otH9LhrmzHVboBGv46OjuXJXZ+enh46OztpamqqUqskSZK0RkyYkD3fd182LLPcfff1jxsh7e3tvPzyy7z2ta9dXpZSYuzYsTz77LNsuummNDQ0ANDd3c24ceP67b9o0SIaGxtHtI2jgT14WqnW1lbqyrrjC4UCLS0tVWqRJEmS1pg3vxm22QbOOQfK/uhPby989aswaVIWN0JefvllLrvsMs4//3zuvvvu5Y977rmH5uZmfvzjHwOvfG+9/fbb++2/cOFCHn30USZPnjxibRwtTPC0Uk1NTbS1tVEoFIAsuZs5c6a9d5IkSeuCQgHOPx+uvhqOOKL/KppHHJGVf/3rWdwIufrqq3n22WeZNm0ab3zjG/s9jjzyyOXrQ2yyySbMmDGDU045hauuuor58+fz5z//mWOOOYbtt9+eQw45ZMTaOFqEq86MHhHRAHR3d3cv714eTYrFIp2+ES2HAAAgAElEQVSdnbS0tJjcSZIkrSWWLl3K/PnzmTRpEhtssMHqV3Tlldlqml1dr5RNmpQld+9976tu50COP/54Fi1aRE9PD729vVxzzTUrxNx5553svvvu3HHHHUyZMoUXX3yR8847j5/+9Kd0dXWxxRZbcOCBB/LVr36VrbbaakTaubqG+mwWL17cN6S0MaW0eLh1muCNIqM9wZMkSdLap2IJHmS3Qpg9O1tQZcKEbFjmCPbc1bqRSPBcZEWSJEnS8BQKcMAB1W6FhuAcPEmSJEmqESZ4kiRJklQjTPAkSZIkqUaY4EmSJEnrABdXHH1G4jMxwZMkSZJq2NixYwFYsmRJlVuici+99BLA8vtNV4KraEqSJEk1rFAoMG7cOJ588kkA6uvriYgqt0q9vb089dRT1NfXM2ZM5dIyEzxJkiSpxvXd4LsvydPoUFdXx8SJEyuacJvgSZIkSTUuIpgwYQJbbLEFy5Ytq3ZzlFtvvfWoq6vsrDkTPEmSJGkdUSgUKjrfS6OPi6xIkiRJUo0wwZMkSZKkGmGCJ0mSJEk1wgRPkiRJkmqECZ4kSZIk1QgTPEmSJEmqESZ4kiRJklQjTPAkSZIkqUaY4Gm1FYtFZs2aRbFYrHZTJEmSJGGCp9XU3t5Oc3MzU6dOpbm5mfb29mo3SZIkSVrnRUqp2m1QLiIagO7u7m4aGhqq3ZxBFYtFmpub6e3tXV5WKBTo6uqiqampii2TJEmSasPixYtpbGwEaEwpLR7ufvbgaZV1dHT0S+4Aenp66OzsrFKLJEmSJIEJnlZDa2srdXX9L51CoUBLS0uVWiRJkiQJTPC0Gpqammhra6NQKABZcjdz5kyHZ0qSJElV5hy8UWRtmYPXp1gs0tnZSUtLi8mdJEmSVEGrOwdvzMg1SbWuqanJxE6SJEkaRRyiKUmSJEk1wgRPkiRJkmqECZ4kSZIk1QgTPEmSJEmqESZ4kiRJklQjTPAkSZIkqUaY4EmSJElSjTDBkyRJkqQaMSoTvIg4MSLmR8TSiLgjIt68kvj3RcT9EfFi/vyesu0REWdExGMR8c+IuDEidiyL2TQifhgR3fnjhxExrmT75IiYFRFP5O16OCLOioixJTHvjYg5EbEoIl6IiLsj4t8qdV4kSZIkaSijLsGLiKOBbwJnA7sBs4HfRMTEQeL3BS4Hfgjskj9fERF7l4R9BvgU8O/AnsDjwO8jYpOSmJ8AuwKH5Y9d87r6LAMuAw4BJgOfBD4CfLkk5pm83fsCOwOXAJdExKGrdBIkSZIkaTVESqnabegnIm4F7kwpfaykbC5wVUrpcwPEXw40pJT+paTst8CzKaVjIiKAx4BvppTOzbevDzwBfDalNDMitgfuB/ZJKd2ax+wD3AK8IaX0wCBtvQDYM6U0aA9jRNwJXJNSOm0Y770B6O7u7qahoWFl4ZIkSZJq1OLFi2lsbARoTCktHu5+o6oHLyLWA3YHrivbdB3wpkF223eA+N+VxE8CtiqNSSm9CPyxJGZfoLsvuctj/gJ0D3bciGgh6+n74yDbIyIOIuvt+9MgMetHREPfA9hkoDhJkiRJGo5RleAB44ECWe9aqSfIkrSBbLWS+K1KyoaKeXKAup8sP25E3BwRS4EOsuGjp5dtb4yI54GXgGuAj6eUfj9I2z9HlkT2PYqDxEmSJEnSSo22BK9P+bjRGKBsVeNXFjNQ/QPVczQwBTgWeAdwatn258jm7+0JfAG4ICIOGKTdXwUaSx5Ng8R
JkiRJ0kqNqXYDyjwN9LBib90WrNgD1+fxlcQ/nj9vBSwcImbLAerevPy4KaVH8n/eHxEFoC0izk8p9eTbe4HOPObufH7f54AbyyvPh4q+2Pc6my4oSZIkSatnVPXgpZReAu4ADi7bdDBw8yC73TJA/CEl8fPJErjlMflcv7eWxNwCNEbEXiUxe5P1qg12XMh6+Mbmz0PFrD/EdkmSJEmqiNHWgwdwAfDDiJhDlnhNByYC3weIiMuAR0tW1Pxv4E8R8Vng/4B3A28D9gdIKaWI+Cbw+YjoIJs793lgCdmtEUgpzc1X3rwoImbk9bYBV/etoBkRHyC7VcK9ZL1uu5MNsbw8pfRyHvM5YA7wELAe8Hbgg8DyFUElSZIkaaSMugQvpXR5RGxGtnjJBOA+4O0ppb/nIROB3pL4myPi/cBZwJlkydXRpStiAucBGwIXApsCtwKHpJSeK4n5APAtXllt81dk983r8zLwWWA7sl65vwPfBb5RErNRfowm4J/APOC4lNLlq34m1j7FYpGOjg5aW1tpanI6oSRJkrSmjbr74K3L1ub74LW3tzN9+nR6e3upq6ujra2NadOmVbtZkiRJ0lppde+DZ4I3iqytCV6xWKS5uZne3uUdqxQKBbq6uuzJkyRJklZDTdzoXGunjo6OfskdQE9PD52dnYPsIUmSJGkkmODpVWttbaWurv+lVCgUaGlpqVKLJEmSpHWTCZ5etaamJtra2igUCkCW3M2cOdPhmZIkSdIa5hy8UWRtnYPXp1gs0tnZSUtLi8mdJEmS9Cqs7hy8UXebBK29mpqaTOwkSZKkKnKIpiRJkiTVCBM8SZIkSaoRJniSJEmSVCNM8CRJkiSpRpjgSZIkSVKNMMGTJEmSpBphgidJkiRJNcIET5IkSZJqhAmeJEmSJNUIEzxJkiRJqhEmeJIkSZJUI0zwJEmSJKlGmOBJkiRJUo0wwZMkSZKkGmGCJ0mSJEk1wgRPI6ZYLDJr1iyKxWK1myJJkiStE0zwNCLa29tpbm5m6tSpNDc3097eXu0mSZIkSTUvUkrVboNyEdEAdHd3d9PQ0FDt5qy2YrFIc3Mzvb29y8sKhQJdXV00NTVVsWWSJEnS2mHx4sU0NjYCNKaUFg93P3vwVHEdHR39kjuAnp4eOjs7q9QiSZIkad1ggqeKa21tpa6u/6VVKBRoaWmpUoskSZKkdYMJniquqamJtrY2CoUCkCV3M2fOdHimJEmSNMKcgzeK1MocvD7FYpHOzk5aWlpM7iRJkqRVsLpz8MaMXJO0rmtqajKxkyRJktYgh2hKkiRJUo0wwZMkSZKkGmGCJ0mSJEk1wgRPkiRJkmqECZ4kSZIk1QgTPEmSJEmqESZ4kiRJklQjTPAkSZIkqUaY4EmSJElSjTDBkyRJkqQaYYInSZIkSTXCBE+SJEmSaoQJniRJkiTVCBM8SZIkSaoRJnhaY4rFIrNmzaJYLFa7KZIkSVJNMsHTGtHe3k5zczNTp06lubmZ9vb2ajdJkiRJqjmRUqp2G5SLiAagu7u7m4aGhmo3p2KKxSLNzc309vYuLysUCnR1ddHU1FTFlkmSJEmj0+LFi2lsbARoTCktHu5+9uBpxHV0dPRL7gB6enro7OysUoskSZKk2mSCpxHX2tpKXV3/S61QKNDS0lKlFkmSJEm1yQRPI66pqYm2tjYKhQKQJXczZ850eKYkSZJUYc7BG0VqdQ5en2KxSGdnJy0tLSZ3kiRJ0hBWdw7emJFrktRfU1OTiZ0kSZI0ghyiKUmSJEk1wgRPkiRJkmqECZ4kSZIk1Qjn4GlgPT0wezYsXAgTJsCb3wz5KpiSJEmSRid78LSiK6+ElhY48EA49tjsuaUlK5ckSZI0apngqb8rr4Qjj4SddoJbboHnnsued9opKzfJkyRJkkYt74M3ilT9Png9Pby87TY8P3lbHr7kfKgryf97e9n2w6ew8YPzGfPQfIdrSpIkSSPI++DpVXviNz9nywVFDju4yK3/s+cK2/eZCLf8Po87/OgqtFCSJEnSUEzwtNySBQ8BcPL/+x9am3dbYXtH153Q/pHlcZIkSZJGFxM8Lbdsy/EA7PL0GN6wz5QVtm8056/94iRJkiSNLi6youWe33s35o+Drb51MfT29t/Y28uW376Eh8dlcZIkSZJGHxM8vaJQ4JRDoPH62XDEEf1X0TziCBqvn82ph+ACK5IkSdIoZYKnfn65A8y/6Dy4915405ugoSF7vu8+5l90Hr/codotlCRJkjQYEzytYNHbp0JnJ8yaBT/5Sfbc0ZGVV1ixWGTWrFkUi8WK1y1JkiSta1xkRSuY+9Tc7B+TG7IHwJP3vFJeIe3t7UyfPp3e3l7q6upoa2tj2rRpFT2GJEmStC7xRuejSLVvdL6gewHbf3d7lixbMmhM/dh65p40l4mNE1/VsYrFIs3NzfSWLOZSKBTo6uqiqanpVdUtSZIkre280bletYmNE5l70lyeXvL0oDHj68e/6uQOoKOjo19yB9DT00NnZ6cJniRJkrSaTPDUz8TGiRVJ4FamtbWVurq6FXrwWlpaRvzYkiRJUq1ykRVVRVNTE21tbRTyWy4UCgVmzpxp750kSZL0KjgHbxSp9hy8aigWi3R2dtLS0mJyJ0mSJOWcg6e1UlNTk4mdJEmSVCEO0ZQkSZKkGmGCJ0mSJEk1wgRPkiRJkmqECZ4kSZIk1QgTPEmSJEmqESZ4kiRJklQjTPAkSZIkqUaY4EmSJElSjTDBkyRJkqQaYYInSZIkSTXCBE+SJEmSaoQJnkaVYrHIrFmzKBaL1W6KJEmStNYxwdOo0d7eTnNzM1OnTqW5uZn29vZqN0mSJElaq0RKqdptUC4iGoDu7u5uGhoaqt2cNapYLNLc3Exvb+/yskKhQFdXF01NTVVsmSRJkrTmLV68mMbGRoDGlNLi4e5nD55GhY6Ojn7JHUBPTw+dnZ1VapEkSZK09jHB06jQ2tpKXV3/y7FQKNDS0lKlFkmSJElrHxM8jQpNTU20tbVRKBSALLmbOXOmwzMlSZKkVeAcvFFkVM/B6+mB2bNh4UKYMAHe/GbIk7FKKhaLdHZ20tLSYnInSZKkddbqzsEbM3JNUs248ko45RTo6nqlbJtt4Pzz4b3vreihmpqaTOwkSZKk1eQQTQ3pqR/OJB15JItaJzLv15dyd8ds5v36Uha1TiQdeSRP/XBmtZsoSZIkKecQzVFktA3RXPDMfHpbXs9fN08c8X5IJX8OiF646mew01NBofMhJr5mUvUaKkmSJNUYh2iq4pbO+j3bPZuY960zmHPQO1fY/lTLr5j0wS/z4Kzfw/umV6GFkiRJkkqtUoIXEe9ajWP8PqX0z9XYT1U29omnAdhqn4PYdcKUFbbfvc8LwJeXx0mSJEmqrlXtwbtqFeMT0Ao8vIr7aRRYtuV4ADaY9xC07L/C9g3nPdQvTpIkSVJ1rc4iK1ullOqG8wCWrE6jIuLEiJgfEUsj4o6IePNK4t8XEfdHxIv583vKtkdEnBERj0XEPyPixojYsSxm04j4YUR0548fRsS4ku
2TI2JWRDyRt+vhiDgrIsaWxHwkImZHxLP54/qI2Gt1zsFo8PzeuzF/HGz1rYuht7f/xt5etvz2JTw8LouTJEmSVH2rmuD9AFiV4ZY/AoY9IRAgIo4GvgmcDewGzAZ+ExETB4nfF7gc+CGwS/58RUTsXRL2GeBTwL8DewKPA7+PiE1KYn4C7Aoclj92zevqswy4DDgEmAx8EvgI8OWSmAOAnwIHAvsCC4DrIuK1q3IORo1CgVMOgcbrZ8MRR8Att8Bzz2XPRxxB4/WzOfUQRuR+eJIkSZJW3ahbRTMibgXuTCl9rKRsLnBVSulzA8RfDjSklP6lpOy3wLMppWMiIoDHgG+mlM7Nt68PPAF8NqU0MyK2B+4H9kkp3ZrH7APcArwhpfTAIG29ANgzpTRgD2NEFIBngX9PKV02jPc+qlbRvHPhnezetjsPNX2Nbc/6bv/74E2axMNfOJHXFz/NHdPvYMoAc/QkSZIkrZ6aWEUzItYDdgf+q2zTdcCbBtltX+AbZWW/I+thA5gEbJXXAUBK6cWI+GNe58y8ju6+5C6P+UtEdOcxKyR4EdFC1tN35RBvqR4YCzwz0MY80Vy/pGiTgeKq7ZY9J7Bo9hVsfOtdjH3iaZZtOZ7n996Nuc88CMVqt06SJElSn4oleBHxeuDjQDOwfMxeSmlVVt4cn+/7RFn5E2RJ2kC2Wkn8ViVl5THNJTFPDlD3k+XHjYibgSlkiVkbcPog7YIsUX0UuH6Q7Z8DvjTE/lU1vn489WPrOe6Xx/Xf8BRwX/bP+rH1jK93kRVJkiRpNKhkD95VwHfI5sP1riR2ZcrHjcYAZasav7KYgeofqJ6jyXradgG+BpwKnLfCjhGfAY4BDkgpLR2k3V8FLih5vQmjqE9sYuNE5p40l6eXDH4bhPH145nYOOD0SEmSJElrWCUTvBdSSjNfZR1PAz2s2Fu3BSv2wPV5fCXxj+fPWwELh4jZcoC6Ny8/bkrpkfyf9+dz7Noi4vyUUk9fTEScCnweeFtK6a+DtJuU0ovAiyX7DRZaNRMbJ1Y1gSsWi3R0dNDa2kpTU1PV2iFJkiStDVbnNgmD+WpEnBsRB0fEW/oeq1JBSukl4A7g4LJNBwM3D7LbLQPEH1ISP58sgVsek8/1e2tJzC1AY+ktDfJVOBuHOC5kPXxj8+e+/T4NnAYcllKaM8S+Won29naam5uZOnUqzc3NtLe3V7tJkiRJ0qhWsVU0I+JCstsEzOWVIZoppXTUKtZzNNntCT5KlnhNJ7sdwY4ppb9HxGXAo30rakbEm4A/AV8A/g94N3AWsH/JipifJZvv9mGgg6x37QBgckrpuTzmN8DWwIy8KW3A31NK78y3f4DsVgn3kvW67U62uMuNKaXj8pjPAGcCxwJ/Lnlbz6eUnh/Gex9Vq2hWU7FYpLm5md6S++8VCgW6urrsyZMkSVLNGw2raL6VLAl7VRljSunyiNiMbPGSCWTLebw9pfT3PGQiJXP8Uko3R8T7yZK6M4GHgKNLV8QkmyO3IXAhsClwK3BIX3KX+wDwLV5ZbfNXZPfN6/My8FlgO7Ieu78D36X/Cp4nAusBPy97W18GzhjeGRBAR0dHv+QOoKenh87OThM8SZIkaRCV7MG7BDg7pdRZkQrXQfbgvcIePEmSJK3LVrcHr5Jz8HYD7ouIeyLitoi4PSJuq2D9Woc0NTXR1tZGoZDdcaNQKDBz5kyTO0mSJGkIlezBax6ovGRopVbCHrwVFYtFOjs7aWlpMbmTJEnSOmO0zMEbyGUVPIbWMU1NTSZ2kiRJ0jBVMsHbqeTf65PdluCvmOBJkiRJ0hpRsQQvpfTp0tcRsTHws0rVL0mSJEkaWiUXWSmXyG4pIEmSJElaAyrWgxcRt5MldQAFsnvYfa1S9UuSJEmShlbJOXhHlvz7ZeDJlNKyCtYvSZIkSRrCiK6iGRGklFxkRZIkSZLWAFfRlCRJkqQa4SqakiRJklQjXEVTkiRJkmrESK+ieV6l6pcAisUiHR0dtLa20tTUVO3mSJIkSaOKq2hqrdHe3s706dPp7e2lrq6OtrY2pk2bVu1mSZIkSaNGpJRWHqU1IiIagO7u7m4aGhqq3Zyh9fTA7NmwcCFMmABvfjMUCiN2uGKxSHNzM729vcvLCoUCXV1d9uRJkiSp5ixevJjGxkaAxpTS4uHuV7E5eBFxTkSMK3m9aUScXan6NYpceSW0tMCBB8Kxx2bPLS1Z+Qjp6Ojol9wB9PT00NnZOWLHlCRJktY2lVxk5V9SSov6XqSUngX+pYL1axR46oczSUceyaLWicz79aXc3TGbeb++lEWtE0lHHslTP5w5IsdtbW2lrq7/5VooFGhpaRmR40mSJElro4oN0YyIvwJvSik9n79uAP6cUtpp6D3VZ7QP0VzwzHx6W17PXzdPHPF+SCX5VvTCVT+DnZ4KCp0PMfE1kyp+/Pb2dmbMmEFPTw+FQoGZM2c6B0+SJEk1aXWHaFZykZVvA3+OiMuBAI4CvlHB+lVlS2f9nu2eTcz71hnMOeidK2x/quVXTPrgl3lw1u/hfdMrfvxp06Zx6KGH0tnZSUtLi3PvJEmSpDKVvNH5RRFxK/DWvOjYlNLfKlW/qm/sE08DsNU+B7HrhCkrbL97nxeALy+PGwlNTU0mdpIkSdIgKtmDR0rprxHxBLA+QERMTCktqOQxVD3LthwPwAbzHoKW/VfYvuG8h/rFSZIkSVqzKrmK5nsiYi7wEPA7YD7wf5WqX9X3/N67MX8cbPWti6FsRUt6e9ny25fw8LgsTpIkSdKaV8lVNL8M7A10ppS2B/YF7q5g/aq2QoFTDoHG62fDEUfALbfAc89lz0ccQeP1szn1EEb0fniSJEmSBlfJBO/FvtVdImK9lNJtwC4VrF+jwC93gPkXnQf33gtvehM0NGTP993H/IvO45c7VLuFkiRJ0rqrknPwFuY3Ov81cG1E/AN4qoL1a5S4Zc8JLJp9BRvfehdjn3iaZVuO5/m9d2PuMw9CsdqtkyRJktZdlUzwvg88n1I6LSIOABrI5uKpRoyvH0/92HqO++Vx/Tc8BdyX/bN+bD3j611kRZIkSaqGSiZ4RwLfjojZwBXAtSmllytYv6psYuNE5p40l6eXDH4bhPH145nYOHENtkqSJElSn0gpVa6yiDHA24B/Jbsf3uyU0ocrdoAaFxENQHd3dzcNDQ3Vbo4kSZKkKlm8eDGNjY0AjX1rnQxHpe+D93JE3AxsDmwNHFDJ+qVyxWKRjo4OWltbvQG6JEmS1nmVvA/e8RFxDXAbsBPwpZTSpErVL5Vrb2+nubmZqVOn0tzcTHt7e7WbJEmSJFVVxYZoRsS5wP+mlOZUpMJ1kEM0h69YLNLc3ExvyQ3XC4UCXV1d9uRJkiRprbe6QzQr1oOXUvqsyZ3WlI6Ojn7JHUBPTw+dnZ1VapEkSZJUfZW80bm0xrS2tlJX1//yLRQKtLS0VKlFkiRJUvVVJMGLiDERsWNEHB0RZ0bELytRrzSYpqYm2traKBQKQJbczZw50+GZkiRJWqet8hy8iNiWbBGVN
5Y8tiNbkfMlYC5wb0rpQ5Vtau1zDt6qKxaLdHZ20tLSYnInSZKkmrFGbpMQET8CjgESsATYCLgG+ApwL9CRUupZlTqlV6OpqcnETpIkScqt6hDNI4GPAxuT3efuO8AhwJ7A303uJEmSJKl6VjXB+xpwWUppaUrp+ZTSJ4D9gAOB+yPisIq3UJIkSZI0LKuU4KWUTkspPV9WdgewF/BN4PKI+ElEbF7BNkqSJEmShmGVV9GMiHMiYq/SspT5b2AHYH1gXoXaJ0mSJEkaptW5TcIE4OqIWBgRbRHxjohYHyCl9GhK6X3AByvaSkmSJEnSSq1ygpdS+jCwJXAUsAg4H3g6Iq6MiOMjYrOU0jUVbqckSZIkaSVW60bn+ZDM2Smlz6SU3kA2B+8vwEeAxyLiTxFxakS8tpKNlSRJkiQNbpXugzeYlNJcshucn5cvsPKu/AHw9UocQ5IkSZI0tIokeAARMTaltCyl9BTQnj9U4xZ0L+DpJU9DTw8b33oXY594mmVbjuf5vXeDQoHx9eOZ2DhxjbWnWCzS0dFBa2urN0CXJEnSOqciCV5E1AG3A7tWoj6tHRZ0L2D7727Pofcs4fzrYNKiV7bNHwenHAK/26WeuSfNXSNJXnt7O9OnT6e3t5e6ujra2tqYNm3aiB9XkiRJGi1Waw5euZRSL3BbROxYifq0dnh6ydMces8SfvG/waZ7voV5v76UuztmM+/Xl7Lpnm/hF/8bHHrPkqyHb4QVi8XlyR1Ab28vM2bMoFgsjvixJUmSpNGiYkM0yRZauTsiHgCWAEG2HsteQ++mtVZPD+dfB91vezPjfjuLcXX53wta9oe3/xuLDjuQr1/3Jxb19Ix4Uzo6OpYnd680r4fOzk6HakqSJGmdUckE790DlKUK1q9RZuNb72LSIpj3Hye8ktz1qavjiY9/mMnv+hMP3noXNO05om1pbW2lrq6uX5JXKBT+P3v3Hh31dd97//2dX3FAJkgkkwjHc5gCowJJfAEXQ1pDkEwQx12PT/oQP+4lXk9BDUrjmqaGOk2ay8mlJyetcWwcdx3UDuaJ0/YklDYpvRjKqRrUZVfBMSZOjYgEyOrPseVM7REhgoTM7OeP3wyMBiE0F2lGms9rrVka//aePd844MWXvff3SywWm9DvFRERERGpJiUf0TSzJzJv/wrYm/f6q1LXl+o1YzA4enl+yaJRx89lnmfnTaRIJEJHRwee5wFBcrdr1y7t3omIiIhITSnHDt4DmZ/vK8NaMoVcaAwDMLPnZHAsM8+snpMj5k20trY2Wltb6evrIxaLKbkTERERkZpTcoLnnHs58/NFADNrBN5Q6rpS/c6uXMbpBpi3czfccQ/kHtNMp2l89HFONQTzJkskElFiJyIiIiI1qyxVNAHM7L1mdhw4CRwATgPfKNf6UoU8j23rof5QF8kNzZzIVNE8sX8PyQ3N1B/qYvv6YJ6IiIiIiEy8chZZ+QywEjjsnLvZzG4FfquM60uVCdeFOXBTHRsZZsfBwyy+8/DFsVMNsPmuoA/ew3WTc0RTRERERKTWmXPlKXRpZkeccyvM7DngVufcT8zsWefc8rJ8QQ0wsznA0NDQEHPmzKl0OOMyMDQQ9LlLpZjdfZQZgwkuNIaDY5meR7guPClNzkVEREREppMzZ85QX18PUO+cOzPez5VzB+9lM2sA9gP/YGb/CfygjOtLFZpfP/9SAjfBrRBERERERGRsZUvwnHN3Zt5+wszWAnMI7uKJiIiIiIjIJCjnDt5Fzrl/mYh1RURERERE5MrKVkVTREREREREKqssO3hm9jPAYuCd2Zdz7pfLsbZIqXzfp7e3l6amJvXIExEREZFpreAEz8wWAjeQk8wBP5dZ6yfAceD5MsYoUrR4PM6WLVtIp9OEQiE6Ojpoa2urdFgiIiIiIhOioDYJZvYV4FcBBwwD1wJ/DzxBkNT1OudSExBnTZiKbRKqme/7RKNR0un0xWee59Hf36+dPBERERGpasW2SSj0Dt77gPuA2cDbgC8B64EVwItK7qSa9Pb2jkjuAFKpFH19fRWKSERERERkYhWa4P0x8Mx2ahcAACAASURBVGXn3Hnn3Fnn3O8Avwg0Ay+Y2YayRyhSpKamJkKhkb/EPc8jFotVKCIRERERkYlVUILnnPuEc+5s3rNvA7cCDwNfNbO/MLO3lDFGkaJEIhE6OjrwPA8Ikrtdu3bpeKaIiIiITFsF3cG76mJm1wM7gbXOuTeXbeEaoTt4E8P3ffr6+ojFYkruRERERGRKKPYOXlkbnTvnXgI2mtkvlXNdkVJEIhEldiIiIiJSEyak0blz7u8nYl0RERERERG5sglJ8ERERERERGTylSXBM7PlZnZNOdYSERERERGR4pTrDt4RYCnwvTKtJ1NdKgVdXfDyy3DddbB6NWSqWYqIiIiIyMQoV4JnZVpHpriBoQF++ldf4/pPf5E3/Mf3Lz7/8X95Gy996nf5mff9P8yvn1/BCEVEREREpi/dwZOyGRga4CMfauJnf/P3ODDr+6xqg9kfhVVtcGDW9/nZ3/w9PvKhJgaGBiodqoiIiIjItFTWNglS2xI/HOR//MNP+P67lxH58w7+JJTz9wfpNN//9S384T8cJfHDwYru4vm+T29vL01NTWqfICIiIiLTinbwpGxmdx9lQRLObv8dll//8yy/bvml1/U/z4+2bWVhMphXKfF4nGg0SktLC9FolHg8XrFYRERERETKTQmelM2MwQQA55csGnX8XOZ5dt5k832fLVu2kE6nAUin07S3t+P7fkXiEREREREpNyV4UjYXGsMAzOw5Oer4rMzz7LzJ1tvbezG5y0qlUvT19VUkHhERERGRclOCJ2VzduUyTjfAvJ27IS+RIp2m8dHHOdUQzKuEpqYmQqGRv+Q9zyMWi1UkHhERERGRcitXgvdpoDLn7qR6eB7b1kP9oS5473vh6afhhz8Mfr73vdQf6mL7eirWDy8SidDR0YGX+X7P89i1a5cKrYiIiIjItGHOuUrHIBlmNgcYGhoaYs6cOZUOp2DPvvwst3Tcwjff+DusfHjvyD5486+n+3fex7t/+Ajf3vJtll+3vGJx+r5PX18fsVhMyZ2IiIiIVKUzZ85QX18PUO+cOzPez6lNgpRNuC5M3Yw63v3DRwhtgtUvwnVn4eXZ0BV9ifQPH6FuRh3husrcwcuKRCJK7ERERERkWlKCJ2Uzv34+x+89TmL4yqd1w3XhivbAExERERGZzpTgSVnNr5+vBE5EREREpELKXkXTzP6Lme0u97oiIiIiIiIytolok/Am4P+dgHVFRERERERkDAUf0TSzO68yZWGRsYiIiIiIiEgJirmD93XAATbGHPVekCnD9316e3tpampSdU0RERERmdKKOaL5MrDRORca7QVUrsGZSIHi8TjRaJSWlhai0SjxeLzSIYmIiIiIFK2YBO/bjJ3EXW13T6Qq+L7Pli1bSKfTAKTTadrb2/F9v8KRiYiIiIgUp5gE74+Bp8YY7wOaiwtHZPL09vZeTO6yUqkUfX19FYpIRERERKQ0Bd/B
c851XWX8R8A3i45IZJI0NTURCoVGJHme5xGLxSoYlYiIiIhI8SaiTYLIlBCJROjo6MDzPCBI7nbt2qVCKyIiIiIyZZlz4y94aWY3At91zqWvOjmY/w7ghHPup0XGV1PMbA4wNDQ0xJw5cyodTs3wfZ++vj5isZiSOxERERGpCmfOnKG+vh6g3jl3ZryfKzTBSwHznHM/GOf8M8DNzrlT4/6SGqYET0REREREoPgEr9A7eAZ81syGxzn/mgLXFxERERERkSIVmuAdBhYXMP9p4FyB3yHTyMDQAInhBKRSzO4+yozBBBcaw5xduQw8j3BdmPn18ysdpoiIiIjItFBQguecWztBcVxkZh8Cfg+4Dvh34MNjVe40s43AZ4FFwEngD5xzf5MzbsCngC3AXKAbuNc59+85c+YCO4E7M4/+FrjPOZfMjC8G/hfwdqAe+D7wF8CnnXMXMnPeAXwGuAWIAr/rnHu4pH8ZU9zA0ABLH1tK67FhdhyEBclLY6cbYNt6OHBTHcfvPa4kT0RERESkDKqqiqaZ3Q08DPwhsAzoAv7RzEb907+ZvQv4KvAEcFPm59fMbGXOtAeA+4HfBlYArwD/ZGZvzJnzF8DNwIbM6+bMWlkXgC8D6wl2MD8MfAD4dM6cOuAU8PuZ76h5ieEErceG2bfXmLtiDT379/Bcbxc9+/cwd8Ua9u01Wo8NBzt8IiIiIiJSsoKKrEw0M+sGnnXO/VbOs+PA151zHx1l/leBOc65/5rz7Engdefcr2Z2774PPOyc+0Jm/A3AIPAR59wuM1sKvACscs51Z+asIjheusQ5d+IKsT4ErHDOrR5lrD/znQXt4E23IivP+keYe8OtzF2xhoYnOyGU8/cJ6TTJDc28duQwyee/xfLIisoFmsf3fXp7e2lqalJVTRERERGpiGKLrFTNDp6ZXUNwvPFg3tBB4Beu8LF3jTL/QM78BcC83DnOuR8TNGLPznkXMJRN7jJz/g0YutL3mlmMYKevpIbuZvYGM5uTfQFvvOqHppDZ3UdZkIRXtm4emdwBhEIM3reJhclgXrWIx+NEo1FaWlqIRqPE4/FKhyQiIiIiMm5Vk+ABYcAj2F3LNUiQpI1m3lXmz8t5NtacV0dZ+9X87zWzp8zsPNBLcHz0k1eIa7w+SpBIZl9+ietVlRmDwdHL80sWjTp+LvM8O6/SfN9ny5YtpNNBm8d0Ok17ezu+P63+bxERERGRaayaErys/DOjNsqzQudfbc5o64+2zt3AcuDXgF8Cto8R13h8nqBoS/Y1rc4DXmgMAzCz5+So47Myz7PzKq23t/dicpeVSqXo6+urUEQiIiIiIoUptE3CCGZ2O3A78FbykkXn3OYCl0sAKS7frXsrl+/AZb1ylfnZYifzgJfHmNM4ytpvyf9e59x/ZN6+YGYe0GFmO5xzqSvEN6bMcdEfZ/85uDI4fZxduYzTDTBv5264457L7uA1Pvo4pxqCedWgqamJUCg0IsnzPI9YLFbBqERERERExq/oHTwz+xTB3bbbCY5Xzs17FcQ59xPg28B78obeAzx1hY89Pcr89TnzTxMkcBfnZO76vTtnztNAvZndmjNnJcGO2pW+F4IdvhmZnzIaz2Pbeqg/1EVyQzMnMlU0T+zfQ3JDM/WHuti+PphXDSKRCB0dHXiZeDzPY9euXSq0IiIiIiJTRik7eB8EfsM598RVZ47fQ8ATZvYMQeK1BZhP0IMOM/sy8FJORc1HgMNm9hHgG8B/A9YBtwE455yZPQx8zMx6Ce7OfQwYJmiNgHPueKby5p+aWXtm3Q7g77IVNM3s1wlaJTxPsON2C8Hxyq86536amXMNQZ88gGuA683sZuCsc64mz/iF68IcuKmOjQyz4+BhFt95+OLYqQbYfFfQB+/huuo4ognQ1tZGa2srfX19xGIxJXciIiIiMqUU3SbBzP4TuNU5N/oFq2IDChqdP0DQ6Py7BA3DD2fG/gXod879Rs789wGfAxZyqdH5X+eMZxudtzOy0fl3c+a8icsbnf92TqPzuzMx/RzBjt2LwFeALzrnzmfm/CzBjmG+b463Qfx0a5MAQbPzxHACUilmdx9lxmCCC43h4Fim5xGuC6vJuYiIiIhInmLbJJSS4H2BYHfqs0UtIJeZjgmeiIiIiIgUrtgEr6Ajmpnm3lkhYIuZrQO+Q3CE8SLn3P2FrC0iIiIiIiKlKfQOXn65w+cyP9+Z97y4bUEREREREREpWkEJnnOueaICERERERERkdKU0iZhvl2hcZuZqWqGTBu+79PZ2Ynv+5UORURERERkTEUneAQVI9+S/9DM3szo1SRFppx4PE40GqWlpYVoNEo8Hq90SCIiIiIiV1RKFc000Oic+0He8yjwgnPu2jLEV1NURbO6+L5PNBolnU5ffOZ5Hv39/eqPJyIiIiITalKqaMKISpoO+KyZDecMe8BKLhVfEZmyent7RyR3AKlUir6+PiV4IiIiIlKVCk7wuFRJ04AbgJ/kjP0EOAY8WGJcIhXX1NREKBS6bAcvFotVMCoRERERkSsrOMHLVtI0s8eBrc65H5Y9KpEqEIlE6OjooL29nVQqhed57Nq1S7t3IiIiIlK1SrmD90VG73fngPNAH/AN59xrxYdXW3QHrzr5vk9fXx+xWEzJnYiIiIhMimLv4JWS4HUCywnu3Z0gOLLZBKSAHmAxQbJ3m3PuhaK+pMYowRMREREREZjEIis5vgG8BmzKfmEmQYkD/wr8KfAXwBeB1hK+R6aZgaEBEsMJSKWY3X2UGYMJLjSGObtyGXge4bow8+vVSlFEREREpFCl7OC9BLwnf3fOzN4BHHTOXW9myzPvw6WHOv3Vwg7ewNAASx9bSuuxYXYchAXJS2OnG2DbejhwUx3H7z2uJE9EREREalaxO3ilNDqvB946yvO3ANnsJAlcU8J3yDSTGE7QemyYfXuNuSvW0LN/D8/1dtGzfw9zV6xh316j9dhwsMMnIiIiIiIFKfWI5m4z2wYcIbhvdytBi4SvZ+bcCnyvpAhlekml2HEQhtatpuHJThpCmb9jiN0Gd9xDckMzDx48TDKVqmycY/B9n97eXpqamlR0RURERESqSik7eO3A/wH+N/AiMJB5/3+AD2bm9AC/WUqAMr3M7j7KgiS8snUzhPJ++YVCDN63iYXJYF41isfjRKNRWlpaiEajxOPxSockIiIiInJR0Qmec+6sc+4DwJsJmp8vB97snNvinPtRZs5zzrnnyhOqTAczBoOjl+eXLBp1/FzmeXZeNfF9ny1btlxsfJ5Op2lvb8f3/QpHJiIiIiISKGUHD7iY6H3HOXfMOXe2HEHJ9HWhMai3M7Pn5KjjszLPs/OqSW9v78XkLiuVStHX11ehiERERERERirlDh5mdjtwO0GxlRHJonNucylry/R0duUyTjfAvJ274Y57Rh7TTKdpfPRxTjUE86pNU1MToVBoRJLneR6xWKyCUYmIiIiIXFL0Dp6ZfQo4SJDghYG5eS+Ry3ke29ZD/aEukhuaOZGponli/x6SG5qpP9TF9vXBvGoTiUTo6Oj
Ay8TmeR67du1SoRURERERqRql9MF7GXjAOfdEeUOqXbXeB+9UA2yfAn3wfN+nr6+PWCym5E5EREREJkSxffBKSfD+E7jVOTf6ZSopWC0keBAkeYnhBKRSzO4+yozBBBcaw8GxTM8jXBeu2uRORERERGQyVCLB+wJw1jn32aIWkMvUSoInIiIiIiJjKzbBK6XIykxgi5mtA74DXMgddM7dX8LaIiIiIiIiUqBSErwbgWyPu3fmjRW3LSgiIiIiIiJFKzrBc841lzMQkanM9316e3tpampS4RURERERqZiSG52L1Lp4PE40GqWlpYVoNEo8Hq90SCIiIiJSo4ousgJgZquBdmAR8D7n3Etmdg9w2jn3r2WKsWaoyMrU4/s+0Wj0subn/f392skTERERkaIVW2SllEbnG4EDwDlgGfCGzNAbgY8Vu67IVNLb2zsiuQNIpVL09fVVKCIRERERqWWlHNH8OPBB59wHGFlB8ylgeUlRiUwRTU1NhEIjfxt5nkcsFqtQRCIiIiJSy0pJ8BYDh0d5fgZoKGFdkSkjEonQ0dGB53lAkNzt2rVLxzNFREREpCJKaZPwMhAD+vOe3wacKmFdkSmlra2N1tZW+vr6iMViSu5EREREpGJKSfB2AY+Y2WaCvndvM7N3AQ8CnylHcFIbBoYGSAwnIJVidvdRZgwmuNAY5uzKZeB5hOvCzK+fX+kwxxSJRJTYiYiIiEjFldIH74/MrB7oBGYSHNf8MfCgc+5LZYpPprmBoQGWPraU1mPD7DgIC5KXxk43wLb1cOCmOo7fe7zqkzwRERERkUorqQ+ec+4PgDBwK7AKeItz7hPlCExqQ2I4QeuxYfbtNeauWEPP/j0819tFz/49zF2xhn17jdZjw8EOn4iIiIiIjKmUI5oAOOeGgWfKEIvUolSKHQdhaN1qGp7spCFbkTJ2G9xxD8kNzTx48DDJVKqycRbI9316e3tpamrS0U0RERERmTQFJXhm9tB45zrn7i88HKk1s7uPsiAJPVs3X0ruskIhBu/bxOI7D/O97qMQWVGZIAsUj8fZsmUL6XSaUChER0cHbW1tlQ5LRERERGpAoTt4y8Y5zxUaiNSmGYPB0cvzSxaNOn4u8zw7r9r5vn8xuQNIp9O0t7fT2tqqnTwRERERmXAFJXjOueaJCkRq04XGMAAze04GxzLzzOo5OWJetevt7b2Y3GWlUin6+vqU4ImIiIjIhCupyIqZrTazr5jZU2Z2febZPWZ2+Z/URUZxduUyTjfAvJ27IS8xIp2m8dHHOdUQzJsKmpqaCOUdNfU8j1gsVqGIRERERKSWFJ3gmdlG4ABwDlgOvCEz9EbgY6WHJjXB89i2HuoPdZHc0MyJTBXNE/v3kNzQTP2hLravD+ZNBZFIhI6ODrxMvJ7nsWvXLu3eiYiIiMikMOeKuy5nZkeBLzrnvmxmPwRucs6dMrObgSedc/PKGWgtMLM5wNDQ0BBz5sypdDiTYqw+eKcaYPsU7YPn+z59fX3EYjEldyIiIiJSsDNnzlBfXw9Q75w7M97PlZLgDQNvd8715yV4C4EXnHMzi1q4htViggdBkpcYTkAqxezuo8wYTHChMRwcy/Q8wnXhKZXciYiIiIiUqtgEr5Q+eC8DMaA/7/ltwKkS1pUaM79+/qUEboq0QhARERERqUalFFnZBTxiZisJ2iK8zcx+HXgQ+JNyBCciIiIiIiLjV/QOnnPuj8ysHugEZgKHgR8DDzrnvlSm+ESmBd/36e3tpampSXfyRERERGTClNQmwTn3B0AYuBVYBbzFOfeJcgQmMl3E43Gi0SgtLS1Eo1Hi8XilQxIRERGRaaqUIiuPA18B/tkVu4iMUKtFVqYz3/eJRqMjmp97nkd/f7928kRERETkiootslLKDt6bgb8HfDPbkWmPICI5ent7RyR3AKlUir6+vgpFJCIiIiLTWdEJnnPuTmAe8GngFuDbZvaCmX3MzH62POGJTG1NTU2EQiN/m3meRywWq1BEIiIiIjKdlXoHL+mc63DOrQWiwOPAPYC2J0SASCRCR0cHnucBQXK3a9cuHc8UERERkQlR9B28EYuYzQB+CXh/5udrzrnrS164xugO3vTl+z59fX3EYjEldyIiIiJyVZVodI6ZNQO/BmwEPOCvgf8L+OdS1pXaNjA0QGI4AakUs7uPMmMwwYXGMGdXLgPPI1wXvtQYfYqIRCJK7ERERERkwhWd4JmZT1Bo5QDQDux3zp0vV2BSmwaGBlj62FJajw2z4yAsSF4aO90A29bDgZvqOH7v8SmX5ImIiIiITLRSdvA+A+x1zr1ermBEEsMJWo8Ns2+vMbRuNT1bN3N+ySJm9pxk3s7d7NvbxUaGSQwnpnSCp8bnIiIiIjIRik7wnHMd5QxEBIBUih0HYWjdahqe7KQhW4EydhvccQ/JDc08ePAwyVSqsnGWIB6Ps2XLFtLpNKFQiI6ODtra2iodloiIiIhMAyVV0QQws7eb2QYzuzP3VY7gpPbM7j7KgiS8snUz5LUXIBRi8L5NLEwG86Yi3/cvJncA6XSa9vZ2fN+vcGQiIiIiMh2UcgdvIfA3wA2AAywzlC3L6ZUWmtSiGYMJAM4vWTTq+LnM8+y8qWasxuc6qikiIiIipSplB+8R4DTQCAwD7wDWAM8Aa0uOTGrShcYwADN7To46PivzPDtvqlHjcxERERGZSKUkeO8CPumc+wGQBtLOuX8FPgrsLEdwUnvOrlzG6QaYt3M35O10kU7T+OjjnGoI5k1FanwuIiIiIhOplCqaHnA28z4BvA04AbwILC4xLqlVnse29bBvbxfJDc0M3reJc0sWMavnJI2PPk79oS423wUf96buCeC2tjZaW1vV+FxEREREyq6UBO+7wI3AKaAbeMDMfgJsyTwTKVi4LsyBm+rYyDA7Dh5m8Z2HL46daoDNdwV98B6um5pHNLPU+FxEREREJoI5564+a7QPmrUC1zrn/jpTcOXvgCXAfwJ3O+f+uXxh1gYzmwMMDQ0NMWfOnEqHUzEDQwMkhhOQSjG7+ygzBhNcaAwHxzI9j3BdeEr3wBuN+uKJiIiISK4zZ85QX18PUO+cOzPezxWd4I26mNmbgNddORetIUrwapP64omIiIhIvklL8Mws5pzrKzA+GQcleLXH932i0eiI1gme59Hf36+dPBEREZEaVmyCV0wVze+Z2X+Y2ZfNbJOZ/WwRa4gIY/fFExEREREpVDFFVt6dea0FvgTMNLMB4J+BTqDTOfdS2SIUmcayffHyd/DUF09EREREilHwDp5zrss59znn3DqgAWgGHgcWAB3AgJmdKG+YItOT+uKJiIiISDmVpciKmc0CbgNagQ8As51zU7dRWYXoDl7t8n1fffFERERE5KJi7+AV1QfPzGYCv0Cwe7cWWAGcBr4J/Fbmp4iMk/riiYiIiEg5FJzgmdk3CRK6k8Bh4FHgm865wTLHJgKoL54SPxEREREZr2J28H4BeJmgoMq/AIedc4lyBiWSNTA0wNLHltJ6bJgdB2FB8tLY6QbYth4O3FTH8XuPT5skT3
3xRERERKRYxbRJaAC2AMPAR4CXzOx5M/uSmb3PzN5S1gilpiWGE7QeG2bfXmPuijX07N/Dc71d9Ozfw9wVa9i312g9Nhzs8E0Dvu9fTO4A0uk07e3t+L5f4chEREREZCooeAfPOfcj4MnMCzN7I0GBlWbgAeDPzazXOffOcgYqNSqVYsdBGFq3moYnO2kIZf5OInYb3HEPyQ3NPHjwMMlUqrJxlslYffF0VFNERERErqaYHbx8PwJey7xeB34KLC3DuiLM7j7KgiS8snUzhPJ+uYZCDN63iYXJYN50kO2Ll0t98URERERkvApO8MwsZGa3mtkDZvaPQBJ4CvgQ8ApwL7CwvGFKrZoxGBy9PL9k0ajj5zLPs/OmOvXFExEREZFSFFNkJQlcS1Bo5V+A+4FO59zJMsYlAsCFxjAAM3tOBscy88zqOTli3nTQ1tZGa2ur+uKJiIiISMGKSfB+jyCh+165gxHJd3blMk43wLydu+GOe0Ye00ynaXz0cU41BPOmE/XFExEREZFiFHxE0zm3S8mdTBrPY9t6qD/URXJDMycyVTRP7N9DckMz9Ye62L4+mDfd+b5PZ2enKmqKiIiIyBUVs4MnMmnCdWEO3FTHRobZcfAwi+88fHHsVANsvivog/dw3fQ5ojka9cYTERERkfEw51ylY5AMM5sDDA0NDTFnzpxKh1M1BoYGgj53qRSzu48yYzDBhcZwcCzT8wjXhadNk/PR+L5PNBod0T7B8zz6+/t1jFNERERkmjpz5gz19fUA9c65M+P9nHbwpOrNr59/KYGLrKhsMBWg3ngiIiIiMl7l6IMnIhNIvfFEREREZLxK2sEzs9uB24G3kpcsOuc2l7K2iASyvfHa29tJpVLqjSciIiIiV1T0HTwz+xTwSeAZgp54IxZyzv1yydHVGN3Bk7H4vq/eeCIiIiI1ohJ38D4I/IZz7okS1hCRccrvjef7Pr29vTQ1NSnhExERERGgtDt41wBPlSsQERm/eDxONBqlpaWFaDRKPB6vdEgiIiIiUgVKOaL5BeCsc+6z5Q2pdumIpoyH2iaIiIiITH+VOKI5E9hiZuuA7wAXcgedc/eXsLbIFdV6Xzy1TRARERGRKyklwbsReC7z/p15Y+qeLhNiYGiApY8tpfXYMDsOwoLkpbHTDbBtPRy4qY7j9x6ftkletm1C/g6e2iaIiIiISNEJnnOuuZyBiIxHYjhB67Fh9u01htatpmfrZs4vWcTMnpPM27mbfXu72MgwieHEtE3w1DZBRERERK6k6Dt4Un66g3d1z/pHmHvDrcxdsYaGJzshtwF4Ok1yQzOvHTlM8vlvsTyyonKBTgK1TRARERGZviblDp6ZPQR8wjn3o8z7K9IdPJkIs7uPsiAJPVs30xDKKwIbCjF43yYW33mY73UfhWme4KltgoiIiIjkK/SI5jJgRs77K9G2oEyIGYMJAM4vWTTq+LnM8+y8WhGPx9myZQvpdJpQKERHRwdtbW2VDktEREREJllBCV7uvTvdwZNKuNAYBmBmz0mI3XbZ+KyekyPm1QLf9y8mdwDpdJr29nZaW1u1kyciIiJSY0ppdC4y6c6uXMbpBpi3czfktQognabx0cc51RDMqxVjtU0QERERkdpSlQmemX3IzE6b2Xkz+7aZrb7K/I1m9oKZ/Tjz85fzxs3M/ruZfd/MzpnZv5jZO/LmzDWzJ8xsKPN6wswacsYXm1mnmQ1m4jplZp8zsxl564wZi5TI89i2HuoPdZHc0MyJ/Xt4rreLE/v3kNzQTP2hLravD+bVimzbhFxqmyAiIiJSm6ouwTOzu4GHgT8kuOfXBfyjmY1a897M3gV8FXgCuCnz82tmtjJn2gPA/cBvAyuAV4B/MrM35sz5C+BmYEPmdXNmrawLwJeB9cBi4MPAB4BPFxiLlCBcF+bATXVsvMvx+pHDLL5zEzf/3BoW37mJ144cZuNdjgM31RGuq50jmtm2CV4mqVXbBBEREZHaVXVtEsysG3jWOfdbOc+OA193zn10lPlfBeY45/5rzrMngdedc79qZgZ8H3jYOfeFzPgbgEHgI865XWa2FHgBWOWc687MWQU8DSxxzp24QqwPASucc6vHE8s4/rerTcI4DAwNkBhOQCrF7O6jzBhMcKExHBzL9DzCdeFp2wNvLGqbICIiIjJ9TEqbhIlmZtcAtwD/M2/oIPALV/jYu4Av5j07QLDDBrAAmJdZAwDn3I/N7JuZNXdl1hjKJneZOf9mZkOZOZcleGYWI9jp++sCYpEymF8//1ICN81bIRRCbRNEREREpKQjmma22sy+YmZPm9n1mWf3mNnl5Q3HJwx4BLtruQYJkrTRcslo6QAAIABJREFUzLvK/Hk5z8aa8+ooa7+a/71m9pSZnQd6CY6PfrKAWEYwszeY2ZzsC3jjaPNEChWPx4lGo7S0tBCNRonH45UOSUREREQmQdEJnpltJNidOkdwV+4NmaE3Ah8rMa78c6M2yrNC519tzmjrj7bO3cBy4NeAXwK2FxFL1keBoZyXf4V5IuN2pbYJvq9fXiIiIiLTXSk7eB8HPuic+wBBAZKspwgSoGIkgBSX73i9lct3xrJeucr8VzI/rzancZS135L/vc65/3DOveCc+0vg94H/bmbZko1XiyXf54H6nJfO0UnJ1DZBREREpHaVkuAtBg6P8vwM0DDK86tyzv0E+Dbwnryh9xAkjqN5epT563PmnyZIvC7Oydz1e3fOnKeBejO7NWfOSoKk60rfC8Hu3IzMz/HEMoJz7sfOuTPZF/DDMb5LZFzUNkFERESkdpVSZOVlIAb05z2/DThVwroPAU+Y2TMECdMWYD7wvwDM7MvASzkVNR8BDpvZR4BvAP8NWJeJA+ecM7OHgY+ZWS/B3bmPAcMErRFwzh3PVLv8UzNrz6zbAfxdtoKmmf06wU7l88CPCYrBfB74qnPup+OJRSaGqmqOlG2b0N7eTiqVUtsEERERkRpSSoK3C3jEzDYT3DF7W6YP3IPAZ4pd1Dn3VTN7M0HxkuuA7wJ3OOdezEyZD6Rz5j9lZr8CfA74LHASuDu3IibwR8As4E+AuUA3sN45l7tj9uvATi5V2/xbgr55WT8FPgL8HMGO3YvAY+RUzRxnLFJGA0MDLH1sKa3HhtlxEBYkL42dboBt6+HATXUcv/d4TSV5bW1ttLa2jmiboKqaIiIiItNfSX3wzOwPgd8FZmYe/Rh40Dn3iTLEVnPUB69wz778LJ/78C3s22sMrVvNK1s3c37JImb2nGTezt3UH+pi412Ojz/8bZZfV+zV0KkvHo9fLLwSCoXo6Oigra2t0mGJiIiIyBUU2wev5EbnZlYHvJ3gPt8LzrmzJS1Yw5TgFe5Z/whzb7iVuSvW0PBkJ+TePUunSW5o5rUjh0k+/y2W12jPPN/3iUajIwqveJ5Hf3+/dvJEREREqtSkNzo3s4eu8NwB54E+4BvOudeK/Q6Rq5ndfZQFSejZupmGvMIihEIM3reJxXce5nvdR2u2KfpYVTWV4ImIiIhML6XcwVtG0A7BA04Q3EtrImhz0
AN8CNhhZrc5514oNVCR0cwYTABwfsmiUcfPZZ5n59WibFXN/B08VdUUERERmX5KaZPwDeAQ8Dbn3C3OueXA9cA/AX+ZeX+YnCIkIuV2oTEMwMyek6OOz8o8z86rRdmqmp4XtGtUVU0RERGR6avoO3hm9hLwnvzdOTN7B3DQOXe9mS3PvK/dP10XQHfwCqc7eOPn+76qaoqIiIhMEcXewStlB68eeOsoz98CZLOTJHBNCd8hMjbPY9t6qD/URXJDMyf27+G53i5O7N9DckMz9Ye62L4+mFfrIpEIa9euJRKJEI/HiUajtLS0EI1GicfjlQ5PRERERMqglB28PwfeBWwDjhD0wruVoA/eU865ezI94bY7536+TPFOa9rBK9xYffBONcD2Gu2DNxZV1RQRERGpfpNeRRNoJ7hf979z1vkp8P8R9MaDoNjKb5bwHSJjml8/n+P3HicxnOD1VIoL3UeZMZjgQmOYsyuX8XHP4+G6sJK7HKqqKSIiIjJ9laMP3mxgIUEVzZPqg1c87eDJZNAOnoiIiEj1q8QdPACcc2edc99xzh1TcidS/VRVU0RERGT6KmkHz8xuB24nKLYyIll0zm0uLbTaox08mUz5VTWzz1RZU0RERKTyJn0Hz8w+BRwkSPDCwNy8l4hUsdyqmoAqa4qIiIhMA6VU0XwZeMA590R5Q6pd2sErn4GhARLDCUilmJ1XeAXPI6zCKyPoXp6IiIhIdalEFc1rgKdK+LzIhBirdcLpBtim1gmXUWVNERERkemhlCIrfwb8WrkCESmXxHCC1mPD7NtrzF2xhp5M8/Oe/XuYu2IN+/YarceGgx0+AaCpqYlQaOR/DjzPIxaLVSgiERERESlGKTt4M4EtZrYO+A5wIXfQOXd/KYGJFC2VYsdBGFq3moYnO2nIJi6x2+COe0huaObBg4dJplKVjbOKZCtrtre3k0qlLlbWBOjs7FTRFREREZEpopQdvBuB54A08E5gWc7r5tJDEynO7O6jLEjCK1s3Q96uFKEQg/dtYmEymCeXtLW10d/fT2dnJ/39/QAquiIiIiIyxRS9g+ecay5nICLlMmMwOHp5fsmiUcfPZZ5n58klkUiESCSC7/ts2bLl4r28dDpNe3s7ra2t2skTERERqWIlNzoXqTYXGsMAzOw5Oer4rMzz7Dy53FhFV0RERESkepVyBw8AM3s7MJ+gquZFzrm/LXVtkWKcXbmM0w0wb+duuOOekcc002kaH32cUw3BPBldtuhKftsEFV0RERERqW6lNDpfaGbHgO8Cfw98PfP6m8xLpDI8j23rof5QF8kNzZzIVNE8sX8PyQ3N1B/qYvv6YJ6MLlt0xcv8O8oWXdHxTBEREZHqVkqj8/1ACvgAcAq4FXgzsAPY7pzrKleQtUKNzstjrD54pxpgu/rgjZvv+/T19RGLxS7ezevt7VVVTREREZEJVmyj81ISvATQ4pz7jpkNAbc6506YWQuwwzmn828FUoJXPgNDA0Gfu1SK2d1HmTGY4EJjODiW6XmE68JK7goUj8cvFl4JhUJ0dHTQ1tZW6bBEREREpqVKJHivA7c4506Z2UngN51znWa2CHjeOVdX1MI1TAmeVCvf94lGo5fdyevv79dOnoiIiMgEKDbBK6WK5ncJeuEBdAMPmNkvAp8kOLIpItOEqmqKiIiITA2lJHify/n8x4Eo0AXcAWwtMS4RqSLZqpq5PM/j2muvpbOzE9/3KxSZiIiIiOQqpdH5gZz3p4C3m9mbgNddsec+RSaI7uSVJltVs729nVQqhed5vP/972fVqlW6kyciIiJSRYq+gyflpzt4E2OsqpqnG2CbqmqOW7aq5rXXXnsxucvSnTwRERGR8in2Dl5Jjc7N7HbgduCt5B33dM5tLmVtkXJJDCdoPTbMvr3G0LrV9GzdzPkli5jZc5J5O3ezb28XGxkmMZxQgncVkUiESCRCZ2fnFe/kKcETERERqZyiEzwz+xRBQZVngJcBbQVKdUql2HEQhtatpuHJThqyd8lit8Ed95Dc0MyDBw+TTKUqG+cUkr2Tl7+DF4vFKhiViIiIiJRSZOWDwG8451Y6597rnPvl3Fe5AhQp1ezuoyxIwitbN0NeoRBCIQbv28TCZDBPxid7J8/zPCBI7nbt2gWgoisiIiIiFVRKgncN8FS5AhGZKDMGEwCcX7Jo1PFzmefZeTI+bW1t9Pf309nZSX9/PwDRaJSWlhai0SjxeLyyAYqIiIjUoFISvD8Dfq1cgYhMlAuNYQBm9pwcdXxW5nl2noxfJBJh7dq1AGzZsuXikc10Ok17e7t28kREREQmWUF38MzsoZx/DAFbzGwd8B3gQu5c59z9pYcnUrqzK5dxugHm7dwNd9wz8phmOk3jo49zqiGYJ8UZqxG6iq6IiIiITJ5Ci6zk/wn4uczPd+Y9V8EVqR6ex7b1sG9vF8kNzQzet4lzSxYxq+ckjY8+Tv2hLjbfBR/P3CeTwqnoioiIiEh1KCjBc841T1QgIhMlXBfmwE11bGSYHQcPs/jOwxfHTjXA5ruCPngP1+mIZrFGa4SeW3SlqalJO3kiIiIik6DgRudm1gJ8CViV33DPzOoJCq980DnXVbYoa4QanU+cgaEBEsMJSKWY3X2UGYMJLjSGg2OZnke4LqweeGWQbYQei8U4cODAxXt5oVCIjo4O2traKh2iiIiIyJRQbKPzYhK8vwU6nXNfvML4VqBZrRIKpwRPpgvf94lGo5cd2ezv79dOnoiIiMg4FJvgFVNF8ybgyTHGDwK3FLGuiEwTYxVdEREREZGJU2iRFYBG8ipm5vkp8JbiwhGZHDqyObHGKrri+z69vb26lyciIiIyAYpJ8F4CbgCu9FfxNwIvFx2RyAQbGBpg6WNLaT02zI6DsCB5aex0A2xbHxRdOX7vcSV5RbpS0RXdyxMRERGZWMXcwXsUWAuscM6dzxubBXyL4I7e1nIFWSt0B29yPPvys3zuw7ewb68xtG41r2zdzPkli5jZc5J5O3dTf6iLjXc5Pv7wt1l+3fJKhzul5RZdAXQvT0RERGScir2DV8wO3ueA/xv4npl9CThB0PduKXAv4AF/WMS6IpMjlWLHQRhat5qGJztpyDY+j90Gd9xDckMzDx48TDKVqmyc00AkErmYvHV2dqoZuoiIiMgEK7jIinNuEPgF4LvA54G/Ab4O/I/Ms1/MzBGpSrO7j7IgCa9s3QyhvN8CoRCD921iYTKYJ+WTvZeXy/M8rr32Wjo7O/F9v0KRiYiIiEwfxVTRxDn3onPuDiAMrARWAWHn3B3Ouf4yxidSdjMGEwCcX7Jo1PFzmefZeVIe2Xt5nucBQXL3/ve/n1WrVtHS0kI0GiUej1c4ShEREZGpragEL8s597pz7ohz7lvOudfLFZTIRLrQGAZgZs/JUcdnZZ5n50n5tLW10d/fT2dnJ08//TRPPPHExWOb6XSa9vZ27eSJiIiIlKCkBE9kKjq7chmnG2Dezt2QdyeMdJrGRx/nVEMwT8ovEomwdu1azp49q155IiIiImWmBE9qj+exbT3UH+oi
uaGZE/v38FxvFyf27yG5oZn6Q11sXx/Mk4mjO3kiIiIi5acET2pOuC7MgZvq2HiX4/Ujh1l85yZu/rk1LL5zE68dOczGuxwHbqojXKcjmhNJd/JEREREyq/gPngycdQHb/IMDA2QGE5AKsXs7qPMGExwoTEcHMv0PMJ1YTU5nyTZXnnXXnstq1atUp88ERERESa3D57IlDe/fv6lBC6yorLB1Lhsr7wr9cl7+umnCYfDNDU1KdETERERuYqCEjwze2i8c51z9xcejojUquydvNwkz8z4lV/5FdLpNKFQiI6ODtra2ioYpYiIiEh1K+iIppl1jnOqc861FBdS7dIRzcrRkc3qEI/HaW9vJ5VKEQqFcM6R+98oHdkUERGRWlHsEU3dwasiSvAqY2BogKWPLaX12DA7DsKC5KWx0w2wbT0cuKmO4/ceV5I3CbJ38l599VXuvvvuy8Y7OztZu3bt5AcmIiIiMokqdgfPzN4OzAeuyXnsnHP7S11bZDIkhhO0Hhtm315jaN1qerZu5vySRczsOcm8nbvZt7eLjQyTGE4owZsE2Tt5vu9fdmQzt42C7uSJiIiIXK7oBM/MFgJ/A9wAOMAyQ9ktQTURk6khlWLHQRhat5qGJztpyPZmi90Gd9xDckMzDx48TDKVqmycNSbbRiF7ZDO3jYLu5ImIiIiMrpQ+eI8Ap4FGYBh4B7AGeAZYW3JkIpNkdvdRFiThla2bIa/xNqEQg/dtYmEymCeTq62tjf7+fjo7O3n66ad54oknLu7opdNp2tvb1RBdREREJEcpCd67gE86534ApIG0c+5fgY8CO8sRnMhkmDGYAOD8kkWjjp/LPM/Ok8kViURYu3YtZ8+evWIbhc7OTiV6IiIiIpSW4HnA2cz7BPC2zPsXgcWlBCUymS40hgGY2XNy1PFZmefZeVIZ2TYKubJtFFpaWohGo8Tj8QpFJyIiIlIdSknwvgvcmHnfDTxgZr8IfBI4VWpgIpPl7MplnG6AeTt3Q94OEek0jY8+zqmGYJ5UTvZOnucF13uzyZ6ObIqIiIhcUkqC97mcz38ciAJdwB3A1hLjEpk8nse29VB/qIvkhmZO7N/Dc71dnNi/h+SGZuoPdbF9fTBPKiv3Tt5f/uVfkt/mJZVK0dfXh+/7OrYpIiIiNanoKprOuQM5708BbzezNwGvOzXXkykkXBfmwE11bGSYHQcPs/jOwxfHTjXA5ruCPngP1+mIZjW4WhuFZ555httvv12VNkVERKQmqdF5FVGj88oZGBogMZyAVIrZ3UeZMZjgQmM4OJbpeYTrwuqBV4Xi8fiINgqf//zn+f3f//3Lkr7+/n71zBMREZEppdhG5yUleGZ2O3A78Fbyjns65zYXvXCNUoInUjjf9+nr6yMWi9Hb20tLS8tlc772ta8RDofVHF1ERESmjGITvFIanX+KoKDKM8DLXGpwLjLlaUdv6sge2czKP7aZrbSpI5siIiJSC4rewTOzl4EHnHNPlDek2qUdvOowMDTA0seW0npsmB0HYUHy0tjpBti2PriTd/ze40ryqlDusc1QKIRzbkQxFh3ZFBERkamg2B28UqpoXgM8VcLnRapSYjhB67Fh9u015q5YQ0+mqmbP/j3MXbGGfXuN1mPDwQ6fVJ3xVNpUc3QRERGZrkrZwfsCcNY599nyhlS7tINXHZ71jzD3hluZu2INDU92Qm5z7XSa5IZmXjtymOTz32J5ZEXlApWr8n2faDR62ZFNM9ORTREREalqldjBmwncb2bfNLNHzeyh3FcJ64pU1OzuoyxIwitbN49M7gBCIQbv28TCZDBPqpuao4uIiEitKbrICnAj8Fzm/TvzxlRwRaasGYPB0cvzSxaNOn4u8zw7T6pbW1sbra2t9PX18eqrr3L33XePGM8e2VSVTREREZkOSml03lzOQESqxYXGoKH5zJ6TELvtsvFZPSdHzJPqN1ZzdFXZFBERkemklCOamFmDmW0zsz8zsz81s981s/pyBSdSCWdXLuN0A8zbuRtyEgEA0mkaH32cUw3BPJlaxntk88iRIyrCIiIiIlNS0Qmemf08cBL4XeBNQBi4HzhpZsvLE55IBXge29ZD/aEukhuaOZGponli/x6SG5qpP9TF9vXBPJl6xlNlc9WqVbS0tBCNRonH4xWKVERERKRwpVTR7AL6gA84536aefYzwJ8BC51za8oWZY1QFc3qMFYfvFMNsF198KaN0aps5lPfPBEREamEYqtolpLgnQOWOed68p6/HXjGOVdX1MI1TAle9RgYGgj63KVSzO4+yozBBBcaw8GxTM8jXBdWcjdN5DdGHy3Z+9rXvqYiLCIiIjKpKpHgDQL3OOcO5j1vBb7snGssauEapgSveinhm95836evr49rr72WVatWqW+eiIiIVFwlErydwC8D24GnCFoj3Ab8MbDPOffhohauYUrwqtNYRzZPN8A2HdmcVvJ39JxzI+7pZY9sAvT29mpXT0RERCZEJRqdbwf+Gvgy0A+8COwB/gr4SAnrilSVxHCC1mPD7NtrzF2xhp5M0ZWe/XuYu2IN+/YarceGgx0+mfLGU4TlkUceIRqNqhCLiIiIVJ2id/AuLmBWBywCDOhzzg2XI7BapB286vSsf4S5N9zK3BVraHiyE0I5fy+STpPc0MxrRw6TfP5bLI+sqFygUnajFWHJb60Awa7e008/zdmzZ7WjJyIiImVRiR08AJxzw865551z31FyJ9PR7O6jLEjCK1s3j0zuAEIhBu/bxMJkME+ml/y+eZ7ncf/9919WiEWtFURERKRa/Ewhk83sIeATzrkfZd5fkXPu/pIiE6kSMwaDo5fnlywadfxc5nl2nkwvbW1ttLa20tfXRywWA+Chhx66LMnLb5Z+4403akdPREREJl1BCR6wDJiR8/5KSjv3KVJFLjSGAZjZcxJit102Pqvn5Ih5Mv1EIpERSVpHR8eYrRWyO3qqvCkiIiKTreQ7eFI+uoNXnXQHT0YzVmuFfLqjJyIiIoWa9Dt4ZjbfzOxKY8WuK1J1PI9t66H+UBfJDc2cyFTRPLF/D8kNzdQf6mL7+mCe1I5IJMLatWtZsWLFiHt6ofx7muiOnoiIiEyeUvrgpYDrnHOv5j1/M/Cqc05/2i2QdvCq01h98E41wHb1wRO0oyciIiLlVYlG52mg0Tn3g7znUeAF59y1RS1cw5TgVa+BoYGgz10qxezuo8wYTPDKHOP0O4M/mC/4rs+8M44LjWHOrlwGnke4LqyEr0blN0sfLdnLPtcdPRERERnNpCV4OdUzfwf4UyC3NYIHrARSzrlfLGhhUYI3hYy1q3e6AbZpV6/maUdPRERESlFsgldoFU24VD3TgBuAn+SM/QQ4BjxYxLoiU0ZiOEHrsWH27TWG1q2mZ+tmzi9ZxMyek8zbuZt9e7vYyDCJ4YQSvBqVW3lTVTdFRERkspRyRPNxYKtz7oflDal2aQdv6lBlTSlUoTt6/f39APT29mpXT0REpAZNehVNoBe4K/+hmW02s4+UsK5I1ZvdfZQFSXhl6+aRyR1AKMTgfZt
YmAzmiUDhVTcfeeQRotGoKm+KiIhIQUpJ8LYAPaM8/3fggyWsK1L1ZgwmADi/ZNGo4+cyz7PzRHK1tbXR399PZ2cn//Zv/3ZZkhcKhXjooYcu7vKl02na29s5cuQInZ2d+L5fibBFRERkCiglwZsHvDzK8x8A15WwrkjVu9AYBmBmz8lRx2dlnmfnieS70o6e53ncf//9V7ynl7uj5/u+Ej4REREZoZQ7eL3Ap51zX8l7fk/m+cIyxFdTdAdv6tAdPCm37B29WCwGQDQaHfOenplhZirMIiIiMk1NZhXNrD8DHjazGf8/e3ceH1V1/3/8dRIiGJAECQQQoyBrFdlkU0FQtlqkWsQuahERbbV1q1alYOuGWDek1ZZNEdTvTwEVqQqIBBJF1gQEBcISCQgEIiQIQQjJ+f1xZ5LJZJLJhCwzk/fz8ZhHvHPP3HsTZXl7zvl8gGWu964B/gm8eAbXFQl+kZH8ZTDMn5tM9tABZP55NCc6XMTZW3cS/683iFmazO0jYbxrVkbEH8+qm+C/8qa1Fvf/oHMv4bz00kvVakFERKSWO5MZPANMAu4FznK9/RPwnLX2ycp5vNpFM3iho6w+eLti4SH1wZNKEEjlTSjZPH3IkCGqwikiIhKiqq3ReYkLGNMA6AicALZba0+e0QVrMQW80JKRk0FWbhbk59NgdSrHM3aQFRNFVud2xG1MIy4nj/oJbTjWqytERhIXHaewJxU2c+bMYjN6njN4vvhawqnAJyIiEjpqLOBVNmPM3cDDOIVavgHut9YmlzF+BPAUcBGwE/ibtfYDj/MG+DtO1c9GwGrgHmvtNx5jGgFTgOGutz4C/mytzXad7w88APQEGuK0iHjeWvu2xzWigMeAUcB5wDbgEWvtogC+dwW8EFXWjF56LPxFM3pSCTz36S1evLjMJZzetGdPREQktNTkDN7PgASKlmkCYK39qALX+jUwB7gb+BK4C7gD+Jm1NsPH+D5AMjAB+AC4AXgSuNJau9o15hHgb8BtQBowHugHtHc3aTfGfAq0xAmBANOA76y117nOjwPOBj4FMoFfAC8D11trF7rGPAfcAozFaR8xBHgJuNxaW65maAp4oStlfwpP39+d+XMNOQP7cuDe2/mpw0XU27qTZlNeJ2ZpMiNGWsZPXk+35t1q+nElTAS6hNOTmqmLiIgEt2oPeMaY1jihqhNgAeM6ZQGstQFXlzDGrAZSrLV/9HhvC/ChtfYxH+PfBRpaa3/u8d4i4Ii19reu2bt9wGRr7XOu83VxQtoj1tqpxpiOwLdAb49Q2Bv4Cuhgrd1WyrN+DGRaa293He8DnrHWvuox5kPgmLX2lnJ+/wp4IUpVNaWmBbqEE+Chhx4q7LenZZwiIiLBpaIB70z64L0CpAPxQC5wMc7M2Dqgf6AXM8acBXQHlnidWgJcXsrH+vgYv9hjfCucfn2FY1x7BFd4jOkD5LjDnWvMKiCnjPsCxACHPY7r4hSZ8XQCuLKMa0iYaLA6lVbZcODe24uHO4CICDL/PJrW2c44karg2Tx99+7dTJ8+vbC3XkREBM7/7yriq5n62LFjueCCC9RrT0REJISdScDrAzxurT0EFAAF1tovcPahTanA9eKASJzZNU+ZOCHNl2Z+xjfzeK+sMQd9XPtgafc1xtwI9ADe8Hh7MfCgMaatMSbCGDMI+CVlNH03xtQ1xjR0v4BzShsrwS0qMwuAnzpc5PP8Cdf77nEiVcHdPL1ly5ZlBr7SmqlbaxX4REREQtyZ9MGLBI65/jkLaIFTWGQ30P4Mruu9psj4eC/Q8f7G+Lq+z/u6Cq7MAsZ6FmoB7gOm4+y/szgFX94ARpfx7I/hFICREJcXHwdAva07oU3JSduzt+4sNk6kOnj21hszZgxDhgwp1kzdcwbPF+9ee2PHjlVlThERkSB3JjN4m4FLXf+8GvirMeYK4HFgVwWulwXkU3LWrCklZ+DcDvgZf8D11d+YeB/XbuJ9X2PMVcBC4EFr7WzPc9baQ9ba64H6wAVAB5wAnF7KswM8i7PU0/3S345C1LFeXUmPhWZTXgfvvzAXFBD/rzfYFeuME6kpnjN8LVu2ZNq0aWUu4/RWnhk+ERERqVlnEvCe9vj8eJxQkwxci9P8PCDW2lPAemCQ16lBwMpSPvaVj/GDPcan4wS4wjGuvX5XeYz5CogxxvT0GNMLJ3Ct9HivP/Ax8Ki1dloZ38dP1trvcWZHRwALyhh70lp71P0CfixtrAS5yEj+MhhiliaTPXQA2xbOYsP2ZLYtnEX20AHELE3mocHOOJFgEei+PW/ege+uu+5i7dq1JZZwalmniIhI9anUPnjGmHNxKlhW6KIebRL+gBO87sRpO3CxtXa3MWY28L27oqYx5nIgCacNwgKcPW9PU7JNwmM4SyW3A+NwisB4t0logdOWAZw2Cbs92iT0xwl3r1B8f+Epa+1h15heOP3vNri+/gOnyEs3dz+9cnz/qqIZosrqg7crFh5SHzwJEWX12itPZU53Tz73Ek6AO++8U8s6RUREAlStbRJcTb2XAHdZa9MCvkDZ174b+CtOcZLNwAPW2iTXueU4/elu8xh/I06oa01Ro/P3Pc67G53fRfFG55s9xpx0Ui7XAAAgAElEQVRLyUbnf/JodD4Lp4G5txXW2v6uMVcB/3E9xzHgE5zZvn0BfO8KeCEsIyeDrNwsyM+nwepUjmfsICsmiqzO7YjbmEZcTh71E9o4yzQjI4mLjlPYk6B3JoEvwlVR1nOfn6+G6wp8IiIiJdVEH7xDOE28t1foAlKCAl74KGtGLz0W/qIZPQlRZQW+QBqtuynwiYiI+FYTAe9FIM9a+2iFLiAlKOCFj5T9KTx9f3fmzzXkDOzLgXtv56cOF1Fv606aTXmdmKXJjBhpGT95Pd2ad6vpxxWpMHfgq1+/Pr179y4W8nzN4PmjwCciIuKoiYD3L+D3wA6c5ubHPc9bax+s0IVrMQW88JGydy2NOvWkUY9+xC5KLN78vKCA7KEDOLw2iexNa+jWskfNPahIJZo5c2bhjF5kZCRTp04FCHgfnycFPhERqa0qGvDOpA/eJUCK65/beZ2rvMotIiGowepUWmXD1ntvJzbCq1htRASZfx5N++FJpK1OBQU8CRPevfbc4cvzvUD38ZWnF9+YMWPYu3dvsdDnfSwiIlJbBBzwjDGtgXRr7YAqeB6RsBCVmQXATx0u8nn+hOt99ziRcOHZXN3Xe94h8EwD31133cWRI0d45JFHCkPfrbfeypw5czTrJyIitVJF+uBtx2kCDoAx5l1jjK9G4SK1Vl58HAD1tu70ef5s1/vucSK1iWfD9TPtxZefn18Y7sAJfW+++abfhuzevfnUq09ERMJFwHvwjDEFQDNr7UHX8Y9AZ2vtrip4vlpFe/DCh/bgiVRcIK0ZKlK903tfn68ZPy37FBGRmlZtRVYU8KqOAl748K6imfnn0aS3rI/55hsunr6A85I28HJv6DR2HI0HX6++eCJlKC3wRUZG8uyzz/Loo49WqEVDaSIjI5k0aVLAyz4VAEVEpDJVZ8
[... remainder of base64-encoded PNG image data omitted; it is the payload of the matplotlib figure "Comparison of Vacuum to Air Conversions" produced by the code cell that follows ...]\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "fig = plt.figure(figsize=(10, 10), dpi=100)\n", "ax = fig.add_subplot(111)\n", "p1 = ax.semilogx(wavelength[good], greisen_v2a[good], 'k.', label='Greisen')\n", "p2 = ax.semilogx(wavelength[good], ciddor_v2a[good], 'gs', label='Ciddor')\n", "p3 = ax.semilogx(wavelength[good], wcslib_v2a[good], 'ro', label='IAU')\n", "foo = p2[0].set_markeredgecolor('g')\n", "foo = p2[0].set_markerfacecolor('none')\n", "foo = p3[0].set_markeredgecolor('r')\n", "foo = p3[0].set_markerfacecolor('none')\n", "# foo = ax.set_xlim([-5, 10])\n", "# foo = ax.set_ylim([24, 15])\n", "foo = ax.set_xlabel('Vacuum Wavelength [Å]')\n", "foo = ax.set_ylabel('Fractional Change in Wavelength [$1 - \\\\lambda_{\\\\mathrm{air}}/\\\\lambda_{\\mathrm{vacuum}}$]')\n", "foo = ax.set_title('Comparison of Vacuum to Air Conversions')\n", "l = ax.legend(numpoints=1)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.5" } }, "nbformat": 4, "nbformat_minor": 2 } pydl-0.7.0/docs/credits.rst0000644000076500000240000000164513434104050016221 0ustar weaverstaff00000000000000=================== Authors and Credits =================== Thank you to everyone who has contributed to PyDL. If we have neglected to include you, please let us know right away! * `Kyle Barbary `_ * `Larry Bradley `_ * `Matt Craig `_ * `Christoph Deil `_ * `Mike DiPompeo `_ * `igoldste `_ * `Duncan Macleod `_ * `Demitri Muna `_ * `Thomas Robitaille `_ * `José Sánchez-Gallego `_ * `Brigitta Sipocz `_ * `Erik Tollerud `_ * `ViviCoder `_ * `Benjamin Alan Weaver `_ * `Kyle Westfall `_ pydl-0.7.0/cextern/0000755000076500000240000000000013434104632014552 5ustar weaverstaff00000000000000pydl-0.7.0/cextern/README.rst0000644000076500000240000000054212135272325016244 0ustar weaverstaff00000000000000External Packages/Libraries =========================== This directory contains C extensions included with the package. Note that only C extensions should be included in this directory - pure Cython code should be placed in the package source tree, and wrapper Cython code for C libraries included here should be in the packagename/wrappers directory. pydl-0.7.0/ah_bootstrap.py0000644000076500000240000010776213273057371016166 0ustar weaverstaff00000000000000""" This bootstrap module contains code for ensuring that the astropy_helpers package will be importable by the time the setup.py script runs. It also includes some workarounds to ensure that a recent-enough version of setuptools is being used for the installation. This module should be the first thing imported in the setup.py of distributions that make use of the utilities in astropy_helpers. If the distribution ships with its own copy of astropy_helpers, this module will first attempt to import from the shipped copy. However, it will also check PyPI to see if there are any bug-fix releases on top of the current version that may be useful to get past platform-specific bugs that have been fixed. When running setup.py, use the ``--offline`` command-line option to disable the auto-upgrade checks. 
When this module is imported or otherwise executed it automatically calls a main function that attempts to read the project's setup.cfg file, which it checks for a configuration section called ``[ah_bootstrap]`` the presences of that section, and options therein, determine the next step taken: If it contains an option called ``auto_use`` with a value of ``True``, it will automatically call the main function of this module called `use_astropy_helpers` (see that function's docstring for full details). Otherwise no further action is taken and by default the system-installed version of astropy-helpers will be used (however, ``ah_bootstrap.use_astropy_helpers`` may be called manually from within the setup.py script). This behavior can also be controlled using the ``--auto-use`` and ``--no-auto-use`` command-line flags. For clarity, an alias for ``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using the latter if needed. Additional options in the ``[ah_boostrap]`` section of setup.cfg have the same names as the arguments to `use_astropy_helpers`, and can be used to configure the bootstrap script when ``auto_use = True``. See https://github.com/astropy/astropy-helpers for more details, and for the latest version of this module. """ import contextlib import errno import imp import io import locale import os import re import subprocess as sp import sys try: from ConfigParser import ConfigParser, RawConfigParser except ImportError: from configparser import ConfigParser, RawConfigParser if sys.version_info[0] < 3: _str_types = (str, unicode) _text_type = unicode PY3 = False else: _str_types = (str, bytes) _text_type = str PY3 = True # What follows are several import statements meant to deal with install-time # issues with either missing or misbehaving pacakges (including making sure # setuptools itself is installed): # Some pre-setuptools checks to ensure that either distribute or setuptools >= # 0.7 is used (over pre-distribute setuptools) if it is available on the path; # otherwise the latest setuptools will be downloaded and bootstrapped with # ``ez_setup.py``. 
This used to be included in a separate file called # setuptools_bootstrap.py; but it was combined into ah_bootstrap.py try: import pkg_resources _setuptools_req = pkg_resources.Requirement.parse('setuptools>=0.7') # This may raise a DistributionNotFound in which case no version of # setuptools or distribute is properly installed _setuptools = pkg_resources.get_distribution('setuptools') if _setuptools not in _setuptools_req: # Older version of setuptools; check if we have distribute; again if # this results in DistributionNotFound we want to give up _distribute = pkg_resources.get_distribution('distribute') if _setuptools != _distribute: # It's possible on some pathological systems to have an old version # of setuptools and distribute on sys.path simultaneously; make # sure distribute is the one that's used sys.path.insert(1, _distribute.location) _distribute.activate() imp.reload(pkg_resources) except: # There are several types of exceptions that can occur here; if all else # fails bootstrap and use the bootstrapped version from ez_setup import use_setuptools use_setuptools() # typing as a dependency for 1.6.1+ Sphinx causes issues when imported after # initializing submodule with ah_boostrap.py # See discussion and references in # https://github.com/astropy/astropy-helpers/issues/302 try: import typing # noqa except ImportError: pass # Note: The following import is required as a workaround to # https://github.com/astropy/astropy-helpers/issues/89; if we don't import this # module now, it will get cleaned up after `run_setup` is called, but that will # later cause the TemporaryDirectory class defined in it to stop working when # used later on by setuptools try: import setuptools.py31compat # noqa except ImportError: pass # matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation. See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except: # Ignore if this fails for *any* reason* pass # End compatibility imports... # In case it didn't successfully import before the ez_setup checks import pkg_resources from setuptools import Distribution from setuptools.package_index import PackageIndex from distutils import log from distutils.debug import DEBUG # TODO: Maybe enable checking for a specific version of astropy_helpers? DIST_NAME = 'astropy-helpers' PACKAGE_NAME = 'astropy_helpers' if PY3: UPPER_VERSION_EXCLUSIVE = None else: UPPER_VERSION_EXCLUSIVE = '3' # Defaults for other options DOWNLOAD_IF_NEEDED = True INDEX_URL = 'https://pypi.python.org/simple' USE_GIT = True OFFLINE = False AUTO_UPGRADE = True # A list of all the configuration options and their required types CFG_OPTIONS = [ ('auto_use', bool), ('path', str), ('download_if_needed', bool), ('index_url', str), ('use_git', bool), ('offline', bool), ('auto_upgrade', bool) ] class _Bootstrapper(object): """ Bootstrapper implementation. See ``use_astropy_helpers`` for parameter documentation. 
""" def __init__(self, path=None, index_url=None, use_git=None, offline=None, download_if_needed=None, auto_upgrade=None): if path is None: path = PACKAGE_NAME if not (isinstance(path, _str_types) or path is False): raise TypeError('path must be a string or False') if PY3 and not isinstance(path, _text_type): fs_encoding = sys.getfilesystemencoding() path = path.decode(fs_encoding) # path to unicode self.path = path # Set other option attributes, using defaults where necessary self.index_url = index_url if index_url is not None else INDEX_URL self.offline = offline if offline is not None else OFFLINE # If offline=True, override download and auto-upgrade if self.offline: download_if_needed = False auto_upgrade = False self.download = (download_if_needed if download_if_needed is not None else DOWNLOAD_IF_NEEDED) self.auto_upgrade = (auto_upgrade if auto_upgrade is not None else AUTO_UPGRADE) # If this is a release then the .git directory will not exist so we # should not use git. git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git')) if use_git is None and not git_dir_exists: use_git = False self.use_git = use_git if use_git is not None else USE_GIT # Declared as False by default--later we check if astropy-helpers can be # upgraded from PyPI, but only if not using a source distribution (as in # the case of import from a git submodule) self.is_submodule = False @classmethod def main(cls, argv=None): if argv is None: argv = sys.argv config = cls.parse_config() config.update(cls.parse_command_line(argv)) auto_use = config.pop('auto_use', False) bootstrapper = cls(**config) if auto_use: # Run the bootstrapper, otherwise the setup.py is using the old # use_astropy_helpers() interface, in which case it will run the # bootstrapper manually after reconfiguring it. bootstrapper.run() return bootstrapper @classmethod def parse_config(cls): if not os.path.exists('setup.cfg'): return {} cfg = ConfigParser() try: cfg.read('setup.cfg') except Exception as e: if DEBUG: raise log.error( "Error reading setup.cfg: {0!r}\n{1} will not be " "automatically bootstrapped and package installation may fail." "\n{2}".format(e, PACKAGE_NAME, _err_help_msg)) return {} if not cfg.has_section('ah_bootstrap'): return {} config = {} for option, type_ in CFG_OPTIONS: if not cfg.has_option('ah_bootstrap', option): continue if type_ is bool: value = cfg.getboolean('ah_bootstrap', option) else: value = cfg.get('ah_bootstrap', option) config[option] = value return config @classmethod def parse_command_line(cls, argv=None): if argv is None: argv = sys.argv config = {} # For now we just pop recognized ah_bootstrap options out of the # arg list. This is imperfect; in the unlikely case that a setup.py # custom command or even custom Distribution class defines an argument # of the same name then we will break that. However there's a catch22 # here that we can't just do full argument parsing right here, because # we don't yet know *how* to parse all possible command-line arguments. 
if '--no-git' in argv: config['use_git'] = False argv.remove('--no-git') if '--offline' in argv: config['offline'] = True argv.remove('--offline') if '--auto-use' in argv: config['auto_use'] = True argv.remove('--auto-use') if '--no-auto-use' in argv: config['auto_use'] = False argv.remove('--no-auto-use') if '--use-system-astropy-helpers' in argv: config['auto_use'] = False argv.remove('--use-system-astropy-helpers') return config def run(self): strategies = ['local_directory', 'local_file', 'index'] dist = None # First, remove any previously imported versions of astropy_helpers; # this is necessary for nested installs where one package's installer # is installing another package via setuptools.sandbox.run_setup, as in # the case of setup_requires for key in list(sys.modules): try: if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'): del sys.modules[key] except AttributeError: # Sometimes mysterious non-string things can turn up in # sys.modules continue # Check to see if the path is a submodule self.is_submodule = self._check_submodule() for strategy in strategies: method = getattr(self, 'get_{0}_dist'.format(strategy)) dist = method() if dist is not None: break else: raise _AHBootstrapSystemExit( "No source found for the {0!r} package; {0} must be " "available and importable as a prerequisite to building " "or installing this package.".format(PACKAGE_NAME)) # This is a bit hacky, but if astropy_helpers was loaded from a # directory/submodule its Distribution object gets a "precedence" of # "DEVELOP_DIST". However, in other cases it gets a precedence of # "EGG_DIST". However, when activing the distribution it will only be # placed early on sys.path if it is treated as an EGG_DIST, so always # do that dist = dist.clone(precedence=pkg_resources.EGG_DIST) # Otherwise we found a version of astropy-helpers, so we're done # Just active the found distribution on sys.path--if we did a # download this usually happens automatically but it doesn't hurt to # do it again # Note: Adding the dist to the global working set also activates it # (makes it importable on sys.path) by default. try: pkg_resources.working_set.add(dist, replace=True) except TypeError: # Some (much) older versions of setuptools do not have the # replace=True option here. These versions are old enough that all # bets may be off anyways, but it's easy enough to work around just # in case... if dist.key in pkg_resources.working_set.by_key: del pkg_resources.working_set.by_key[dist.key] pkg_resources.working_set.add(dist) @property def config(self): """ A `dict` containing the options this `_Bootstrapper` was configured with. """ return dict((optname, getattr(self, optname)) for optname, _ in CFG_OPTIONS if hasattr(self, optname)) def get_local_directory_dist(self): """ Handle importing a vendored package from a subdirectory of the source distribution. 
""" if not os.path.isdir(self.path): return log.info('Attempting to import astropy_helpers from {0} {1!r}'.format( 'submodule' if self.is_submodule else 'directory', self.path)) dist = self._directory_import() if dist is None: log.warn( 'The requested path {0!r} for importing {1} does not ' 'exist, or does not contain a copy of the {1} ' 'package.'.format(self.path, PACKAGE_NAME)) elif self.auto_upgrade and not self.is_submodule: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_local_file_dist(self): """ Handle importing from a source archive; this also uses setup_requires but points easy_install directly to the source archive. """ if not os.path.isfile(self.path): return log.info('Attempting to unpack and import astropy_helpers from ' '{0!r}'.format(self.path)) try: dist = self._do_download(find_links=[self.path]) except Exception as e: if DEBUG: raise log.warn( 'Failed to import {0} from the specified archive {1!r}: ' '{2}'.format(PACKAGE_NAME, self.path, str(e))) dist = None if dist is not None and self.auto_upgrade: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_index_dist(self): if not self.download: log.warn('Downloading {0!r} disabled.'.format(DIST_NAME)) return None log.warn( "Downloading {0!r}; run setup.py with the --offline option to " "force offline installation.".format(DIST_NAME)) try: dist = self._do_download() except Exception as e: if DEBUG: raise log.warn( 'Failed to download and/or install {0!r} from {1!r}:\n' '{2}'.format(DIST_NAME, self.index_url, str(e))) dist = None # No need to run auto-upgrade here since we've already presumably # gotten the most up-to-date version from the package index return dist def _directory_import(self): """ Import astropy_helpers from the given path, which will be added to sys.path. Must return True if the import succeeded, and False otherwise. """ # Return True on success, False on failure but download is allowed, and # otherwise raise SystemExit path = os.path.abspath(self.path) # Use an empty WorkingSet rather than the man # pkg_resources.working_set, since on older versions of setuptools this # will invoke a VersionConflict when trying to install an upgrade ws = pkg_resources.WorkingSet([]) ws.add_entry(path) dist = ws.by_key.get(DIST_NAME) if dist is None: # We didn't find an egg-info/dist-info in the given path, but if a # setup.py exists we can generate it setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): # We use subprocess instead of run_setup from setuptools to # avoid segmentation faults - see the following for more details: # https://github.com/cython/cython/issues/2104 sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path) for dist in pkg_resources.find_distributions(path, True): # There should be only one... 
return dist return dist def _do_download(self, version='', find_links=None): if find_links: allow_hosts = '' index_url = None else: allow_hosts = None index_url = self.index_url # Annoyingly, setuptools will not handle other arguments to # Distribution (such as options) before handling setup_requires, so it # is not straightforward to programmatically augment the arguments which # are passed to easy_install class _Distribution(Distribution): def get_option_dict(self, command_name): opts = Distribution.get_option_dict(self, command_name) if command_name == 'easy_install': if find_links is not None: opts['find_links'] = ('setup script', find_links) if index_url is not None: opts['index_url'] = ('setup script', index_url) if allow_hosts is not None: opts['allow_hosts'] = ('setup script', allow_hosts) return opts if version: req = '{0}=={1}'.format(DIST_NAME, version) else: if UPPER_VERSION_EXCLUSIVE is None: req = DIST_NAME else: req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE) attrs = {'setup_requires': [req]} # NOTE: we need to parse the config file (e.g. setup.cfg) to make sure # it honours the options set in the [easy_install] section, and we need # to explicitly fetch the requirement eggs as setup_requires does not # get honored in recent versions of setuptools: # https://github.com/pypa/setuptools/issues/1273 try: context = _verbose if DEBUG else _silence with context(): dist = _Distribution(attrs=attrs) try: dist.parse_config_files(ignore_option_errors=True) dist.fetch_build_eggs(req) except TypeError: # On older versions of setuptools, ignore_option_errors # doesn't exist, and the above two lines are not needed # so we can just continue pass # If the setup_requires succeeded it will have added the new dist to # the main working_set return pkg_resources.working_set.by_key.get(DIST_NAME) except Exception as e: if DEBUG: raise msg = 'Error retrieving {0} from {1}:\n{2}' if find_links: source = find_links[0] elif index_url != INDEX_URL: source = index_url else: source = 'PyPI' raise Exception(msg.format(DIST_NAME, source, repr(e))) def _do_upgrade(self, dist): # Build up a requirement for a higher bugfix release but a lower minor # release (so API compatibility is guaranteed) next_version = _next_version(dist.parsed_version) req = pkg_resources.Requirement.parse( '{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version)) package_index = PackageIndex(index_url=self.index_url) upgrade = package_index.obtain(req) if upgrade is not None: return self._do_download(version=upgrade.version) def _check_submodule(self): """ Check if the given path is a git submodule. See the docstrings for ``_check_submodule_using_git`` and ``_check_submodule_no_git`` for further details. """ if (self.path is None or (os.path.exists(self.path) and not os.path.isdir(self.path))): return False if self.use_git: return self._check_submodule_using_git() else: return self._check_submodule_no_git() def _check_submodule_using_git(self): """ Check if the given path is a git submodule. If so, attempt to initialize and/or update the submodule if needed. This function makes calls to the ``git`` command in subprocesses. The ``_check_submodule_no_git`` option uses pure Python to check if the given path looks like a git submodule, but it cannot perform updates. 
""" cmd = ['git', 'submodule', 'status', '--', self.path] try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except _CommandNotFound: # The git command simply wasn't found; this is most likely the # case on user systems that don't have git and are simply # trying to install the package from PyPI or a source # distribution. Silently ignore this case and simply don't try # to use submodules return False stderr = stderr.strip() if returncode != 0 and stderr: # Unfortunately the return code alone cannot be relied on, as # earlier versions of git returned 0 even if the requested submodule # does not exist # This is a warning that occurs in perl (from running git submodule) # which only occurs with a malformatted locale setting which can # happen sometimes on OSX. See again # https://github.com/astropy/astropy/issues/2749 perl_warning = ('perl: warning: Falling back to the standard locale ' '("C").') if not stderr.strip().endswith(perl_warning): # Some other unknown error condition occurred log.warn('git submodule command failed ' 'unexpectedly:\n{0}'.format(stderr)) return False # Output of `git submodule status` is as follows: # # 1: Status indicator: '-' for submodule is uninitialized, '+' if # submodule is initialized but is not at the commit currently indicated # in .gitmodules (and thus needs to be updated), or 'U' if the # submodule is in an unstable state (i.e. has merge conflicts) # # 2. SHA-1 hash of the current commit of the submodule (we don't really # need this information but it's useful for checking that the output is # correct) # # 3. The output of `git describe` for the submodule's current commit # hash (this includes for example what branches the commit is on) but # only if the submodule is initialized. We ignore this information for # now _git_submodule_status_re = re.compile( '^(?P[+-U ])(?P[0-9a-f]{40}) ' '(?P\S+)( .*)?$') # The stdout should only contain one line--the status of the # requested submodule m = _git_submodule_status_re.match(stdout) if m: # Yes, the path *is* a git submodule self._update_submodule(m.group('submodule'), m.group('status')) return True else: log.warn( 'Unexpected output from `git submodule status`:\n{0}\n' 'Will attempt import from {1!r} regardless.'.format( stdout, self.path)) return False def _check_submodule_no_git(self): """ Like ``_check_submodule_using_git``, but simply parses the .gitmodules file to determine if the supplied path is a git submodule, and does not exec any subprocesses. This can only determine if a path is a submodule--it does not perform updates, etc. This function may need to be updated if the format of the .gitmodules file is changed between git versions. """ gitmodules_path = os.path.abspath('.gitmodules') if not os.path.isfile(gitmodules_path): return False # This is a minimal reader for gitconfig-style files. It handles a few of # the quirks that make gitconfig files incompatible with ConfigParser-style # files, but does not support the full gitconfig syntax (just enough # needed to read a .gitmodules file). 
gitmodules_fileobj = io.StringIO() # Must use io.open for cross-Python-compatible behavior wrt unicode with io.open(gitmodules_path) as f: for line in f: # gitconfig files are more flexible with leading whitespace; just # go ahead and remove it line = line.lstrip() # comments can start with either # or ; if line and line[0] in (':', ';'): continue gitmodules_fileobj.write(line) gitmodules_fileobj.seek(0) cfg = RawConfigParser() try: cfg.readfp(gitmodules_fileobj) except Exception as exc: log.warn('Malformatted .gitmodules file: {0}\n' '{1} cannot be assumed to be a git submodule.'.format( exc, self.path)) return False for section in cfg.sections(): if not cfg.has_option(section, 'path'): continue submodule_path = cfg.get(section, 'path').rstrip(os.sep) if submodule_path == self.path.rstrip(os.sep): return True return False def _update_submodule(self, submodule, status): if status == ' ': # The submodule is up to date; no action necessary return elif status == '-': if self.offline: raise _AHBootstrapSystemExit( "Cannot initialize the {0} submodule in --offline mode; " "this requires being able to clone the submodule from an " "online repository.".format(submodule)) cmd = ['update', '--init'] action = 'Initializing' elif status == '+': cmd = ['update'] action = 'Updating' if self.offline: cmd.append('--no-fetch') elif status == 'U': raise _AHBootstrapSystemExit( 'Error: Submodule {0} contains unresolved merge conflicts. ' 'Please complete or abandon any changes in the submodule so that ' 'it is in a usable state, then try again.'.format(submodule)) else: log.warn('Unknown status {0!r} for git submodule {1!r}. Will ' 'attempt to use the submodule as-is, but try to ensure ' 'that the submodule is in a clean state and contains no ' 'conflicts or errors.\n{2}'.format(status, submodule, _err_help_msg)) return err_msg = None cmd = ['git', 'submodule'] + cmd + ['--', submodule] log.warn('{0} {1} submodule with: `{2}`'.format( action, submodule, ' '.join(cmd))) try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except OSError as e: err_msg = str(e) else: if returncode != 0: err_msg = stderr if err_msg is not None: log.warn('An unexpected error occurred updating the git submodule ' '{0!r}:\n{1}\n{2}'.format(submodule, err_msg, _err_help_msg)) class _CommandNotFound(OSError): """ An exception raised when a command run with run_cmd is not found on the system. """ def run_cmd(cmd): """ Run a command in a subprocess, given as a list of command-line arguments. Returns a ``(returncode, stdout, stderr)`` tuple. """ try: p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE) # XXX: May block if either stdout or stderr fill their buffers; # however for the commands this is currently used for that is # unlikely (they should have very brief output) stdout, stderr = p.communicate() except OSError as e: if DEBUG: raise if e.errno == errno.ENOENT: msg = 'Command not found: `{0}`'.format(' '.join(cmd)) raise _CommandNotFound(msg, cmd) else: raise _AHBootstrapSystemExit( 'An unexpected error occurred when running the ' '`{0}` command:\n{1}'.format(' '.join(cmd), str(e))) # Can fail of the default locale is not configured properly. See # https://github.com/astropy/astropy/issues/2749. For the purposes under # consideration 'latin1' is an acceptable fallback. 
try: stdio_encoding = locale.getdefaultlocale()[1] or 'latin1' except ValueError: # Due to an OSX oddity locale.getdefaultlocale() can also crash # depending on the user's locale/language settings. See: # http://bugs.python.org/issue18378 stdio_encoding = 'latin1' # Unlikely to fail at this point but even then let's be flexible if not isinstance(stdout, _text_type): stdout = stdout.decode(stdio_encoding, 'replace') if not isinstance(stderr, _text_type): stderr = stderr.decode(stdio_encoding, 'replace') return (p.returncode, stdout, stderr) def _next_version(version): """ Given a parsed version from pkg_resources.parse_version, returns a new version string with the next minor version. Examples ======== >>> _next_version(pkg_resources.parse_version('1.2.3')) '1.3.0' """ if hasattr(version, 'base_version'): # New version parsing from setuptools >= 8.0 if version.base_version: parts = version.base_version.split('.') else: parts = [] else: parts = [] for part in version: if part.startswith('*'): break parts.append(part) parts = [int(p) for p in parts] if len(parts) < 3: parts += [0] * (3 - len(parts)) major, minor, micro = parts[:3] return '{0}.{1}.{2}'.format(major, minor + 1, 0) class _DummyFile(object): """A noop writeable object.""" errors = '' # Required for Python 3.x encoding = 'utf-8' def write(self, s): pass def flush(self): pass @contextlib.contextmanager def _verbose(): yield @contextlib.contextmanager def _silence(): """A context manager that silences sys.stdout and sys.stderr.""" old_stdout = sys.stdout old_stderr = sys.stderr sys.stdout = _DummyFile() sys.stderr = _DummyFile() exception_occurred = False try: yield except: exception_occurred = True # Go ahead and clean up so that exception handling can work normally sys.stdout = old_stdout sys.stderr = old_stderr raise if not exception_occurred: sys.stdout = old_stdout sys.stderr = old_stderr _err_help_msg = """ If the problem persists consider installing astropy_helpers manually using pip (`pip install astropy_helpers`) or by manually downloading the source archive, extracting it, and installing by running `python setup.py install` from the root of the extracted source code. """ class _AHBootstrapSystemExit(SystemExit): def __init__(self, *args): if not args: msg = 'An unknown problem occurred bootstrapping astropy_helpers.' else: msg = args[0] msg += '\n' + _err_help_msg super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:]) BOOTSTRAPPER = _Bootstrapper.main() def use_astropy_helpers(**kwargs): """ Ensure that the `astropy_helpers` module is available and is importable. This supports automatic submodule initialization if astropy_helpers is included in a project as a git submodule, or will download it from PyPI if necessary. Parameters ---------- path : str or None, optional A filesystem path relative to the root of the project's source code that should be added to `sys.path` so that `astropy_helpers` can be imported from that path. If the path is a git submodule it will automatically be initialized and/or updated. The path may also be to a ``.tar.gz`` archive of the astropy_helpers source distribution. In this case the archive is automatically unpacked and made temporarily available on `sys.path` as a ``.egg`` archive. If `None` skip straight to downloading. download_if_needed : bool, optional If the provided filesystem path is not found an attempt will be made to download astropy_helpers from PyPI. 
It will then be made temporarily available on `sys.path` as a ``.egg`` archive (using the ``setup_requires`` feature of setuptools. If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. index_url : str, optional If provided, use a different URL for the Python package index than the main PyPI server. use_git : bool, optional If `False` no git commands will be used--this effectively disables support for git submodules. If the ``--no-git`` option is given at the command line the value of this argument is overridden to `False`. auto_upgrade : bool, optional By default, when installing a package from a non-development source distribution ah_boostrap will try to automatically check for patch releases to astropy-helpers on PyPI and use the patched version over any bundled versions. Setting this to `False` will disable that functionality. If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. offline : bool, optional If `False` disable all actions that require an internet connection, including downloading packages from the package index and fetching updates to any git submodule. Defaults to `True`. """ global BOOTSTRAPPER config = BOOTSTRAPPER.config config.update(**kwargs) # Create a new bootstrapper with the updated configuration and run it BOOTSTRAPPER = _Bootstrapper(**config) BOOTSTRAPPER.run() pydl-0.7.0/setup.py0000755000076500000240000001265113273057371014634 0ustar weaverstaff00000000000000#!/usr/bin/env python # Licensed under a 3-clause BSD style license - see LICENSE.rst import glob import os import sys # Enforce Python version check - this is the same check as in __init__.py but # this one has to happen before importing ah_bootstrap. if sys.version_info < tuple((int(val) for val in "2.7".split('.'))): sys.stderr.write("ERROR: packagename requires Python {} or later\n".format(2.7)) sys.exit(1) import ah_bootstrap from setuptools import setup # A dirty hack to get around some early import/configurations ambiguities if sys.version_info[0] >= 3: import builtins else: import __builtin__ as builtins builtins._ASTROPY_SETUP_ = True from astropy_helpers.setup_helpers import (register_commands, get_debug_option, get_package_info) from astropy_helpers.git_helpers import get_git_devstr from astropy_helpers.version_helpers import generate_version_py # Get some values from the setup.cfg try: from ConfigParser import ConfigParser except ImportError: from configparser import ConfigParser conf = ConfigParser() conf.read(['setup.cfg']) metadata = dict(conf.items('metadata')) PACKAGENAME = metadata.get('package_name', 'packagename') DESCRIPTION = metadata.get('description', 'Astropy Package Template') AUTHOR = metadata.get('author', 'Astropy Developers') AUTHOR_EMAIL = metadata.get('author_email', '') LICENSE = metadata.get('license', 'unknown') URL = metadata.get('url', 'http://astropy.org') # order of priority for long_description: # (1) set in setup.cfg, # (2) load LONG_DESCRIPTION.rst, # (3) load README.rst, # (4) package docstring readme_glob = 'README*' _cfg_long_description = metadata.get('long_description', '') if _cfg_long_description: LONG_DESCRIPTION = _cfg_long_description elif os.path.exists('LONG_DESCRIPTION.rst'): with open('LONG_DESCRIPTION.rst') as f: LONG_DESCRIPTION = f.read() elif len(glob.glob(readme_glob)) > 0: with open(glob.glob(readme_glob)[0]) as f: LONG_DESCRIPTION = f.read() else: # Get the long description from the package's docstring __import__(PACKAGENAME) 
package = sys.modules[PACKAGENAME] LONG_DESCRIPTION = package.__doc__ # Store the package name in a built-in variable so it's easy # to get from other parts of the setup infrastructure builtins._ASTROPY_PACKAGE_NAME_ = PACKAGENAME # VERSION should be PEP440 compatible (http://www.python.org/dev/peps/pep-0440) VERSION = metadata.get('version', '0.0.dev0') # Indicates if this version is a release version RELEASE = 'dev' not in VERSION if not RELEASE: VERSION += get_git_devstr(False) # Populate the dict of setup command overrides; this should be done before # invoking any other functionality from distutils since it can potentially # modify distutils' behavior. cmdclassd = register_commands(PACKAGENAME, VERSION, RELEASE) # Freeze build information in version.py generate_version_py(PACKAGENAME, VERSION, RELEASE, get_debug_option(PACKAGENAME)) # Treat everything in scripts except README* as a script to be installed scripts = [fname for fname in glob.glob(os.path.join('scripts', '*')) if not os.path.basename(fname).startswith('README')] # Get configuration information from all of the various subpackages. # See the docstring for setup_helpers.update_package_files for more # details. package_info = get_package_info() # Add the project-global data package_info['package_data'].setdefault(PACKAGENAME, []) package_info['package_data'][PACKAGENAME].append('data/*') # Define entry points for command-line scripts entry_points = {'console_scripts': []} if conf.has_section('entry_points'): entry_point_list = conf.items('entry_points') for entry_point in entry_point_list: entry_points['console_scripts'].append('{0} = {1}'.format( entry_point[0], entry_point[1])) # Include all .c files, recursively, including those generated by # Cython, since we can not do this in MANIFEST.in with a "dynamic" # directory name. c_files = [] for root, dirs, files in os.walk(PACKAGENAME): for filename in files: if filename.endswith('.c'): c_files.append( os.path.join( os.path.relpath(root, PACKAGENAME), filename)) package_info['package_data'][PACKAGENAME].extend(c_files) # Note that requires and provides should not be included in the call to # ``setup``, since these are now deprecated. See this link for more details: # https://groups.google.com/forum/#!topic/astropy-dev/urYO8ckB2uM setup(name=PACKAGENAME, version=VERSION, description=DESCRIPTION, scripts=scripts, install_requires=[s.strip() for s in metadata.get('install_requires', 'astropy').split(',')], author=AUTHOR, author_email=AUTHOR_EMAIL, license=LICENSE, url=URL, long_description=LONG_DESCRIPTION, cmdclass=cmdclassd, zip_safe=False, use_2to3=False, entry_points=entry_points, classifiers = [ 'Development Status :: 5 - Production/Stable', 'Environment :: Console', 'Intended Audience :: Science/Research', 'License :: OSI Approved :: BSD License', 'Operating System :: OS Independent', 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', 'Topic :: Scientific/Engineering :: Physics', 'Topic :: Scientific/Engineering :: Astronomy', ], python_requires='>={}'.format("2.7"), **package_info ) pydl-0.7.0/scripts/0000755000076500000240000000000013434104632014571 5ustar weaverstaff00000000000000pydl-0.7.0/scripts/README.rst0000644000076500000240000000013312135272325016257 0ustar weaverstaff00000000000000Scripts ======= This directory contains command-line scripts for the affiliated package. 
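(Illustrative aside, not part of the distributed sources: the ``entry_points`` handling in ``setup.py`` above turns each ``name = module:function`` pair from the ``[entry_points]`` section of the ``setup.cfg`` that follows into a console-script specification. For example, the pair ``get_juldate = pydl.goddard.astro:get_juldate_main`` is appended so that the call to ``setup()`` effectively receives

    entry_points = {'console_scripts': ['get_juldate = pydl.goddard.astro:get_juldate_main']}

and installing the package therefore generates a ``get_juldate`` command-line wrapper around that function.)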
pydl-0.7.0/setup.cfg0000644000076500000240000000302313434104570014722 0ustar weaverstaff00000000000000[build_sphinx] source-dir = docs build-dir = docs/_build all_files = 1 [build_docs] source-dir = docs build-dir = docs/_build all_files = 1 [upload_docs] upload-dir = docs/_build/html show-response = 1 [tool:pytest] minversion = 3.0 norecursedirs = build docs/_build doctest_plus = enabled addopts = -p no:warnings [ah_bootstrap] auto_use = True [pycodestyle] # E101 - mix of tabs and spaces # W191 - use of tabs # W291 - trailing whitespace # W292 - no newline at end of file # W293 - trailing whitespace # W391 - blank line at end of file # E111 - 4 spaces per indentation level # E112 - 4 spaces per indentation level # E113 - 4 spaces per indentation level # E901 - SyntaxError or IndentationError # E902 - IOError select = E101,W191,W291,W292,W293,W391,E111,E112,E113,E901,E902 # The default ignore set is E123,E133,E226,E241,E242 # ignore = E12,E133,E226,E501 exclude = extern,sphinx,*parsetab.py [metadata] package_name = pydl description = Astropy affiliated package # long_description = This description will be obtained from pydl.__doc__. author = Benjamin Alan Weaver author_email = baweaver@lbl.gov license = BSD url = http://github.com/weaverba137/pydl edit_on_github = False github_project = weaverba137/pydl install_requires = astropy, scipy, matplotlib # version should be PEP440 compatible (http://www.python.org/dev/peps/pep-0440) version = 0.7.0 [entry_points] get_juldate = pydl.goddard.astro:get_juldate_main hogg_iau_name = pydl.pydlutils.misc:hogg_iau_name_main compute_templates = pydl.pydlspec2d.spec1d:template_input_main pydl-0.7.0/ez_setup.py0000644000076500000240000003037113170742461015323 0ustar weaverstaff00000000000000#!/usr/bin/env python """ Setuptools bootstrapping installer. Maintained at https://github.com/pypa/setuptools/tree/bootstrap. Run this script to install or upgrade setuptools. This method is DEPRECATED. Check https://github.com/pypa/setuptools/issues/581 for more details. """ import os import shutil import sys import tempfile import zipfile import optparse import subprocess import platform import textwrap import contextlib from distutils import log try: from urllib.request import urlopen except ImportError: from urllib2 import urlopen try: from site import USER_SITE except ImportError: USER_SITE = None # 33.1.1 is the last version that supports setuptools self upgrade/installation. DEFAULT_VERSION = "33.1.1" DEFAULT_URL = "https://pypi.io/packages/source/s/setuptools/" DEFAULT_SAVE_DIR = os.curdir DEFAULT_DEPRECATION_MESSAGE = "ez_setup.py is deprecated and when using it setuptools will be pinned to {0} since it's the last version that supports setuptools self upgrade/installation, check https://github.com/pypa/setuptools/issues/581 for more info; use pip to install setuptools" MEANINGFUL_INVALID_ZIP_ERR_MSG = 'Maybe {0} is corrupted, delete it and try again.' log.warn(DEFAULT_DEPRECATION_MESSAGE.format(DEFAULT_VERSION)) def _python_cmd(*args): """ Execute a command. Return True if the command succeeded. 
""" args = (sys.executable,) + args return subprocess.call(args) == 0 def _install(archive_filename, install_args=()): """Install Setuptools.""" with archive_context(archive_filename): # installing log.warn('Installing Setuptools') if not _python_cmd('setup.py', 'install', *install_args): log.warn('Something went wrong during the installation.') log.warn('See the error message above.') # exitcode will be 2 return 2 def _build_egg(egg, archive_filename, to_dir): """Build Setuptools egg.""" with archive_context(archive_filename): # building an egg log.warn('Building a Setuptools egg in %s', to_dir) _python_cmd('setup.py', '-q', 'bdist_egg', '--dist-dir', to_dir) # returning the result log.warn(egg) if not os.path.exists(egg): raise IOError('Could not build the egg.') class ContextualZipFile(zipfile.ZipFile): """Supplement ZipFile class to support context manager for Python 2.6.""" def __enter__(self): return self def __exit__(self, type, value, traceback): self.close() def __new__(cls, *args, **kwargs): """Construct a ZipFile or ContextualZipFile as appropriate.""" if hasattr(zipfile.ZipFile, '__exit__'): return zipfile.ZipFile(*args, **kwargs) return super(ContextualZipFile, cls).__new__(cls) @contextlib.contextmanager def archive_context(filename): """ Unzip filename to a temporary directory, set to the cwd. The unzipped target is cleaned up after. """ tmpdir = tempfile.mkdtemp() log.warn('Extracting in %s', tmpdir) old_wd = os.getcwd() try: os.chdir(tmpdir) try: with ContextualZipFile(filename) as archive: archive.extractall() except zipfile.BadZipfile as err: if not err.args: err.args = ('', ) err.args = err.args + ( MEANINGFUL_INVALID_ZIP_ERR_MSG.format(filename), ) raise # going in the directory subdir = os.path.join(tmpdir, os.listdir(tmpdir)[0]) os.chdir(subdir) log.warn('Now working in %s', subdir) yield finally: os.chdir(old_wd) shutil.rmtree(tmpdir) def _do_download(version, download_base, to_dir, download_delay): """Download Setuptools.""" py_desig = 'py{sys.version_info[0]}.{sys.version_info[1]}'.format(sys=sys) tp = 'setuptools-{version}-{py_desig}.egg' egg = os.path.join(to_dir, tp.format(**locals())) if not os.path.exists(egg): archive = download_setuptools(version, download_base, to_dir, download_delay) _build_egg(egg, archive, to_dir) sys.path.insert(0, egg) # Remove previously-imported pkg_resources if present (see # https://bitbucket.org/pypa/setuptools/pull-request/7/ for details). if 'pkg_resources' in sys.modules: _unload_pkg_resources() import setuptools setuptools.bootstrap_install_from = egg def use_setuptools( version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=DEFAULT_SAVE_DIR, download_delay=15): """ Ensure that a setuptools version is installed. Return None. Raise SystemExit if the requested version or later cannot be installed. """ to_dir = os.path.abspath(to_dir) # prior to importing, capture the module state for # representative modules. rep_modules = 'pkg_resources', 'setuptools' imported = set(sys.modules).intersection(rep_modules) try: import pkg_resources pkg_resources.require("setuptools>=" + version) # a suitable version is already installed return except ImportError: # pkg_resources not available; setuptools is not installed; download pass except pkg_resources.DistributionNotFound: # no version of setuptools was found; allow download pass except pkg_resources.VersionConflict as VC_err: if imported: _conflict_bail(VC_err, version) # otherwise, unload pkg_resources to allow the downloaded version to # take precedence. 
del pkg_resources _unload_pkg_resources() return _do_download(version, download_base, to_dir, download_delay) def _conflict_bail(VC_err, version): """ Setuptools was imported prior to invocation, so it is unsafe to unload it. Bail out. """ conflict_tmpl = textwrap.dedent(""" The required version of setuptools (>={version}) is not available, and can't be installed while this script is running. Please install a more recent version first, using 'easy_install -U setuptools'. (Currently using {VC_err.args[0]!r}) """) msg = conflict_tmpl.format(**locals()) sys.stderr.write(msg) sys.exit(2) def _unload_pkg_resources(): sys.meta_path = [ importer for importer in sys.meta_path if importer.__class__.__module__ != 'pkg_resources.extern' ] del_modules = [ name for name in sys.modules if name.startswith('pkg_resources') ] for mod_name in del_modules: del sys.modules[mod_name] def _clean_check(cmd, target): """ Run the command to download target. If the command fails, clean up before re-raising the error. """ try: subprocess.check_call(cmd) except subprocess.CalledProcessError: if os.access(target, os.F_OK): os.unlink(target) raise def download_file_powershell(url, target): """ Download the file at url to target using Powershell. Powershell will validate trust. Raise an exception if the command cannot complete. """ target = os.path.abspath(target) ps_cmd = ( "[System.Net.WebRequest]::DefaultWebProxy.Credentials = " "[System.Net.CredentialCache]::DefaultCredentials; " '(new-object System.Net.WebClient).DownloadFile("%(url)s", "%(target)s")' % locals() ) cmd = [ 'powershell', '-Command', ps_cmd, ] _clean_check(cmd, target) def has_powershell(): """Determine if Powershell is available.""" if platform.system() != 'Windows': return False cmd = ['powershell', '-Command', 'echo test'] with open(os.path.devnull, 'wb') as devnull: try: subprocess.check_call(cmd, stdout=devnull, stderr=devnull) except Exception: return False return True download_file_powershell.viable = has_powershell def download_file_curl(url, target): cmd = ['curl', url, '--location', '--silent', '--output', target] _clean_check(cmd, target) def has_curl(): cmd = ['curl', '--version'] with open(os.path.devnull, 'wb') as devnull: try: subprocess.check_call(cmd, stdout=devnull, stderr=devnull) except Exception: return False return True download_file_curl.viable = has_curl def download_file_wget(url, target): cmd = ['wget', url, '--quiet', '--output-document', target] _clean_check(cmd, target) def has_wget(): cmd = ['wget', '--version'] with open(os.path.devnull, 'wb') as devnull: try: subprocess.check_call(cmd, stdout=devnull, stderr=devnull) except Exception: return False return True download_file_wget.viable = has_wget def download_file_insecure(url, target): """Use Python to download the file, without connection authentication.""" src = urlopen(url) try: # Read all the data in one block. data = src.read() finally: src.close() # Write all the data in one block to avoid creating a partial file. with open(target, "wb") as dst: dst.write(data) download_file_insecure.viable = lambda: True def get_best_downloader(): downloaders = ( download_file_powershell, download_file_curl, download_file_wget, download_file_insecure, ) viable_downloaders = (dl for dl in downloaders if dl.viable()) return next(viable_downloaders, None) def download_setuptools( version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=DEFAULT_SAVE_DIR, delay=15, downloader_factory=get_best_downloader): """ Download setuptools from a specified location and return its filename. 
`version` should be a valid setuptools version number that is available as an sdist for download under the `download_base` URL (which should end with a '/'). `to_dir` is the directory where the egg will be downloaded. `delay` is the number of seconds to pause before an actual download attempt. ``downloader_factory`` should be a function taking no arguments and returning a function for downloading a URL to a target. """ # making sure we use the absolute path to_dir = os.path.abspath(to_dir) zip_name = "setuptools-%s.zip" % version url = download_base + zip_name saveto = os.path.join(to_dir, zip_name) if not os.path.exists(saveto): # Avoid repeated downloads log.warn("Downloading %s", url) downloader = downloader_factory() downloader(url, saveto) return os.path.realpath(saveto) def _build_install_args(options): """ Build the arguments to 'python setup.py install' on the setuptools package. Returns list of command line arguments. """ return ['--user'] if options.user_install else [] def _parse_args(): """Parse the command line for options.""" parser = optparse.OptionParser() parser.add_option( '--user', dest='user_install', action='store_true', default=False, help='install in user site package') parser.add_option( '--download-base', dest='download_base', metavar="URL", default=DEFAULT_URL, help='alternative URL from where to download the setuptools package') parser.add_option( '--insecure', dest='downloader_factory', action='store_const', const=lambda: download_file_insecure, default=get_best_downloader, help='Use internal, non-validating downloader' ) parser.add_option( '--version', help="Specify which version to download", default=DEFAULT_VERSION, ) parser.add_option( '--to-dir', help="Directory to save (and re-use) package", default=DEFAULT_SAVE_DIR, ) options, args = parser.parse_args() # positional arguments are ignored return options def _download_args(options): """Return args for download_setuptools function from cmdline args.""" return dict( version=options.version, download_base=options.download_base, downloader_factory=options.downloader_factory, to_dir=options.to_dir, ) def main(): """Install or upgrade setuptools and EasyInstall.""" options = _parse_args() archive = download_setuptools(**_download_args(options)) return _install(archive, _build_install_args(options)) if __name__ == '__main__': sys.exit(main()) pydl-0.7.0/astropy_helpers/0000755000076500000240000000000013434104632016325 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/licenses/0000755000076500000240000000000013434104632020132 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/licenses/LICENSE_NUMPYDOC.rst0000644000076500000240000001350713434074306023276 0ustar weaverstaff00000000000000------------------------------------------------------------------------------- The files - numpydoc.py - docscrape.py - docscrape_sphinx.py - phantom_import.py have the following license: Copyright (C) 2008 Stefan van der Walt , Pauli Virtanen Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ------------------------------------------------------------------------------- The files - compiler_unparse.py - comment_eater.py - traitsdoc.py have the following license: This software is OSI Certified Open Source Software. OSI Certified is a certification mark of the Open Source Initiative. Copyright (c) 2006, Enthought, Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Enthought, Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ------------------------------------------------------------------------------- The file - plot_directive.py originates from Matplotlib (http://matplotlib.sf.net/) which has the following license: Copyright (c) 2002-2008 John D. Hunter; All Rights Reserved. 1. This LICENSE AGREEMENT is between John D. Hunter (“JDH”), and the Individual or Organization (“Licensee”) accessing and otherwise using matplotlib software in source or binary form and its associated documentation. 2. Subject to the terms and conditions of this License Agreement, JDH hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use matplotlib 0.98.3 alone or in any derivative version, provided, however, that JDH’s License Agreement and JDH’s notice of copyright, i.e., “Copyright (c) 2002-2008 John D. Hunter; All Rights Reserved” are retained in matplotlib 0.98.3 alone or in any derivative version prepared by Licensee. 3. 
In the event Licensee prepares a derivative work that is based on or incorporates matplotlib 0.98.3 or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to matplotlib 0.98.3. 4. JDH is making matplotlib 0.98.3 available to Licensee on an “AS IS” basis. JDH MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, JDH MAKES NO AND DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF MATPLOTLIB 0.98.3 WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. 5. JDH SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF MATPLOTLIB 0.98.3 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING MATPLOTLIB 0.98.3, OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. 6. This License Agreement will automatically terminate upon a material breach of its terms and conditions. 7. Nothing in this License Agreement shall be deemed to create any relationship of agency, partnership, or joint venture between JDH and Licensee. This License Agreement does not grant permission to use JDH trademarks or trade name in a trademark sense to endorse or promote products or services of Licensee, or any third party. 8. By copying, installing or otherwise using matplotlib 0.98.3, Licensee agrees to be bound by the terms and conditions of this License Agreement. pydl-0.7.0/astropy_helpers/licenses/LICENSE_COPYBUTTON.rst0000644000076500000240000000471113434074306023543 0ustar weaverstaff00000000000000Copyright 2014 Python Software Foundation License: PSF PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2 -------------------------------------------- . 1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and the Individual or Organization ("Licensee") accessing and otherwise using this software ("Python") in source or binary form and its associated documentation. . 2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. . 3. In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to Python. . 4. PSF is making Python available to Licensee on an "AS IS" basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. . 5. 
PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON, OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. . 6. This License Agreement will automatically terminate upon a material breach of its terms and conditions. . 7. Nothing in this License Agreement shall be deemed to create any relationship of agency, partnership, or joint venture between PSF and Licensee. This License Agreement does not grant permission to use PSF trademarks or trade name in a trademark sense to endorse or promote products or services of Licensee, or any third party. . 8. By copying, installing or otherwise using Python, Licensee agrees to be bound by the terms and conditions of this License Agreement. pydl-0.7.0/astropy_helpers/licenses/LICENSE_ASTROSCRAPPY.rst0000644000076500000240000000315413216334311023760 0ustar weaverstaff00000000000000# The OpenMP helpers include code heavily adapted from astroscrappy, released # under the following license: # # Copyright (c) 2015, Curtis McCully # All rights reserved. # # Redistribution and use in source and binary forms, with or without modification, # are permitted provided that the following conditions are met: # # * Redistributions of source code must retain the above copyright notice, this # list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, this # list of conditions and the following disclaimer in the documentation and/or # other materials provided with the distribution. # * Neither the name of the Astropy Team nor the names of its contributors may be # used to endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND # ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR # ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES # (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON # ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS # SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. pydl-0.7.0/astropy_helpers/ah_bootstrap.py0000644000076500000240000010605313434074306021375 0ustar weaverstaff00000000000000""" This bootstrap module contains code for ensuring that the astropy_helpers package will be importable by the time the setup.py script runs. It also includes some workarounds to ensure that a recent-enough version of setuptools is being used for the installation. This module should be the first thing imported in the setup.py of distributions that make use of the utilities in astropy_helpers. If the distribution ships with its own copy of astropy_helpers, this module will first attempt to import from the shipped copy. However, it will also check PyPI to see if there are any bug-fix releases on top of the current version that may be useful to get past platform-specific bugs that have been fixed. When running setup.py, use the ``--offline`` command-line option to disable the auto-upgrade checks. 
When this module is imported or otherwise executed it automatically calls a main function that attempts to read the project's setup.cfg file, which it checks for a configuration section called ``[ah_bootstrap]`` the presences of that section, and options therein, determine the next step taken: If it contains an option called ``auto_use`` with a value of ``True``, it will automatically call the main function of this module called `use_astropy_helpers` (see that function's docstring for full details). Otherwise no further action is taken and by default the system-installed version of astropy-helpers will be used (however, ``ah_bootstrap.use_astropy_helpers`` may be called manually from within the setup.py script). This behavior can also be controlled using the ``--auto-use`` and ``--no-auto-use`` command-line flags. For clarity, an alias for ``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using the latter if needed. Additional options in the ``[ah_boostrap]`` section of setup.cfg have the same names as the arguments to `use_astropy_helpers`, and can be used to configure the bootstrap script when ``auto_use = True``. See https://github.com/astropy/astropy-helpers for more details, and for the latest version of this module. """ import contextlib import errno import io import locale import os import re import subprocess as sp import sys __minimum_python_version__ = (2, 7) if sys.version_info < __minimum_python_version__: print("ERROR: Python {} or later is required by astropy-helpers".format( __minimum_python_version__)) sys.exit(1) try: from ConfigParser import ConfigParser, RawConfigParser except ImportError: from configparser import ConfigParser, RawConfigParser if sys.version_info[0] < 3: _str_types = (str, unicode) _text_type = unicode PY3 = False else: _str_types = (str, bytes) _text_type = str PY3 = True # What follows are several import statements meant to deal with install-time # issues with either missing or misbehaving pacakges (including making sure # setuptools itself is installed): # Check that setuptools 1.0 or later is present from distutils.version import LooseVersion try: import setuptools assert LooseVersion(setuptools.__version__) >= LooseVersion('1.0') except (ImportError, AssertionError): print("ERROR: setuptools 1.0 or later is required by astropy-helpers") sys.exit(1) # typing as a dependency for 1.6.1+ Sphinx causes issues when imported after # initializing submodule with ah_boostrap.py # See discussion and references in # https://github.com/astropy/astropy-helpers/issues/302 try: import typing # noqa except ImportError: pass # Note: The following import is required as a workaround to # https://github.com/astropy/astropy-helpers/issues/89; if we don't import this # module now, it will get cleaned up after `run_setup` is called, but that will # later cause the TemporaryDirectory class defined in it to stop working when # used later on by setuptools try: import setuptools.py31compat # noqa except ImportError: pass # matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation. 
See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except: # Ignore if this fails for *any* reason* pass # End compatibility imports... # In case it didn't successfully import before the ez_setup checks import pkg_resources from setuptools import Distribution from setuptools.package_index import PackageIndex from distutils import log from distutils.debug import DEBUG # TODO: Maybe enable checking for a specific version of astropy_helpers? DIST_NAME = 'astropy-helpers' PACKAGE_NAME = 'astropy_helpers' if PY3: UPPER_VERSION_EXCLUSIVE = None else: UPPER_VERSION_EXCLUSIVE = '3' # Defaults for other options DOWNLOAD_IF_NEEDED = True INDEX_URL = 'https://pypi.python.org/simple' USE_GIT = True OFFLINE = False AUTO_UPGRADE = True # A list of all the configuration options and their required types CFG_OPTIONS = [ ('auto_use', bool), ('path', str), ('download_if_needed', bool), ('index_url', str), ('use_git', bool), ('offline', bool), ('auto_upgrade', bool) ] class _Bootstrapper(object): """ Bootstrapper implementation. See ``use_astropy_helpers`` for parameter documentation. """ def __init__(self, path=None, index_url=None, use_git=None, offline=None, download_if_needed=None, auto_upgrade=None): if path is None: path = PACKAGE_NAME if not (isinstance(path, _str_types) or path is False): raise TypeError('path must be a string or False') if PY3 and not isinstance(path, _text_type): fs_encoding = sys.getfilesystemencoding() path = path.decode(fs_encoding) # path to unicode self.path = path # Set other option attributes, using defaults where necessary self.index_url = index_url if index_url is not None else INDEX_URL self.offline = offline if offline is not None else OFFLINE # If offline=True, override download and auto-upgrade if self.offline: download_if_needed = False auto_upgrade = False self.download = (download_if_needed if download_if_needed is not None else DOWNLOAD_IF_NEEDED) self.auto_upgrade = (auto_upgrade if auto_upgrade is not None else AUTO_UPGRADE) # If this is a release then the .git directory will not exist so we # should not use git. git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git')) if use_git is None and not git_dir_exists: use_git = False self.use_git = use_git if use_git is not None else USE_GIT # Declared as False by default--later we check if astropy-helpers can be # upgraded from PyPI, but only if not using a source distribution (as in # the case of import from a git submodule) self.is_submodule = False @classmethod def main(cls, argv=None): if argv is None: argv = sys.argv config = cls.parse_config() config.update(cls.parse_command_line(argv)) auto_use = config.pop('auto_use', False) bootstrapper = cls(**config) if auto_use: # Run the bootstrapper, otherwise the setup.py is using the old # use_astropy_helpers() interface, in which case it will run the # bootstrapper manually after reconfiguring it. bootstrapper.run() return bootstrapper @classmethod def parse_config(cls): if not os.path.exists('setup.cfg'): return {} cfg = ConfigParser() try: cfg.read('setup.cfg') except Exception as e: if DEBUG: raise log.error( "Error reading setup.cfg: {0!r}\n{1} will not be " "automatically bootstrapped and package installation may fail." 
"\n{2}".format(e, PACKAGE_NAME, _err_help_msg)) return {} if not cfg.has_section('ah_bootstrap'): return {} config = {} for option, type_ in CFG_OPTIONS: if not cfg.has_option('ah_bootstrap', option): continue if type_ is bool: value = cfg.getboolean('ah_bootstrap', option) else: value = cfg.get('ah_bootstrap', option) config[option] = value return config @classmethod def parse_command_line(cls, argv=None): if argv is None: argv = sys.argv config = {} # For now we just pop recognized ah_bootstrap options out of the # arg list. This is imperfect; in the unlikely case that a setup.py # custom command or even custom Distribution class defines an argument # of the same name then we will break that. However there's a catch22 # here that we can't just do full argument parsing right here, because # we don't yet know *how* to parse all possible command-line arguments. if '--no-git' in argv: config['use_git'] = False argv.remove('--no-git') if '--offline' in argv: config['offline'] = True argv.remove('--offline') if '--auto-use' in argv: config['auto_use'] = True argv.remove('--auto-use') if '--no-auto-use' in argv: config['auto_use'] = False argv.remove('--no-auto-use') if '--use-system-astropy-helpers' in argv: config['auto_use'] = False argv.remove('--use-system-astropy-helpers') return config def run(self): strategies = ['local_directory', 'local_file', 'index'] dist = None # First, remove any previously imported versions of astropy_helpers; # this is necessary for nested installs where one package's installer # is installing another package via setuptools.sandbox.run_setup, as in # the case of setup_requires for key in list(sys.modules): try: if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'): del sys.modules[key] except AttributeError: # Sometimes mysterious non-string things can turn up in # sys.modules continue # Check to see if the path is a submodule self.is_submodule = self._check_submodule() for strategy in strategies: method = getattr(self, 'get_{0}_dist'.format(strategy)) dist = method() if dist is not None: break else: raise _AHBootstrapSystemExit( "No source found for the {0!r} package; {0} must be " "available and importable as a prerequisite to building " "or installing this package.".format(PACKAGE_NAME)) # This is a bit hacky, but if astropy_helpers was loaded from a # directory/submodule its Distribution object gets a "precedence" of # "DEVELOP_DIST". However, in other cases it gets a precedence of # "EGG_DIST". However, when activing the distribution it will only be # placed early on sys.path if it is treated as an EGG_DIST, so always # do that dist = dist.clone(precedence=pkg_resources.EGG_DIST) # Otherwise we found a version of astropy-helpers, so we're done # Just active the found distribution on sys.path--if we did a # download this usually happens automatically but it doesn't hurt to # do it again # Note: Adding the dist to the global working set also activates it # (makes it importable on sys.path) by default. try: pkg_resources.working_set.add(dist, replace=True) except TypeError: # Some (much) older versions of setuptools do not have the # replace=True option here. These versions are old enough that all # bets may be off anyways, but it's easy enough to work around just # in case... if dist.key in pkg_resources.working_set.by_key: del pkg_resources.working_set.by_key[dist.key] pkg_resources.working_set.add(dist) @property def config(self): """ A `dict` containing the options this `_Bootstrapper` was configured with. 
""" return dict((optname, getattr(self, optname)) for optname, _ in CFG_OPTIONS if hasattr(self, optname)) def get_local_directory_dist(self): """ Handle importing a vendored package from a subdirectory of the source distribution. """ if not os.path.isdir(self.path): return log.info('Attempting to import astropy_helpers from {0} {1!r}'.format( 'submodule' if self.is_submodule else 'directory', self.path)) dist = self._directory_import() if dist is None: log.warn( 'The requested path {0!r} for importing {1} does not ' 'exist, or does not contain a copy of the {1} ' 'package.'.format(self.path, PACKAGE_NAME)) elif self.auto_upgrade and not self.is_submodule: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_local_file_dist(self): """ Handle importing from a source archive; this also uses setup_requires but points easy_install directly to the source archive. """ if not os.path.isfile(self.path): return log.info('Attempting to unpack and import astropy_helpers from ' '{0!r}'.format(self.path)) try: dist = self._do_download(find_links=[self.path]) except Exception as e: if DEBUG: raise log.warn( 'Failed to import {0} from the specified archive {1!r}: ' '{2}'.format(PACKAGE_NAME, self.path, str(e))) dist = None if dist is not None and self.auto_upgrade: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_index_dist(self): if not self.download: log.warn('Downloading {0!r} disabled.'.format(DIST_NAME)) return None log.warn( "Downloading {0!r}; run setup.py with the --offline option to " "force offline installation.".format(DIST_NAME)) try: dist = self._do_download() except Exception as e: if DEBUG: raise log.warn( 'Failed to download and/or install {0!r} from {1!r}:\n' '{2}'.format(DIST_NAME, self.index_url, str(e))) dist = None # No need to run auto-upgrade here since we've already presumably # gotten the most up-to-date version from the package index return dist def _directory_import(self): """ Import astropy_helpers from the given path, which will be added to sys.path. Must return True if the import succeeded, and False otherwise. """ # Return True on success, False on failure but download is allowed, and # otherwise raise SystemExit path = os.path.abspath(self.path) # Use an empty WorkingSet rather than the man # pkg_resources.working_set, since on older versions of setuptools this # will invoke a VersionConflict when trying to install an upgrade ws = pkg_resources.WorkingSet([]) ws.add_entry(path) dist = ws.by_key.get(DIST_NAME) if dist is None: # We didn't find an egg-info/dist-info in the given path, but if a # setup.py exists we can generate it setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): # We use subprocess instead of run_setup from setuptools to # avoid segmentation faults - see the following for more details: # https://github.com/cython/cython/issues/2104 sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path) for dist in pkg_resources.find_distributions(path, True): # There should be only one... 
return dist return dist def _do_download(self, version='', find_links=None): if find_links: allow_hosts = '' index_url = None else: allow_hosts = None index_url = self.index_url # Annoyingly, setuptools will not handle other arguments to # Distribution (such as options) before handling setup_requires, so it # is not straightforward to programmatically augment the arguments which # are passed to easy_install class _Distribution(Distribution): def get_option_dict(self, command_name): opts = Distribution.get_option_dict(self, command_name) if command_name == 'easy_install': if find_links is not None: opts['find_links'] = ('setup script', find_links) if index_url is not None: opts['index_url'] = ('setup script', index_url) if allow_hosts is not None: opts['allow_hosts'] = ('setup script', allow_hosts) return opts if version: req = '{0}=={1}'.format(DIST_NAME, version) else: if UPPER_VERSION_EXCLUSIVE is None: req = DIST_NAME else: req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE) attrs = {'setup_requires': [req]} # NOTE: we need to parse the config file (e.g. setup.cfg) to make sure # it honours the options set in the [easy_install] section, and we need # to explicitly fetch the requirement eggs as setup_requires does not # get honored in recent versions of setuptools: # https://github.com/pypa/setuptools/issues/1273 try: context = _verbose if DEBUG else _silence with context(): dist = _Distribution(attrs=attrs) try: dist.parse_config_files(ignore_option_errors=True) dist.fetch_build_eggs(req) except TypeError: # On older versions of setuptools, ignore_option_errors # doesn't exist, and the above two lines are not needed # so we can just continue pass # If the setup_requires succeeded it will have added the new dist to # the main working_set return pkg_resources.working_set.by_key.get(DIST_NAME) except Exception as e: if DEBUG: raise msg = 'Error retrieving {0} from {1}:\n{2}' if find_links: source = find_links[0] elif index_url != INDEX_URL: source = index_url else: source = 'PyPI' raise Exception(msg.format(DIST_NAME, source, repr(e))) def _do_upgrade(self, dist): # Build up a requirement for a higher bugfix release but a lower minor # release (so API compatibility is guaranteed) next_version = _next_version(dist.parsed_version) req = pkg_resources.Requirement.parse( '{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version)) package_index = PackageIndex(index_url=self.index_url) upgrade = package_index.obtain(req) if upgrade is not None: return self._do_download(version=upgrade.version) def _check_submodule(self): """ Check if the given path is a git submodule. See the docstrings for ``_check_submodule_using_git`` and ``_check_submodule_no_git`` for further details. """ if (self.path is None or (os.path.exists(self.path) and not os.path.isdir(self.path))): return False if self.use_git: return self._check_submodule_using_git() else: return self._check_submodule_no_git() def _check_submodule_using_git(self): """ Check if the given path is a git submodule. If so, attempt to initialize and/or update the submodule if needed. This function makes calls to the ``git`` command in subprocesses. The ``_check_submodule_no_git`` option uses pure Python to check if the given path looks like a git submodule, but it cannot perform updates. 
""" cmd = ['git', 'submodule', 'status', '--', self.path] try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except _CommandNotFound: # The git command simply wasn't found; this is most likely the # case on user systems that don't have git and are simply # trying to install the package from PyPI or a source # distribution. Silently ignore this case and simply don't try # to use submodules return False stderr = stderr.strip() if returncode != 0 and stderr: # Unfortunately the return code alone cannot be relied on, as # earlier versions of git returned 0 even if the requested submodule # does not exist # This is a warning that occurs in perl (from running git submodule) # which only occurs with a malformatted locale setting which can # happen sometimes on OSX. See again # https://github.com/astropy/astropy/issues/2749 perl_warning = ('perl: warning: Falling back to the standard locale ' '("C").') if not stderr.strip().endswith(perl_warning): # Some other unknown error condition occurred log.warn('git submodule command failed ' 'unexpectedly:\n{0}'.format(stderr)) return False # Output of `git submodule status` is as follows: # # 1: Status indicator: '-' for submodule is uninitialized, '+' if # submodule is initialized but is not at the commit currently indicated # in .gitmodules (and thus needs to be updated), or 'U' if the # submodule is in an unstable state (i.e. has merge conflicts) # # 2. SHA-1 hash of the current commit of the submodule (we don't really # need this information but it's useful for checking that the output is # correct) # # 3. The output of `git describe` for the submodule's current commit # hash (this includes for example what branches the commit is on) but # only if the submodule is initialized. We ignore this information for # now _git_submodule_status_re = re.compile( '^(?P[+-U ])(?P[0-9a-f]{40}) ' '(?P\S+)( .*)?$') # The stdout should only contain one line--the status of the # requested submodule m = _git_submodule_status_re.match(stdout) if m: # Yes, the path *is* a git submodule self._update_submodule(m.group('submodule'), m.group('status')) return True else: log.warn( 'Unexpected output from `git submodule status`:\n{0}\n' 'Will attempt import from {1!r} regardless.'.format( stdout, self.path)) return False def _check_submodule_no_git(self): """ Like ``_check_submodule_using_git``, but simply parses the .gitmodules file to determine if the supplied path is a git submodule, and does not exec any subprocesses. This can only determine if a path is a submodule--it does not perform updates, etc. This function may need to be updated if the format of the .gitmodules file is changed between git versions. """ gitmodules_path = os.path.abspath('.gitmodules') if not os.path.isfile(gitmodules_path): return False # This is a minimal reader for gitconfig-style files. It handles a few of # the quirks that make gitconfig files incompatible with ConfigParser-style # files, but does not support the full gitconfig syntax (just enough # needed to read a .gitmodules file). 
gitmodules_fileobj = io.StringIO() # Must use io.open for cross-Python-compatible behavior wrt unicode with io.open(gitmodules_path) as f: for line in f: # gitconfig files are more flexible with leading whitespace; just # go ahead and remove it line = line.lstrip() # comments can start with either # or ; if line and line[0] in (':', ';'): continue gitmodules_fileobj.write(line) gitmodules_fileobj.seek(0) cfg = RawConfigParser() try: cfg.readfp(gitmodules_fileobj) except Exception as exc: log.warn('Malformatted .gitmodules file: {0}\n' '{1} cannot be assumed to be a git submodule.'.format( exc, self.path)) return False for section in cfg.sections(): if not cfg.has_option(section, 'path'): continue submodule_path = cfg.get(section, 'path').rstrip(os.sep) if submodule_path == self.path.rstrip(os.sep): return True return False def _update_submodule(self, submodule, status): if status == ' ': # The submodule is up to date; no action necessary return elif status == '-': if self.offline: raise _AHBootstrapSystemExit( "Cannot initialize the {0} submodule in --offline mode; " "this requires being able to clone the submodule from an " "online repository.".format(submodule)) cmd = ['update', '--init'] action = 'Initializing' elif status == '+': cmd = ['update'] action = 'Updating' if self.offline: cmd.append('--no-fetch') elif status == 'U': raise _AHBootstrapSystemExit( 'Error: Submodule {0} contains unresolved merge conflicts. ' 'Please complete or abandon any changes in the submodule so that ' 'it is in a usable state, then try again.'.format(submodule)) else: log.warn('Unknown status {0!r} for git submodule {1!r}. Will ' 'attempt to use the submodule as-is, but try to ensure ' 'that the submodule is in a clean state and contains no ' 'conflicts or errors.\n{2}'.format(status, submodule, _err_help_msg)) return err_msg = None cmd = ['git', 'submodule'] + cmd + ['--', submodule] log.warn('{0} {1} submodule with: `{2}`'.format( action, submodule, ' '.join(cmd))) try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except OSError as e: err_msg = str(e) else: if returncode != 0: err_msg = stderr if err_msg is not None: log.warn('An unexpected error occurred updating the git submodule ' '{0!r}:\n{1}\n{2}'.format(submodule, err_msg, _err_help_msg)) class _CommandNotFound(OSError): """ An exception raised when a command run with run_cmd is not found on the system. """ def run_cmd(cmd): """ Run a command in a subprocess, given as a list of command-line arguments. Returns a ``(returncode, stdout, stderr)`` tuple. """ try: p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE) # XXX: May block if either stdout or stderr fill their buffers; # however for the commands this is currently used for that is # unlikely (they should have very brief output) stdout, stderr = p.communicate() except OSError as e: if DEBUG: raise if e.errno == errno.ENOENT: msg = 'Command not found: `{0}`'.format(' '.join(cmd)) raise _CommandNotFound(msg, cmd) else: raise _AHBootstrapSystemExit( 'An unexpected error occurred when running the ' '`{0}` command:\n{1}'.format(' '.join(cmd), str(e))) # Can fail of the default locale is not configured properly. See # https://github.com/astropy/astropy/issues/2749. For the purposes under # consideration 'latin1' is an acceptable fallback. 
try: stdio_encoding = locale.getdefaultlocale()[1] or 'latin1' except ValueError: # Due to an OSX oddity locale.getdefaultlocale() can also crash # depending on the user's locale/language settings. See: # http://bugs.python.org/issue18378 stdio_encoding = 'latin1' # Unlikely to fail at this point but even then let's be flexible if not isinstance(stdout, _text_type): stdout = stdout.decode(stdio_encoding, 'replace') if not isinstance(stderr, _text_type): stderr = stderr.decode(stdio_encoding, 'replace') return (p.returncode, stdout, stderr) def _next_version(version): """ Given a parsed version from pkg_resources.parse_version, returns a new version string with the next minor version. Examples ======== >>> _next_version(pkg_resources.parse_version('1.2.3')) '1.3.0' """ if hasattr(version, 'base_version'): # New version parsing from setuptools >= 8.0 if version.base_version: parts = version.base_version.split('.') else: parts = [] else: parts = [] for part in version: if part.startswith('*'): break parts.append(part) parts = [int(p) for p in parts] if len(parts) < 3: parts += [0] * (3 - len(parts)) major, minor, micro = parts[:3] return '{0}.{1}.{2}'.format(major, minor + 1, 0) class _DummyFile(object): """A noop writeable object.""" errors = '' # Required for Python 3.x encoding = 'utf-8' def write(self, s): pass def flush(self): pass @contextlib.contextmanager def _verbose(): yield @contextlib.contextmanager def _silence(): """A context manager that silences sys.stdout and sys.stderr.""" old_stdout = sys.stdout old_stderr = sys.stderr sys.stdout = _DummyFile() sys.stderr = _DummyFile() exception_occurred = False try: yield except: exception_occurred = True # Go ahead and clean up so that exception handling can work normally sys.stdout = old_stdout sys.stderr = old_stderr raise if not exception_occurred: sys.stdout = old_stdout sys.stderr = old_stderr _err_help_msg = """ If the problem persists consider installing astropy_helpers manually using pip (`pip install astropy_helpers`) or by manually downloading the source archive, extracting it, and installing by running `python setup.py install` from the root of the extracted source code. """ class _AHBootstrapSystemExit(SystemExit): def __init__(self, *args): if not args: msg = 'An unknown problem occurred bootstrapping astropy_helpers.' else: msg = args[0] msg += '\n' + _err_help_msg super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:]) BOOTSTRAPPER = _Bootstrapper.main() def use_astropy_helpers(**kwargs): """ Ensure that the `astropy_helpers` module is available and is importable. This supports automatic submodule initialization if astropy_helpers is included in a project as a git submodule, or will download it from PyPI if necessary. Parameters ---------- path : str or None, optional A filesystem path relative to the root of the project's source code that should be added to `sys.path` so that `astropy_helpers` can be imported from that path. If the path is a git submodule it will automatically be initialized and/or updated. The path may also be to a ``.tar.gz`` archive of the astropy_helpers source distribution. In this case the archive is automatically unpacked and made temporarily available on `sys.path` as a ``.egg`` archive. If `None` skip straight to downloading. download_if_needed : bool, optional If the provided filesystem path is not found an attempt will be made to download astropy_helpers from PyPI. 
It will then be made temporarily available on `sys.path` as a ``.egg`` archive (using the ``setup_requires`` feature of setuptools). If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. index_url : str, optional If provided, use a different URL for the Python package index than the main PyPI server. use_git : bool, optional If `False` no git commands will be used--this effectively disables support for git submodules. If the ``--no-git`` option is given at the command line the value of this argument is overridden to `False`. auto_upgrade : bool, optional By default, when installing a package from a non-development source distribution ah_bootstrap will try to automatically check for patch releases to astropy-helpers on PyPI and use the patched version over any bundled versions. Setting this to `False` will disable that functionality. If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. offline : bool, optional If `True`, disable all actions that require an internet connection, including downloading packages from the package index and fetching updates to any git submodule. Defaults to `False`. """ global BOOTSTRAPPER config = BOOTSTRAPPER.config config.update(**kwargs) # Create a new bootstrapper with the updated configuration and run it BOOTSTRAPPER = _Bootstrapper(**config) BOOTSTRAPPER.run() pydl-0.7.0/astropy_helpers/astropy_helpers.egg-info/0000755000076500000240000000000013434104632023242 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers.egg-info/PKG-INFO0000644000076500000240000000755113434104217024346 0ustar weaverstaff00000000000000Metadata-Version: 1.1 Name: astropy-helpers Version: 2.0.8 Summary: Utilities for building and installing Astropy, Astropy affiliated packages, and their respective documentation. Home-page: https://github.com/astropy/astropy-helpers Author: The Astropy Developers Author-email: astropy.team@gmail.com License: BSD Description: astropy-helpers =============== * Stable versions: https://pypi.org/project/astropy-helpers/ * Development version, issue tracker: https://github.com/astropy/astropy-helpers This project provides a Python package, ``astropy_helpers``, which includes many build, installation, and documentation-related tools used by the Astropy project, but packaged separately for use by other projects that wish to leverage this work. The motivation behind this package and details of its implementation are in the accepted `Astropy Proposal for Enhancement (APE) 4 `_. The ``astropy_helpers.extern`` sub-module includes modules developed elsewhere that are bundled here for convenience. At the moment, this consists of the following two sphinx extensions: * `numpydoc `_, a Sphinx extension developed as part of the Numpy project. This is used to parse docstrings in Numpy format * `sphinx-automodapi `_, a Sphinx extension developed as part of the Astropy project. This used to be developed directly in ``astropy-helpers`` but is now a standalone package. Issues with these sub-modules should be reported in their respective repositories, and we will regularly update the bundled versions to reflect the latest released versions. ``astropy_helpers`` includes a special "bootstrap" module called ``ah_bootstrap.py`` which is intended to be used by a project's setup.py in order to ensure that the ``astropy_helpers`` package is available for build/installation.
This is similar to the ``ez_setup.py`` module that is shipped with some projects to bootstrap `setuptools `_. As described in APE4, the version numbers for ``astropy_helpers`` follow the corresponding major/minor version of the `astropy core package `_, but with an independent sequence of micro (bugfix) version numbers. Hence, the initial release is 0.4, in parallel with Astropy v0.4, which will be the first version of Astropy to use ``astropy-helpers``. For examples of how to implement ``astropy-helpers`` in a project, see the ``setup.py`` and ``setup.cfg`` files of the `Affiliated package template `_. .. image:: https://travis-ci.org/astropy/astropy-helpers.svg :target: https://travis-ci.org/astropy/astropy-helpers .. image:: https://coveralls.io/repos/astropy/astropy-helpers/badge.svg :target: https://coveralls.io/r/astropy/astropy-helpers Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: Framework :: Setuptools Plugin Classifier: Framework :: Sphinx :: Extension Classifier: Framework :: Sphinx :: Theme Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Build Tools Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Topic :: System :: Archiving :: Packaging pydl-0.7.0/astropy_helpers/astropy_helpers.egg-info/not-zip-safe0000644000076500000240000000000113434104217025467 0ustar weaverstaff00000000000000 pydl-0.7.0/astropy_helpers/astropy_helpers.egg-info/SOURCES.txt0000644000076500000240000000644513434104217025136 0ustar weaverstaff00000000000000CHANGES.rst LICENSE.rst MANIFEST.in README.rst ah_bootstrap.py setup.cfg setup.py astropy_helpers/__init__.py astropy_helpers/conftest.py astropy_helpers/distutils_helpers.py astropy_helpers/git_helpers.py astropy_helpers/openmp_helpers.py astropy_helpers/setup_helpers.py astropy_helpers/test_helpers.py astropy_helpers/utils.py astropy_helpers/version.py astropy_helpers/version_helpers.py astropy_helpers.egg-info/PKG-INFO astropy_helpers.egg-info/SOURCES.txt astropy_helpers.egg-info/dependency_links.txt astropy_helpers.egg-info/not-zip-safe astropy_helpers.egg-info/top_level.txt astropy_helpers/commands/__init__.py astropy_helpers/commands/_dummy.py astropy_helpers/commands/_test_compat.py astropy_helpers/commands/build_ext.py astropy_helpers/commands/build_py.py astropy_helpers/commands/build_sphinx.py astropy_helpers/commands/install.py astropy_helpers/commands/install_lib.py astropy_helpers/commands/register.py astropy_helpers/commands/setup_package.py astropy_helpers/commands/test.py astropy_helpers/commands/src/compiler.c astropy_helpers/compat/__init__.py astropy_helpers/extern/__init__.py astropy_helpers/extern/setup_package.py astropy_helpers/extern/automodapi/__init__.py astropy_helpers/extern/automodapi/autodoc_enhancements.py astropy_helpers/extern/automodapi/automodapi.py astropy_helpers/extern/automodapi/automodsumm.py astropy_helpers/extern/automodapi/smart_resolver.py astropy_helpers/extern/automodapi/utils.py astropy_helpers/extern/automodapi/templates/autosummary_core/base.rst astropy_helpers/extern/automodapi/templates/autosummary_core/class.rst astropy_helpers/extern/automodapi/templates/autosummary_core/module.rst astropy_helpers/extern/numpydoc/__init__.py astropy_helpers/extern/numpydoc/docscrape.py 
astropy_helpers/extern/numpydoc/docscrape_sphinx.py astropy_helpers/extern/numpydoc/numpydoc.py astropy_helpers/extern/numpydoc/templates/numpydoc_docstring.rst astropy_helpers/sphinx/__init__.py astropy_helpers/sphinx/conf.py astropy_helpers/sphinx/setup_package.py astropy_helpers/sphinx/ext/__init__.py astropy_helpers/sphinx/ext/changelog_links.py astropy_helpers/sphinx/ext/doctest.py astropy_helpers/sphinx/ext/edit_on_github.py astropy_helpers/sphinx/ext/tocdepthfix.py astropy_helpers/sphinx/ext/tests/__init__.py astropy_helpers/sphinx/local/python2_local_links.inv astropy_helpers/sphinx/local/python3_local_links.inv astropy_helpers/sphinx/themes/bootstrap-astropy/globaltoc.html astropy_helpers/sphinx/themes/bootstrap-astropy/layout.html astropy_helpers/sphinx/themes/bootstrap-astropy/localtoc.html astropy_helpers/sphinx/themes/bootstrap-astropy/searchbox.html astropy_helpers/sphinx/themes/bootstrap-astropy/theme.conf astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_linkout.svg astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_linkout_20.png astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_logo.ico astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_logo.svg astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_logo_32.png astropy_helpers/sphinx/themes/bootstrap-astropy/static/bootstrap-astropy.css astropy_helpers/sphinx/themes/bootstrap-astropy/static/copybutton.js astropy_helpers/sphinx/themes/bootstrap-astropy/static/sidebar.js licenses/LICENSE_ASTROSCRAPPY.rst licenses/LICENSE_COPYBUTTON.rst licenses/LICENSE_NUMPYDOC.rstpydl-0.7.0/astropy_helpers/astropy_helpers.egg-info/top_level.txt0000644000076500000240000000002013434104217025763 0ustar weaverstaff00000000000000astropy_helpers pydl-0.7.0/astropy_helpers/astropy_helpers.egg-info/dependency_links.txt0000644000076500000240000000000113434104217027307 0ustar weaverstaff00000000000000 pydl-0.7.0/astropy_helpers/LICENSE.rst0000644000076500000240000000272312403636675020161 0ustar weaverstaff00000000000000Copyright (c) 2014, Astropy Developers All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the Astropy Team nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
pydl-0.7.0/astropy_helpers/astropy_helpers/0000755000076500000240000000000013434104632021550 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/setup_helpers.py0000644000076500000240000006636013434074306025023 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This module contains a number of utilities for use during setup/build/packaging that are useful to astropy as a whole. """ from __future__ import absolute_import, print_function import collections import os import re import subprocess import sys import traceback import warnings from distutils import log from distutils.dist import Distribution from distutils.errors import DistutilsOptionError, DistutilsModuleError from distutils.core import Extension from distutils.core import Command from distutils.command.sdist import sdist as DistutilsSdist from setuptools import find_packages as _find_packages from .distutils_helpers import (add_command_option, get_compiler_option, get_dummy_distribution, get_distutils_build_option, get_distutils_build_or_install_option) from .version_helpers import get_pkg_version_module from .utils import (walk_skip_hidden, import_file, extends_doc, resolve_name, AstropyDeprecationWarning) from .commands.build_ext import generate_build_ext_command from .commands.build_py import AstropyBuildPy from .commands.install import AstropyInstall from .commands.install_lib import AstropyInstallLib from .commands.register import AstropyRegister from .commands.test import AstropyTest # These imports are not used in this module, but are included for backwards # compat with older versions of this module from .utils import get_numpy_include_path, write_if_different # noqa from .commands.build_ext import should_build_with_cython, get_compiler_version # noqa _module_state = {'registered_commands': None, 'have_sphinx': False, 'package_cache': None, 'exclude_packages': set(), 'excludes_too_late': False} try: import sphinx # noqa _module_state['have_sphinx'] = True except ValueError as e: # This can occur deep in the bowels of Sphinx's imports by way of docutils # and an occurrence of this bug: http://bugs.python.org/issue18378 # In this case sphinx is effectively unusable if 'unknown locale' in e.args[0]: log.warn( "Possible misconfiguration of one of the environment variables " "LC_ALL, LC_CTYPES, LANG, or LANGUAGE. For an example of how to " "configure your system's language environment on OSX see " "http://blog.remibergsma.com/2012/07/10/" "setting-locales-correctly-on-mac-osx-terminal-application/") except ImportError: pass except SyntaxError: # occurs if markupsafe is recent version, which doesn't support Python 3.2 pass PY3 = sys.version_info[0] >= 3 # This adds a new keyword to the setup() function Distribution.skip_2to3 = [] def adjust_compiler(package): """ This function detects broken compilers and switches to another. If the environment variable CC is explicitly set, or a compiler is specified on the commandline, no override is performed -- the purpose here is to only override a default compiler. The specific compilers with problems are: * The default compiler in XCode-4.2, llvm-gcc-4.2, segfaults when compiling wcslib. The set of broken compilers can be updated by changing the compiler_mapping variable. It is a list of 2-tuples where the first in the pair is a regular expression matching the version of the broken compiler, and the second is the compiler to change to. 
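    An illustrative entry in such a mapping (the pattern and replacement shown
    here are only an example) might be::

        compiler_mapping = [
            ('i686-apple-darwin[0-9]*-llvm-gcc-4.2', 'clang'),
        ]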
""" warnings.warn( 'Direct use of the adjust_compiler function in setup.py is ' 'deprecated and can be removed from your setup.py. This ' 'functionality is now incorporated directly into the build_ext ' 'command.', AstropyDeprecationWarning) def get_debug_option(packagename): """ Determines if the build is in debug mode. Returns ------- debug : bool True if the current build was started with the debug option, False otherwise. """ try: current_debug = get_pkg_version_module(packagename, fromlist=['debug'])[0] except (ImportError, AttributeError): current_debug = None # Only modify the debug flag if one of the build commands was explicitly # run (i.e. not as a sub-command of something else) dist = get_dummy_distribution() if any(cmd in dist.commands for cmd in ['build', 'build_ext']): debug = bool(get_distutils_build_option('debug')) else: debug = bool(current_debug) if current_debug is not None and current_debug != debug: build_ext_cmd = dist.get_command_class('build_ext') build_ext_cmd.force_rebuild = True return debug def add_exclude_packages(excludes): if _module_state['excludes_too_late']: raise RuntimeError( "add_package_excludes must be called before all other setup helper " "functions in order to properly handle excluded packages") _module_state['exclude_packages'].update(set(excludes)) def register_commands(package, version, release, srcdir='.'): if _module_state['registered_commands'] is not None: return _module_state['registered_commands'] if _module_state['have_sphinx']: try: from .commands.build_sphinx import (AstropyBuildSphinx, AstropyBuildDocs) except ImportError: AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx else: AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx _module_state['registered_commands'] = registered_commands = { 'test': generate_test_command(package), # Use distutils' sdist because it respects package_data. # setuptools/distributes sdist requires duplication of information in # MANIFEST.in 'sdist': DistutilsSdist, # The exact form of the build_ext command depends on whether or not # we're building a release version 'build_ext': generate_build_ext_command(package, release), # We have a custom build_py to generate the default configuration file 'build_py': AstropyBuildPy, # Since install can (in some circumstances) be run without # first building, we also need to override install and # install_lib. See #2223 'install': AstropyInstall, 'install_lib': AstropyInstallLib, 'register': AstropyRegister, 'build_sphinx': AstropyBuildSphinx, 'build_docs': AstropyBuildDocs } # Need to override the __name__ here so that the commandline options are # presented as being related to the "build" command, for example; normally # this wouldn't be necessary since commands also have a command_name # attribute, but there is a bug in distutils' help display code that it # uses __name__ instead of command_name. Yay distutils! for name, cls in registered_commands.items(): cls.__name__ = name # Add a few custom options; more of these can be added by specific packages # later for option in [ ('use-system-libraries', "Use system libraries whenever possible", True)]: add_command_option('build', *option) add_command_option('install', *option) add_command_hooks(registered_commands, srcdir=srcdir) return registered_commands def add_command_hooks(commands, srcdir='.'): """ Look through setup_package.py modules for functions with names like ``pre__hook`` and ``post__hook`` where ```` is the name of a ``setup.py`` command (e.g. build_ext). 
If either hook is present this adds a wrapped version of that command to the passed in ``commands`` `dict`. ``commands`` may be pre-populated with other custom distutils command classes that should be wrapped if there are hooks for them (e.g. `AstropyBuildPy`). """ hook_re = re.compile(r'^(pre|post)_(.+)_hook$') # Distutils commands have a method of the same name, but it is not a # *classmethod* (which probably didn't exist when distutils was first # written) def get_command_name(cmdcls): if hasattr(cmdcls, 'command_name'): return cmdcls.command_name else: return cmdcls.__name__ packages = filter_packages(find_packages(srcdir)) dist = get_dummy_distribution() hooks = collections.defaultdict(dict) for setuppkg in iter_setup_packages(srcdir, packages): for name, obj in vars(setuppkg).items(): match = hook_re.match(name) if not match: continue hook_type = match.group(1) cmd_name = match.group(2) if hook_type not in hooks[cmd_name]: hooks[cmd_name][hook_type] = [] hooks[cmd_name][hook_type].append((setuppkg.__name__, obj)) for cmd_name, cmd_hooks in hooks.items(): commands[cmd_name] = generate_hooked_command( cmd_name, dist.get_command_class(cmd_name), cmd_hooks) def generate_hooked_command(cmd_name, cmd_cls, hooks): """ Returns a generated subclass of ``cmd_cls`` that runs the pre- and post-command hooks for that command before and after the ``cmd_cls.run`` method. """ def run(self, orig_run=cmd_cls.run): self.run_command_hooks('pre_hooks') orig_run(self) self.run_command_hooks('post_hooks') return type(cmd_name, (cmd_cls, object), {'run': run, 'run_command_hooks': run_command_hooks, 'pre_hooks': hooks.get('pre', []), 'post_hooks': hooks.get('post', [])}) def run_command_hooks(cmd_obj, hook_kind): """Run hooks registered for that command and phase. *cmd_obj* is a finalized command object; *hook_kind* is either 'pre_hook' or 'post_hook'. """ hooks = getattr(cmd_obj, hook_kind, None) if not hooks: return for modname, hook in hooks: if isinstance(hook, str): try: hook_obj = resolve_name(hook) except ImportError as exc: raise DistutilsModuleError( 'cannot find hook {0}: {1}'.format(hook, exc)) else: hook_obj = hook if not callable(hook_obj): raise DistutilsOptionError('hook {0!r} is not callable' % hook) log.info('running {0} from {1} for {2} command'.format( hook_kind.rstrip('s'), modname, cmd_obj.get_command_name())) try: hook_obj(cmd_obj) except Exception: log.error('{0} command hook {1} raised an exception: %s\n'.format( hook_obj.__name__, cmd_obj.get_command_name())) log.error(traceback.format_exc()) sys.exit(1) def generate_test_command(package_name): """ Creates a custom 'test' command for the given package which sets the command's ``package_name`` class attribute to the name of the package being tested. """ return type(package_name.title() + 'Test', (AstropyTest,), {'package_name': package_name}) def update_package_files(srcdir, extensions, package_data, packagenames, package_dirs): """ This function is deprecated and maintained for backward compatibility with affiliated packages. Affiliated packages should update their setup.py to use `get_package_info` instead. """ info = get_package_info(srcdir) extensions.extend(info['ext_modules']) package_data.update(info['package_data']) packagenames = list(set(packagenames + info['packages'])) package_dirs.update(info['package_dir']) def get_package_info(srcdir='.', exclude=()): """ Collates all of the information for building all subpackages and returns a dictionary of keyword arguments that can be passed directly to `distutils.setup`. 
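    For example, a project's ``setup.py`` might call it roughly like this
    (package name and version are illustrative)::

        from setuptools import setup
        from astropy_helpers.setup_helpers import get_package_info

        setup(name='mypackage', version='0.1.dev0', **get_package_info())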
The purpose of this function is to allow subpackages to update the arguments to the package's ``setup()`` function in its setup.py script, rather than having to specify all extensions/package data directly in the ``setup.py``. See Astropy's own ``setup.py`` for example usage and the Astropy development docs for more details. This function obtains that information by iterating through all packages in ``srcdir`` and locating a ``setup_package.py`` module. This module can contain the following functions: ``get_extensions()``, ``get_package_data()``, ``get_build_options()``, ``get_external_libraries()``, and ``requires_2to3()``. Each of those functions take no arguments. - ``get_extensions`` returns a list of `distutils.extension.Extension` objects. - ``get_package_data()`` returns a dict formatted as required by the ``package_data`` argument to ``setup()``. - ``get_build_options()`` returns a list of tuples describing the extra build options to add. - ``get_external_libraries()`` returns a list of libraries that can optionally be built using external dependencies. - ``get_entry_points()`` returns a dict formatted as required by the ``entry_points`` argument to ``setup()``. - ``requires_2to3()`` should return `True` when the source code requires `2to3` processing to run on Python 3.x. If ``requires_2to3()`` is missing, it is assumed to return `True`. """ ext_modules = [] packages = [] package_data = {} package_dir = {} skip_2to3 = [] if exclude: warnings.warn( "Use of the exclude parameter is no longer supported since it does " "not work as expected. Use add_exclude_packages instead. Note that " "it must be called prior to any other calls from setup helpers.", AstropyDeprecationWarning) # Use the find_packages tool to locate all packages and modules packages = filter_packages(find_packages(srcdir, exclude=exclude)) # Update package_dir if the package lies in a subdirectory if srcdir != '.': package_dir[''] = srcdir # For each of the setup_package.py modules, extract any # information that is needed to install them. The build options # are extracted first, so that their values will be available in # subsequent calls to `get_extensions`, etc. for setuppkg in iter_setup_packages(srcdir, packages): if hasattr(setuppkg, 'get_build_options'): options = setuppkg.get_build_options() for option in options: add_command_option('build', *option) if hasattr(setuppkg, 'get_external_libraries'): libraries = setuppkg.get_external_libraries() for library in libraries: add_external_library(library) if hasattr(setuppkg, 'requires_2to3'): requires_2to3 = setuppkg.requires_2to3() else: requires_2to3 = True if not requires_2to3: skip_2to3.append( os.path.dirname(setuppkg.__file__)) for setuppkg in iter_setup_packages(srcdir, packages): # get_extensions must include any Cython extensions by their .pyx # filename. if hasattr(setuppkg, 'get_extensions'): ext_modules.extend(setuppkg.get_extensions()) if hasattr(setuppkg, 'get_package_data'): package_data.update(setuppkg.get_package_data()) # Locate any .pyx files not already specified, and add their extensions in. # The default include dirs include numpy to facilitate numerical work. 
ext_modules.extend(get_cython_extensions(srcdir, packages, ext_modules, ['numpy'])) # Now remove extensions that have the special name 'skip_cython', as they # exist Only to indicate that the cython extensions shouldn't be built for i, ext in reversed(list(enumerate(ext_modules))): if ext.name == 'skip_cython': del ext_modules[i] # On Microsoft compilers, we need to pass the '/MANIFEST' # commandline argument. This was the default on MSVC 9.0, but is # now required on MSVC 10.0, but it doesn't seem to hurt to add # it unconditionally. if get_compiler_option() == 'msvc': for ext in ext_modules: ext.extra_link_args.append('/MANIFEST') return { 'ext_modules': ext_modules, 'packages': packages, 'package_dir': package_dir, 'package_data': package_data, 'skip_2to3': skip_2to3 } def iter_setup_packages(srcdir, packages): """ A generator that finds and imports all of the ``setup_package.py`` modules in the source packages. Returns ------- modgen : generator A generator that yields (modname, mod), where `mod` is the module and `modname` is the module name for the ``setup_package.py`` modules. """ for packagename in packages: package_parts = packagename.split('.') package_path = os.path.join(srcdir, *package_parts) setup_package = os.path.relpath( os.path.join(package_path, 'setup_package.py')) if os.path.isfile(setup_package): module = import_file(setup_package, name=packagename + '.setup_package') yield module def iter_pyx_files(package_dir, package_name): """ A generator that yields Cython source files (ending in '.pyx') in the source packages. Returns ------- pyxgen : generator A generator that yields (extmod, fullfn) where `extmod` is the full name of the module that the .pyx file would live in based on the source directory structure, and `fullfn` is the path to the .pyx file. """ for dirpath, dirnames, filenames in walk_skip_hidden(package_dir): for fn in filenames: if fn.endswith('.pyx'): fullfn = os.path.relpath(os.path.join(dirpath, fn)) # Package must match file name extmod = '.'.join([package_name, fn[:-4]]) yield (extmod, fullfn) break # Don't recurse into subdirectories def get_cython_extensions(srcdir, packages, prevextensions=tuple(), extincludedirs=None): """ Looks for Cython files and generates Extensions if needed. Parameters ---------- srcdir : str Path to the root of the source directory to search. prevextensions : list of `~distutils.core.Extension` objects The extensions that are already defined. Any .pyx files already here will be ignored. extincludedirs : list of str or None Directories to include as the `include_dirs` argument to the generated `~distutils.core.Extension` objects. Returns ------- exts : list of `~distutils.core.Extension` objects The new extensions that are needed to compile all .pyx files (does not include any already in `prevextensions`). """ # Vanilla setuptools and old versions of distribute include Cython files # as .c files in the sources, not .pyx, so we cannot simply look for # existing .pyx sources in the previous sources, but we should also check # for .c files with the same remaining filename. So we look for .pyx and # .c files, and we strip the extension. 
prevsourcepaths = [] ext_modules = [] for ext in prevextensions: for s in ext.sources: if s.endswith(('.pyx', '.c', '.cpp')): sourcepath = os.path.realpath(os.path.splitext(s)[0]) prevsourcepaths.append(sourcepath) for package_name in packages: package_parts = package_name.split('.') package_path = os.path.join(srcdir, *package_parts) for extmod, pyxfn in iter_pyx_files(package_path, package_name): sourcepath = os.path.realpath(os.path.splitext(pyxfn)[0]) if sourcepath not in prevsourcepaths: ext_modules.append(Extension(extmod, [pyxfn], include_dirs=extincludedirs)) return ext_modules class DistutilsExtensionArgs(collections.defaultdict): """ A special dictionary whose default values are the empty list. This is useful for building up a set of arguments for `distutils.Extension` without worrying whether the entry is already present. """ def __init__(self, *args, **kwargs): def default_factory(): return [] super(DistutilsExtensionArgs, self).__init__( default_factory, *args, **kwargs) def update(self, other): for key, val in other.items(): self[key].extend(val) def pkg_config(packages, default_libraries, executable='pkg-config'): """ Uses pkg-config to update a set of distutils Extension arguments to include the flags necessary to link against the given packages. If the pkg-config lookup fails, default_libraries is applied to libraries. Parameters ---------- packages : list of str A list of pkg-config packages to look up. default_libraries : list of str A list of library names to use if the pkg-config lookup fails. Returns ------- config : dict A dictionary containing keyword arguments to `distutils.Extension`. These entries include: - ``include_dirs``: A list of include directories - ``library_dirs``: A list of library directories - ``libraries``: A list of libraries - ``define_macros``: A list of macro defines - ``undef_macros``: A list of macros to undefine - ``extra_compile_args``: A list of extra arguments to pass to the compiler """ flag_map = {'-I': 'include_dirs', '-L': 'library_dirs', '-l': 'libraries', '-D': 'define_macros', '-U': 'undef_macros'} command = "{0} --libs --cflags {1}".format(executable, ' '.join(packages)), result = DistutilsExtensionArgs() try: pipe = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE) output = pipe.communicate()[0].strip() except subprocess.CalledProcessError as e: lines = [ ("{0} failed. This may cause the build to fail below." .format(executable)), " command: {0}".format(e.cmd), " returncode: {0}".format(e.returncode), " output: {0}".format(e.output) ] log.warn('\n'.join(lines)) result['libraries'].extend(default_libraries) else: if pipe.returncode != 0: lines = [ "pkg-config could not lookup up package(s) {0}.".format( ", ".join(packages)), "This may cause the build to fail below." ] log.warn('\n'.join(lines)) result['libraries'].extend(default_libraries) else: for token in output.split(): # It's not clear what encoding the output of # pkg-config will come to us in. It will probably be # some combination of pure ASCII (for the compiler # flags) and the filesystem encoding (for any argument # that includes directories or filenames), but this is # just conjecture, as the pkg-config documentation # doesn't seem to address it. 
arg = token[:2].decode('ascii') value = token[2:].decode(sys.getfilesystemencoding()) if arg in flag_map: if arg == '-D': value = tuple(value.split('=', 1)) result[flag_map[arg]].append(value) else: result['extra_compile_args'].append(value) return result def add_external_library(library): """ Add a build option for selecting the internal or system copy of a library. Parameters ---------- library : str The name of the library. If the library is `foo`, the build option will be called `--use-system-foo`. """ for command in ['build', 'build_ext', 'install']: add_command_option(command, str('use-system-' + library), 'Use the system {0} library'.format(library), is_bool=True) def use_system_library(library): """ Returns `True` if the build configuration indicates that the given library should use the system copy of the library rather than the internal one. For the given library `foo`, this will be `True` if `--use-system-foo` or `--use-system-libraries` was provided at the commandline or in `setup.cfg`. Parameters ---------- library : str The name of the library Returns ------- use_system : bool `True` if the build should use the system copy of the library. """ return ( get_distutils_build_or_install_option('use_system_{0}'.format(library)) or get_distutils_build_or_install_option('use_system_libraries')) @extends_doc(_find_packages) def find_packages(where='.', exclude=(), invalidate_cache=False): """ This version of ``find_packages`` caches previous results to speed up subsequent calls. Use ``invalide_cache=True`` to ignore cached results from previous ``find_packages`` calls, and repeat the package search. """ if exclude: warnings.warn( "Use of the exclude parameter is no longer supported since it does " "not work as expected. Use add_exclude_packages instead. Note that " "it must be called prior to any other calls from setup helpers.", AstropyDeprecationWarning) # Calling add_exclude_packages after this point will have no effect _module_state['excludes_too_late'] = True if not invalidate_cache and _module_state['package_cache'] is not None: return _module_state['package_cache'] packages = _find_packages( where=where, exclude=list(_module_state['exclude_packages'])) _module_state['package_cache'] = packages return packages def filter_packages(packagenames): """ Removes some packages from the package list that shouldn't be installed on the current version of Python. 
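    For example (illustrative package names; the result shown assumes
    Python 3)::

        filter_packages(['mypkg', 'mypkg.io_py2', 'mypkg.io_py3'])
        # -> ['mypkg', 'mypkg.io_py3']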
""" if PY3: exclude = '_py2' else: exclude = '_py3' return [x for x in packagenames if not x.endswith(exclude)] class FakeBuildSphinx(Command): """ A dummy build_sphinx command that is called if Sphinx is not installed and displays a relevant error message """ # user options inherited from sphinx.setup_command.BuildDoc user_options = [ ('fresh-env', 'E', ''), ('all-files', 'a', ''), ('source-dir=', 's', ''), ('build-dir=', None, ''), ('config-dir=', 'c', ''), ('builder=', 'b', ''), ('project=', None, ''), ('version=', None, ''), ('release=', None, ''), ('today=', None, ''), ('link-index', 'i', '')] # user options appended in astropy.setup_helpers.AstropyBuildSphinx user_options.append(('warnings-returncode', 'w', '')) user_options.append(('clean-docs', 'l', '')) user_options.append(('no-intersphinx', 'n', '')) user_options.append(('open-docs-in-browser', 'o', '')) def initialize_options(self): try: raise RuntimeError("Sphinx and its dependencies must be installed " "for build_docs.") except: log.error('error: Sphinx and its dependencies must be installed ' 'for build_docs.') sys.exit(1) pydl-0.7.0/astropy_helpers/astropy_helpers/conftest.py0000644000076500000240000000315713434074306023761 0ustar weaverstaff00000000000000# This file contains settings for pytest that are specific to astropy-helpers. # Since we run many of the tests in sub-processes, we need to collect coverage # data inside each subprocess and then combine it into a single .coverage file. # To do this we set up a list which run_setup appends coverage objects to. # This is not intended to be used by packages other than astropy-helpers. import os from collections import defaultdict try: from coverage import CoverageData except ImportError: HAS_COVERAGE = False else: HAS_COVERAGE = True if HAS_COVERAGE: SUBPROCESS_COVERAGE = [] def pytest_configure(config): if HAS_COVERAGE: SUBPROCESS_COVERAGE[:] = [] def pytest_unconfigure(config): if HAS_COVERAGE: # We create an empty coverage data object combined_cdata = CoverageData() lines = defaultdict(list) for cdata in SUBPROCESS_COVERAGE: # For each CoverageData object, we go through all the files and # change the filename from one which might be a temporary path # to the local filename. We then only keep files that actually # exist. for filename in cdata.measured_files(): try: pos = filename.rindex('astropy_helpers') except ValueError: continue short_filename = filename[pos:] if os.path.exists(short_filename): lines[os.path.abspath(short_filename)].extend(cdata.lines(filename)) combined_cdata.add_lines(lines) combined_cdata.write_file('.coverage.subprocess') pydl-0.7.0/astropy_helpers/astropy_helpers/compat/0000755000076500000240000000000013434104632023033 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/compat/__init__.py0000644000076500000240000000056013434074306025151 0ustar weaverstaff00000000000000def _fix_user_options(options): """ This is for Python 2.x and 3.x compatibility. distutils expects Command options to all be byte strings on Python 2 and Unicode strings on Python 3. 
""" def to_str_or_none(x): if x is None: return None return str(x) return [tuple(to_str_or_none(x) for x in y) for y in options] pydl-0.7.0/astropy_helpers/astropy_helpers/version_helpers.py0000644000076500000240000002346313434074306025345 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utilities for generating the version string for Astropy (or an affiliated package) and the version.py module, which contains version info for the package. Within the generated astropy.version module, the `major`, `minor`, and `bugfix` variables hold the respective parts of the version number (bugfix is '0' if absent). The `release` variable is True if this is a release, and False if this is a development version of astropy. For the actual version string, use:: from astropy.version import version or:: from astropy import __version__ """ from __future__ import division import datetime import imp import os import pkgutil import sys import time from distutils import log import pkg_resources from . import git_helpers from .distutils_helpers import is_distutils_display_option from .utils import invalidate_caches PY3 = sys.version_info[0] == 3 def _version_split(version): """ Split a version string into major, minor, and bugfix numbers. If any of those numbers are missing the default is zero. Any pre/post release modifiers are ignored. Examples ======== >>> _version_split('1.2.3') (1, 2, 3) >>> _version_split('1.2') (1, 2, 0) >>> _version_split('1.2rc1') (1, 2, 0) >>> _version_split('1') (1, 0, 0) >>> _version_split('') (0, 0, 0) """ parsed_version = pkg_resources.parse_version(version) if hasattr(parsed_version, 'base_version'): # New version parsing for setuptools >= 8.0 if parsed_version.base_version: parts = [int(part) for part in parsed_version.base_version.split('.')] else: parts = [] else: parts = [] for part in parsed_version: if part.startswith('*'): # Ignore any .dev, a, b, rc, etc. break parts.append(int(part)) if len(parts) < 3: parts += [0] * (3 - len(parts)) # In principle a version could have more parts (like 1.2.3.4) but we only # support .. return tuple(parts[:3]) # This is used by setup.py to create a new version.py - see that file for # details. Note that the imports have to be absolute, since this is also used # by affiliated packages. _FROZEN_VERSION_PY_TEMPLATE = """ # Autogenerated by {packagetitle}'s setup.py on {timestamp!s} UTC from __future__ import unicode_literals import datetime {header} major = {major} minor = {minor} bugfix = {bugfix} release = {rel} timestamp = {timestamp!r} debug = {debug} astropy_helpers_version = "{ahver}" try: from ._compiler import compiler except ImportError: compiler = "unknown" try: from .cython_version import cython_version except ImportError: cython_version = "unknown" """[1:] _FROZEN_VERSION_PY_WITH_GIT_HEADER = """ {git_helpers} _packagename = "{packagename}" _last_generated_version = "{verstr}" _last_githash = "{githash}" # Determine where the source code for this module # lives. If __file__ is not a filesystem path then # it is assumed not to live in a git repo at all. 
if _get_repo_path(__file__, levels=len(_packagename.split('.'))): version = update_git_devstr(_last_generated_version, path=__file__) githash = get_git_devstr(sha=True, show_warning=False, path=__file__) or _last_githash else: # The file does not appear to live in a git repo so don't bother # invoking git version = _last_generated_version githash = _last_githash """[1:] _FROZEN_VERSION_PY_STATIC_HEADER = """ version = "{verstr}" githash = "{githash}" """[1:] def _get_version_py_str(packagename, version, githash, release, debug, uses_git=True): try: from astropy_helpers import __version__ as ahver except ImportError: ahver = "unknown" epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time())) timestamp = datetime.datetime.utcfromtimestamp(epoch) major, minor, bugfix = _version_split(version) if packagename.lower() == 'astropy': packagetitle = 'Astropy' else: packagetitle = 'Astropy-affiliated package ' + packagename header = '' if uses_git: header = _generate_git_header(packagename, version, githash) elif not githash: # _generate_git_header will already generate a new git has for us, but # for creating a new version.py for a release (even if uses_git=False) # we still need to get the githash to include in the version.py # See https://github.com/astropy/astropy-helpers/issues/141 githash = git_helpers.get_git_devstr(sha=True, show_warning=True) if not header: # If _generate_git_header fails it returns an empty string header = _FROZEN_VERSION_PY_STATIC_HEADER.format(verstr=version, githash=githash) return _FROZEN_VERSION_PY_TEMPLATE.format(packagetitle=packagetitle, timestamp=timestamp, header=header, major=major, minor=minor, bugfix=bugfix, ahver=ahver, rel=release, debug=debug) def _generate_git_header(packagename, version, githash): """ Generates a header to the version.py module that includes utilities for probing the git repository for updates (to the current git hash, etc.) These utilities should only be available in development versions, and not in release builds. If this fails for any reason an empty string is returned. """ loader = pkgutil.get_loader(git_helpers) source = loader.get_source(git_helpers.__name__) or '' source_lines = source.splitlines() if not source_lines: log.warn('Cannot get source code for astropy_helpers.git_helpers; ' 'git support disabled.') return '' idx = 0 for idx, line in enumerate(source_lines): if line.startswith('# BEGIN'): break git_helpers_py = '\n'.join(source_lines[idx + 1:]) if PY3: verstr = version else: # In Python 2 don't pass in a unicode string; otherwise verstr will # be represented with u'' syntax which breaks on Python 3.x with x # < 3. 
This is only an issue when developing on multiple Python # versions at once verstr = version.encode('utf8') new_githash = git_helpers.get_git_devstr(sha=True, show_warning=False) if new_githash: githash = new_githash return _FROZEN_VERSION_PY_WITH_GIT_HEADER.format( git_helpers=git_helpers_py, packagename=packagename, verstr=verstr, githash=githash) def generate_version_py(packagename, version, release=None, debug=None, uses_git=True, srcdir='.'): """Regenerate the version.py module if necessary.""" try: version_module = get_pkg_version_module(packagename) try: last_generated_version = version_module._last_generated_version except AttributeError: last_generated_version = version_module.version try: last_githash = version_module._last_githash except AttributeError: last_githash = version_module.githash current_release = version_module.release current_debug = version_module.debug except ImportError: version_module = None last_generated_version = None last_githash = None current_release = None current_debug = None if release is None: # Keep whatever the current value is, if it exists release = bool(current_release) if debug is None: # Likewise, keep whatever the current value is, if it exists debug = bool(current_debug) package_srcdir = os.path.join(srcdir, *packagename.split('.')) version_py = os.path.join(package_srcdir, 'version.py') if (last_generated_version != version or current_release != release or current_debug != debug): if '-q' not in sys.argv and '--quiet' not in sys.argv: log.set_threshold(log.INFO) if is_distutils_display_option(): # Always silence unnecessary log messages when display options are # being used log.set_threshold(log.WARN) log.info('Freezing version number to {0}'.format(version_py)) with open(version_py, 'w') as f: # This overwrites the actual version.py f.write(_get_version_py_str(packagename, version, last_githash, release, debug, uses_git=uses_git)) invalidate_caches() if version_module: imp.reload(version_module) def get_pkg_version_module(packagename, fromlist=None): """Returns the package's .version module generated by `astropy_helpers.version_helpers.generate_version_py`. Raises an ImportError if the version module is not found. If ``fromlist`` is an iterable, return a tuple of the members of the version module corresponding to the member names given in ``fromlist``. Raises an `AttributeError` if any of these module members are not found. """ if not fromlist: # Due to a historical quirk of Python's import implementation, # __import__ will not return submodules of a package if 'fromlist' is # empty. 
# TODO: For Python 3.1 and up it may be preferable to use importlib # instead of the __import__ builtin return __import__(packagename + '.version', fromlist=['']) else: mod = __import__(packagename + '.version', fromlist=fromlist) return tuple(getattr(mod, member) for member in fromlist) pydl-0.7.0/astropy_helpers/astropy_helpers/version.py0000644000076500000240000000106613434104217023611 0ustar weaverstaff00000000000000# Autogenerated by Astropy-affiliated package astropy_helpers's setup.py on 2019-02-22 23:41:03 UTC from __future__ import unicode_literals import datetime version = "2.0.8" githash = "231c409a632dcbf2beae1c2dea5b843d81ede511" major = 2 minor = 0 bugfix = 8 release = True timestamp = datetime.datetime(2019, 2, 22, 23, 41, 3) debug = False astropy_helpers_version = "" try: from ._compiler import compiler except ImportError: compiler = "unknown" try: from .cython_version import cython_version except ImportError: cython_version = "unknown" pydl-0.7.0/astropy_helpers/astropy_helpers/openmp_helpers.py0000644000076500000240000000610513434074306025150 0ustar weaverstaff00000000000000# This module defines functions that can be used to check whether OpenMP is # available and if so what flags to use. To use this, import the # add_openmp_flags_if_available function in a setup_package.py file where you # are defining your extensions: # # from astropy_helpers.openmp_helpers import add_openmp_flags_if_available # # then call it with a single extension as the only argument: # # add_openmp_flags_if_available(extension) # # this will add the OpenMP flags if available. from __future__ import absolute_import, print_function import os import sys import glob import tempfile import subprocess from distutils import log from distutils.ccompiler import new_compiler from distutils.sysconfig import customize_compiler from distutils.errors import CompileError, LinkError from .setup_helpers import get_compiler_option __all__ = ['add_openmp_flags_if_available'] CCODE = """ #include <stdio.h> #include <omp.h> int main(void) { #pragma omp parallel printf("nthreads=%d\\n", omp_get_num_threads()); return 0; } """ def add_openmp_flags_if_available(extension): """ Add OpenMP compilation flags, if available (if not a warning will be printed to the console and no flags will be added) Returns `True` if the flags were added, `False` otherwise.
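    For example, in a ``setup_package.py`` (module and source file names are
    illustrative)::

        from distutils.core import Extension
        from astropy_helpers.openmp_helpers import add_openmp_flags_if_available

        def get_extensions():
            ext = Extension('mypkg.fast', ['mypkg/fast.c'])
            add_openmp_flags_if_available(ext)
            return [ext]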
""" ccompiler = new_compiler() customize_compiler(ccompiler) tmp_dir = tempfile.mkdtemp() start_dir = os.path.abspath('.') if get_compiler_option() == 'msvc': compile_flag = '-openmp' link_flag = '' else: compile_flag = '-fopenmp' link_flag = '-fopenmp' try: os.chdir(tmp_dir) with open('test_openmp.c', 'w') as f: f.write(CCODE) os.mkdir('objects') # Compile, link, and run test program ccompiler.compile(['test_openmp.c'], output_dir='objects', extra_postargs=[compile_flag]) ccompiler.link_executable(glob.glob(os.path.join('objects', '*' + ccompiler.obj_extension)), 'test_openmp', extra_postargs=[link_flag]) output = subprocess.check_output('./test_openmp').decode(sys.stdout.encoding or 'utf-8').splitlines() if 'nthreads=' in output[0]: nthreads = int(output[0].strip().split('=')[1]) if len(output) == nthreads: using_openmp = True else: log.warn("Unexpected number of lines from output of test OpenMP " "program (output was {0})".format(output)) using_openmp = False else: log.warn("Unexpected output from test OpenMP " "program (output was {0})".format(output)) using_openmp = False except (CompileError, LinkError): using_openmp = False finally: os.chdir(start_dir) if using_openmp: log.info("Compiling Cython extension with OpenMP support") extension.extra_compile_args.append(compile_flag) extension.extra_link_args.append(link_flag) else: log.warn("Cannot compile Cython extension with OpenMP, reverting to non-parallel code") return using_openmp pydl-0.7.0/astropy_helpers/astropy_helpers/git_helpers.py0000644000076500000240000001450113434074306024434 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utilities for retrieving revision information from a project's git repository. """ # Do not remove the following comment; it is used by # astropy_helpers.version_helpers to determine the beginning of the code in # this module # BEGIN import locale import os import subprocess import warnings def _decode_stdio(stream): try: stdio_encoding = locale.getdefaultlocale()[1] or 'utf-8' except ValueError: stdio_encoding = 'utf-8' try: text = stream.decode(stdio_encoding) except UnicodeDecodeError: # Final fallback text = stream.decode('latin1') return text def update_git_devstr(version, path=None): """ Updates the git revision string if and only if the path is being imported directly from a git working copy. This ensures that the revision number in the version string is accurate. """ try: # Quick way to determine if we're in git or not - returns '' if not devstr = get_git_devstr(sha=True, show_warning=False, path=path) except OSError: return version if not devstr: # Probably not in git so just pass silently return version if 'dev' in version: # update to the current git revision version_base = version.split('.dev', 1)[0] devstr = get_git_devstr(sha=False, show_warning=False, path=path) return version_base + '.dev' + devstr else: # otherwise it's already the true/release version return version def get_git_devstr(sha=False, show_warning=True, path=None): """ Determines the number of revisions in this repository. Parameters ---------- sha : bool If True, the full SHA1 hash will be returned. Otherwise, the total count of commits in the repository will be used as a "revision number". show_warning : bool If True, issue a warning if git returns an error code, otherwise errors pass silently. path : str or None If a string, specifies the directory to look in to find the git repository. 
If `None`, the current working directory is used, and must be the root of the git repository. If given a filename it uses the directory containing that file. Returns ------- devversion : str Either a string with the revision number (if `sha` is False), the SHA1 hash of the current commit (if `sha` is True), or an empty string if git version info could not be identified. """ if path is None: path = os.getcwd() if not os.path.isdir(path): path = os.path.abspath(os.path.dirname(path)) if sha: # Faster for getting just the hash of HEAD cmd = ['rev-parse', 'HEAD'] else: cmd = ['rev-list', '--count', 'HEAD'] def run_git(cmd): try: p = subprocess.Popen(['git'] + cmd, cwd=path, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE) stdout, stderr = p.communicate() except OSError as e: if show_warning: warnings.warn('Error running git: ' + str(e)) return (None, b'', b'') if p.returncode == 128: if show_warning: warnings.warn('No git repository present at {0!r}! Using ' 'default dev version.'.format(path)) return (p.returncode, b'', b'') if p.returncode == 129: if show_warning: warnings.warn('Your git looks old (does it support {0}?); ' 'consider upgrading to v1.7.2 or ' 'later.'.format(cmd[0])) return (p.returncode, stdout, stderr) elif p.returncode != 0: if show_warning: warnings.warn('Git failed while determining revision ' 'count: {0}'.format(_decode_stdio(stderr))) return (p.returncode, stdout, stderr) return p.returncode, stdout, stderr returncode, stdout, stderr = run_git(cmd) if not sha and returncode == 128: # git returns 128 if the command is not run from within a git # repository tree. In this case, a warning is produced above but we # return the default dev version of '0'. return '0' elif not sha and returncode == 129: # git returns 129 if a command option failed to parse; in # particular this could happen in git versions older than 1.7.2 # where the --count option is not supported # Also use --abbrev-commit and --abbrev=0 to display the minimum # number of characters needed per-commit (rather than the full hash) cmd = ['rev-list', '--abbrev-commit', '--abbrev=0', 'HEAD'] returncode, stdout, stderr = run_git(cmd) # Fall back on the old method of getting all revisions and counting # the lines if returncode == 0: return str(stdout.count(b'\n')) else: return '' elif sha: return _decode_stdio(stdout)[:40] else: return _decode_stdio(stdout).strip() # This function is tested but it is only ever executed within a subprocess when # creating a fake package, so it doesn't get picked up by coverage metrics. def _get_repo_path(pathname, levels=None): # pragma: no cover """ Given a file or directory name, determine the root of the git repository this path is under. If given, this won't look any higher than ``levels`` (that is, if ``levels=0`` then the given path must be the root of the git repository and is returned if so. Returns `None` if the given path could not be determined to belong to a git repo. 
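    For example (illustrative; assumes this file lives somewhere inside a
    git checkout)::

        repo_root = _get_repo_path(__file__, levels=3)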
""" if os.path.isfile(pathname): current_dir = os.path.abspath(os.path.dirname(pathname)) elif os.path.isdir(pathname): current_dir = os.path.abspath(pathname) else: return None current_level = 0 while levels is None or current_level <= levels: if os.path.exists(os.path.join(current_dir, '.git')): return current_dir current_level += 1 if current_dir == os.path.dirname(current_dir): break current_dir = os.path.dirname(current_dir) return None pydl-0.7.0/astropy_helpers/astropy_helpers/distutils_helpers.py0000644000076500000240000001736213434074306025705 0ustar weaverstaff00000000000000""" This module contains various utilities for introspecting the distutils module and the setup process. Some of these utilities require the `astropy_helpers.setup_helpers.register_commands` function to be called first, as it will affect introspection of setuptools command-line arguments. Other utilities in this module do not have that restriction. """ import os import sys from distutils import ccompiler, log from distutils.dist import Distribution from distutils.errors import DistutilsError from .utils import silence # This function, and any functions that call it, require the setup in # `astropy_helpers.setup_helpers.register_commands` to be run first. def get_dummy_distribution(): """ Returns a distutils Distribution object used to instrument the setup environment before calling the actual setup() function. """ from .setup_helpers import _module_state if _module_state['registered_commands'] is None: raise RuntimeError( 'astropy_helpers.setup_helpers.register_commands() must be ' 'called before using ' 'astropy_helpers.setup_helpers.get_dummy_distribution()') # Pre-parse the Distutils command-line options and config files to if # the option is set. dist = Distribution({'script_name': os.path.basename(sys.argv[0]), 'script_args': sys.argv[1:]}) dist.cmdclass.update(_module_state['registered_commands']) with silence(): try: dist.parse_config_files() dist.parse_command_line() except (DistutilsError, AttributeError, SystemExit): # Let distutils handle DistutilsErrors itself AttributeErrors can # get raise for ./setup.py --help SystemExit can be raised if a # display option was used, for example pass return dist def get_distutils_option(option, commands): """ Returns the value of the given distutils option. Parameters ---------- option : str The name of the option commands : list of str The list of commands on which this option is available Returns ------- val : str or None the value of the given distutils option. If the option is not set, returns None. """ dist = get_dummy_distribution() for cmd in commands: cmd_opts = dist.command_options.get(cmd) if cmd_opts is not None and option in cmd_opts: return cmd_opts[option][1] else: return None def get_distutils_build_option(option): """ Returns the value of the given distutils build option. Parameters ---------- option : str The name of the option Returns ------- val : str or None The value of the given distutils build option. If the option is not set, returns None. """ return get_distutils_option(option, ['build', 'build_ext', 'build_clib']) def get_distutils_install_option(option): """ Returns the value of the given distutils install option. Parameters ---------- option : str The name of the option Returns ------- val : str or None The value of the given distutils build option. If the option is not set, returns None. 
""" return get_distutils_option(option, ['install']) def get_distutils_build_or_install_option(option): """ Returns the value of the given distutils build or install option. Parameters ---------- option : str The name of the option Returns ------- val : str or None The value of the given distutils build or install option. If the option is not set, returns None. """ return get_distutils_option(option, ['build', 'build_ext', 'build_clib', 'install']) def get_compiler_option(): """ Determines the compiler that will be used to build extension modules. Returns ------- compiler : str The compiler option specified for the build, build_ext, or build_clib command; or the default compiler for the platform if none was specified. """ compiler = get_distutils_build_option('compiler') if compiler is None: return ccompiler.get_default_compiler() return compiler def add_command_option(command, name, doc, is_bool=False): """ Add a custom option to a setup command. Issues a warning if the option already exists on that command. Parameters ---------- command : str The name of the command as given on the command line name : str The name of the build option doc : str A short description of the option, for the `--help` message is_bool : bool, optional When `True`, the option is a boolean option and doesn't require an associated value. """ dist = get_dummy_distribution() cmdcls = dist.get_command_class(command) if (hasattr(cmdcls, '_astropy_helpers_options') and name in cmdcls._astropy_helpers_options): return attr = name.replace('-', '_') if hasattr(cmdcls, attr): raise RuntimeError( '{0!r} already has a {1!r} class attribute, barring {2!r} from ' 'being usable as a custom option name.'.format(cmdcls, attr, name)) for idx, cmd in enumerate(cmdcls.user_options): if cmd[0] == name: log.warn('Overriding existing {0!r} option ' '{1!r}'.format(command, name)) del cmdcls.user_options[idx] if name in cmdcls.boolean_options: cmdcls.boolean_options.remove(name) break cmdcls.user_options.append((name, None, doc)) if is_bool: cmdcls.boolean_options.append(name) # Distutils' command parsing requires that a command object have an # attribute with the same name as the option (with '-' replaced with '_') # in order for that option to be recognized as valid setattr(cmdcls, attr, None) # This caches the options added through add_command_option so that if it is # run multiple times in the same interpreter repeated adds are ignored # (this way we can still raise a RuntimeError if a custom option overrides # a built-in option) if not hasattr(cmdcls, '_astropy_helpers_options'): cmdcls._astropy_helpers_options = set([name]) else: cmdcls._astropy_helpers_options.add(name) def get_distutils_display_options(): """ Returns a set of all the distutils display options in their long and short forms. These are the setup.py arguments such as --name or --version which print the project's metadata and then exit. Returns ------- opts : set The long and short form display option arguments, including the - or -- """ short_display_opts = set('-' + o[1] for o in Distribution.display_options if o[1]) long_display_opts = set('--' + o[0] for o in Distribution.display_options) # Include -h and --help which are not explicitly listed in # Distribution.display_options (as they are handled by optparse) short_display_opts.add('-h') long_display_opts.add('--help') # This isn't the greatest approach to hardcode these commands. # However, there doesn't seem to be a good way to determine # whether build *will be* run as part of the command at this # phase. 
display_commands = set([ 'clean', 'register', 'setopt', 'saveopts', 'egg_info', 'alias']) return short_display_opts.union(long_display_opts.union(display_commands)) def is_distutils_display_option(): """ Returns True if sys.argv contains any of the distutils display options such as --version or --name. """ display_options = get_distutils_display_options() return bool(set(sys.argv[1:]).intersection(display_options)) pydl-0.7.0/astropy_helpers/astropy_helpers/__init__.py0000644000076500000240000000345413434074306023673 0ustar weaverstaff00000000000000try: from .version import version as __version__ from .version import githash as __githash__ except ImportError: __version__ = '' __githash__ = '' # If we've made it as far as importing astropy_helpers, we don't need # ah_bootstrap in sys.modules anymore. Getting rid of it is actually necessary # if the package we're installing has a setup_requires of another package that # uses astropy_helpers (and possibly a different version at that) # See https://github.com/astropy/astropy/issues/3541 import sys if 'ah_bootstrap' in sys.modules: del sys.modules['ah_bootstrap'] # Note, this is repeated from ah_bootstrap.py, but is here too in case this # astropy-helpers was upgraded to from an older version that did not have this # check in its ah_bootstrap. # matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation. See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except: # Ignore if this fails for *any* reason* pass import os # Ensure that all module-level code in astropy or other packages know that # we're in setup mode: if ('__main__' in sys.modules and hasattr(sys.modules['__main__'], '__file__')): filename = os.path.basename(sys.modules['__main__'].__file__) if filename.rstrip('co') == 'setup.py': if sys.version_info[0] >= 3: import builtins else: import __builtin__ as builtins builtins._ASTROPY_SETUP_ = True del filename pydl-0.7.0/astropy_helpers/astropy_helpers/utils.py0000644000076500000240000006476513434074306023310 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst from __future__ import absolute_import, unicode_literals import contextlib import functools import imp import inspect import os import sys import glob import textwrap import types import warnings try: from importlib import machinery as import_machinery # Python 3.2 does not have SourceLoader if not hasattr(import_machinery, 'SourceLoader'): import_machinery = None except ImportError: import_machinery = None # Python 3.3's importlib caches filesystem reads for faster imports in the # general case. But sometimes it's necessary to manually invalidate those # caches so that the import system can pick up new generated files. See # https://github.com/astropy/astropy/issues/820 if sys.version_info[:2] >= (3, 3): from importlib import invalidate_caches else: def invalidate_caches(): return None # Python 2/3 compatibility if sys.version_info[0] < 3: string_types = (str, unicode) # noqa else: string_types = (str,) # Note: The following Warning subclasses are simply copies of the Warnings in # Astropy of the same names. 
class AstropyWarning(Warning): """ The base warning class from which all Astropy warnings should inherit. Any warning inheriting from this class is handled by the Astropy logger. """ class AstropyDeprecationWarning(AstropyWarning): """ A warning class to indicate a deprecated feature. """ class AstropyPendingDeprecationWarning(PendingDeprecationWarning, AstropyWarning): """ A warning class to indicate a soon-to-be deprecated feature. """ def _get_platlib_dir(cmd): """ Given a build command, return the name of the appropriate platform-specific build subdirectory directory (e.g. build/lib.linux-x86_64-2.7) """ plat_specifier = '.{0}-{1}'.format(cmd.plat_name, sys.version[0:3]) return os.path.join(cmd.build_base, 'lib' + plat_specifier) def get_numpy_include_path(): """ Gets the path to the numpy headers. """ # We need to go through this nonsense in case setuptools # downloaded and installed Numpy for us as part of the build or # install, since Numpy may still think it's in "setup mode", when # in fact we're ready to use it to build astropy now. if sys.version_info[0] >= 3: import builtins if hasattr(builtins, '__NUMPY_SETUP__'): del builtins.__NUMPY_SETUP__ import imp import numpy imp.reload(numpy) else: import __builtin__ if hasattr(__builtin__, '__NUMPY_SETUP__'): del __builtin__.__NUMPY_SETUP__ import numpy reload(numpy) try: numpy_include = numpy.get_include() except AttributeError: numpy_include = numpy.get_numpy_include() return numpy_include class _DummyFile(object): """A noop writeable object.""" errors = '' # Required for Python 3.x def write(self, s): pass def flush(self): pass @contextlib.contextmanager def silence(): """A context manager that silences sys.stdout and sys.stderr.""" old_stdout = sys.stdout old_stderr = sys.stderr sys.stdout = _DummyFile() sys.stderr = _DummyFile() exception_occurred = False try: yield except: exception_occurred = True # Go ahead and clean up so that exception handling can work normally sys.stdout = old_stdout sys.stderr = old_stderr raise if not exception_occurred: sys.stdout = old_stdout sys.stderr = old_stderr if sys.platform == 'win32': import ctypes def _has_hidden_attribute(filepath): """ Returns True if the given filepath has the hidden attribute on MS-Windows. Based on a post here: http://stackoverflow.com/questions/284115/cross-platform-hidden-file-detection """ if isinstance(filepath, bytes): filepath = filepath.decode(sys.getfilesystemencoding()) try: attrs = ctypes.windll.kernel32.GetFileAttributesW(filepath) assert attrs != -1 result = bool(attrs & 2) except (AttributeError, AssertionError): result = False return result else: def _has_hidden_attribute(filepath): return False def is_path_hidden(filepath): """ Determines if a given file or directory is hidden. Parameters ---------- filepath : str The path to a file or directory Returns ------- hidden : bool Returns `True` if the file is hidden """ name = os.path.basename(os.path.abspath(filepath)) if isinstance(name, bytes): is_dotted = name.startswith(b'.') else: is_dotted = name.startswith('.') return is_dotted or _has_hidden_attribute(filepath) def walk_skip_hidden(top, onerror=None, followlinks=False): """ A wrapper for `os.walk` that skips hidden files and directories. This function does not have the parameter `topdown` from `os.walk`: the directories must always be recursed top-down when using this function. 
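    For example (illustrative)::

        for root, dirs, files in walk_skip_hidden('.'):
            print(root, files)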
See also -------- os.walk : For a description of the parameters """ for root, dirs, files in os.walk( top, topdown=True, onerror=onerror, followlinks=followlinks): # These lists must be updated in-place so os.walk will skip # hidden directories dirs[:] = [d for d in dirs if not is_path_hidden(d)] files[:] = [f for f in files if not is_path_hidden(f)] yield root, dirs, files def write_if_different(filename, data): """Write `data` to `filename`, if the content of the file is different. Parameters ---------- filename : str The file name to be written to. data : bytes The data to be written to `filename`. """ assert isinstance(data, bytes) if os.path.exists(filename): with open(filename, 'rb') as fd: original_data = fd.read() else: original_data = None if original_data != data: with open(filename, 'wb') as fd: fd.write(data) def import_file(filename, name=None): """ Imports a module from a single file as if it doesn't belong to a particular package. The returned module will have the optional ``name`` if given, or else a name generated from the filename. """ # Specifying a traditional dot-separated fully qualified name here # results in a number of "Parent module 'astropy' not found while # handling absolute import" warnings. Using the same name, the # namespaces of the modules get merged together. So, this # generates an underscore-separated name which is more likely to # be unique, and it doesn't really matter because the name isn't # used directly here anyway. mode = 'U' if sys.version_info[0] < 3 else 'r' if name is None: basename = os.path.splitext(filename)[0] name = '_'.join(os.path.relpath(basename).split(os.sep)[1:]) if import_machinery: loader = import_machinery.SourceFileLoader(name, filename) mod = loader.load_module() else: with open(filename, mode) as fd: mod = imp.load_module(name, fd, filename, ('.py', mode, 1)) return mod def resolve_name(name): """Resolve a name like ``module.object`` to an object and return it. Raise `ImportError` if the module or name is not found. """ parts = name.split('.') cursor = len(parts) - 1 module_name = parts[:cursor] attr_name = parts[-1] while cursor > 0: try: ret = __import__('.'.join(module_name), fromlist=[attr_name]) break except ImportError: if cursor == 0: raise cursor -= 1 module_name = parts[:cursor] attr_name = parts[cursor] ret = '' for part in parts[cursor:]: try: ret = getattr(ret, part) except AttributeError: raise ImportError(name) return ret if sys.version_info[0] >= 3: def iteritems(dictionary): return dictionary.items() else: def iteritems(dictionary): return dictionary.iteritems() def extends_doc(extended_func): """ A function decorator for use when wrapping an existing function but adding additional functionality. This copies the docstring from the original function, and appends to it (along with a newline) the docstring of the wrapper function. Examples -------- >>> def foo(): ... '''Hello.''' ... >>> @extends_doc(foo) ... def bar(): ... '''Goodbye.''' ... >>> print(bar.__doc__) Hello. Goodbye. """ def decorator(func): if not (extended_func.__doc__ is None or func.__doc__ is None): func.__doc__ = '\n\n'.join([extended_func.__doc__.rstrip('\n'), func.__doc__.lstrip('\n')]) return func return decorator # Duplicated from astropy.utils.decorators.deprecated # When fixing issues in this function fix them in astropy first, then # port the fixes over to astropy-helpers def deprecated(since, message='', name='', alternative='', pending=False, obj_type=None): """ Used to mark a function or class as deprecated. 
To mark an attribute as deprecated, use `deprecated_attribute`. Parameters ---------- since : str The release at which this API became deprecated. This is required. message : str, optional Override the default deprecation message. The format specifier ``func`` may be used for the name of the function, and ``alternative`` may be used in the deprecation message to insert the name of an alternative to the deprecated function. ``obj_type`` may be used to insert a friendly name for the type of object being deprecated. name : str, optional The name of the deprecated function or class; if not provided the name is automatically determined from the passed in function or class; providing it explicitly is useful in the case of renamed functions, where the new function is just assigned to the name of the deprecated function. For example:: def new_function(): ... oldFunction = new_function alternative : str, optional An alternative function or class name that the user may use in place of the deprecated object. The deprecation warning will tell the user about this alternative if provided. pending : bool, optional If True, uses an AstropyPendingDeprecationWarning instead of an AstropyDeprecationWarning. obj_type : str, optional The type of this object, if the automatically determined one needs to be overridden. """ method_types = (classmethod, staticmethod, types.MethodType) def deprecate_doc(old_doc, message): """ Returns a given docstring with a deprecation message prepended to it. """ if not old_doc: old_doc = '' old_doc = textwrap.dedent(old_doc).strip('\n') new_doc = (('\n.. deprecated:: %(since)s' '\n %(message)s\n\n' % {'since': since, 'message': message.strip()}) + old_doc) if not old_doc: # This is to prevent a spurious 'unexpected unindent' warning from # docutils when the original docstring was blank. new_doc += r'\ ' return new_doc def get_function(func): """ Given a function or classmethod (or other function wrapper type), get the function object. """ if isinstance(func, method_types): func = func.__func__ return func def deprecate_function(func, message): """ Returns a wrapped function that displays an ``AstropyDeprecationWarning`` when it is called. """ if isinstance(func, method_types): func_wrapper = type(func) else: func_wrapper = lambda f: f func = get_function(func) def deprecated_func(*args, **kwargs): if pending: category = AstropyPendingDeprecationWarning else: category = AstropyDeprecationWarning warnings.warn(message, category, stacklevel=2) return func(*args, **kwargs) # If this is an extension function, we can't call # functools.wraps on it, but we normally don't care. # This crazy way to get the type of a wrapper descriptor is # straight out of the Python 3.3 inspect module docs. if type(func) != type(str.__dict__['__add__']): deprecated_func = functools.wraps(func)(deprecated_func) deprecated_func.__doc__ = deprecate_doc( deprecated_func.__doc__, message) return func_wrapper(deprecated_func) def deprecate_class(cls, message): """ Returns a wrapper class with the docstrings updated and an __init__ function that will issue an ``AstropyDeprecationWarning`` when called. """ # Creates a new class with the same name and bases as the # original class, but updates the dictionary with a new # docstring and a wrapped __init__ method. __module__ needs # to be manually copied over, since otherwise it will be set # to *this* module (astropy.utils.misc).
# This approach seems to make Sphinx happy (the new class # looks enough like the original class), and works with # extension classes (which functools.wraps does not, since # it tries to modify the original class). # We need to add a custom pickler or you'll get # Can't pickle : it's not found as ... # errors. Picklability is required for any class that is # documented by Sphinx. members = cls.__dict__.copy() members.update({ '__doc__': deprecate_doc(cls.__doc__, message), '__init__': deprecate_function(get_function(cls.__init__), message), }) return type(cls.__name__, cls.__bases__, members) def deprecate(obj, message=message, name=name, alternative=alternative, pending=pending): if obj_type is None: if isinstance(obj, type): obj_type_name = 'class' elif inspect.isfunction(obj): obj_type_name = 'function' elif inspect.ismethod(obj) or isinstance(obj, method_types): obj_type_name = 'method' else: obj_type_name = 'object' else: obj_type_name = obj_type if not name: name = get_function(obj).__name__ altmessage = '' if not message or type(message) == type(deprecate): if pending: message = ('The %(func)s %(obj_type)s will be deprecated in a ' 'future version.') else: message = ('The %(func)s %(obj_type)s is deprecated and may ' 'be removed in a future version.') if alternative: altmessage = '\n Use %s instead.' % alternative message = ((message % { 'func': name, 'name': name, 'alternative': alternative, 'obj_type': obj_type_name}) + altmessage) if isinstance(obj, type): return deprecate_class(obj, message) else: return deprecate_function(obj, message) if type(message) == type(deprecate): return deprecate(message) return deprecate def deprecated_attribute(name, since, message=None, alternative=None, pending=False): """ Used to mark a public attribute as deprecated. This creates a property that will warn when the given attribute name is accessed. To prevent the warning (i.e. for internal code), use the private name for the attribute by prepending an underscore (i.e. ``self._name``). Parameters ---------- name : str The name of the deprecated attribute. since : str The release at which this API became deprecated. This is required. message : str, optional Override the default deprecation message. The format specifier ``name`` may be used for the name of the attribute, and ``alternative`` may be used in the deprecation message to insert the name of an alternative to the deprecated function. alternative : str, optional An alternative attribute that the user may use in place of the deprecated attribute. The deprecation warning will tell the user about this alternative if provided. pending : bool, optional If True, uses a AstropyPendingDeprecationWarning instead of a AstropyDeprecationWarning. Examples -------- :: class MyClass: # Mark the old_name as deprecated old_name = misc.deprecated_attribute('old_name', '0.1') def method(self): self._old_name = 42 """ private_name = '_' + name @deprecated(since, name=name, obj_type='attribute') def get(self): return getattr(self, private_name) @deprecated(since, name=name, obj_type='attribute') def set(self, val): setattr(self, private_name, val) @deprecated(since, name=name, obj_type='attribute') def delete(self): delattr(self, private_name) return property(get, set, delete) def minversion(module, version, inclusive=True, version_path='__version__'): """ Returns `True` if the specified Python module satisfies a minimum version requirement, and `False` if not. By default this uses `pkg_resources.parse_version` to do the version comparison if available. 
Otherwise it falls back on `distutils.version.LooseVersion`. Parameters ---------- module : module or `str` An imported module of which to check the version, or the name of that module (in which case an import of that module is attempted-- if this fails `False` is returned). version : `str` The version as a string that this module must have at a minimum (e.g. ``'0.12'``). inclusive : `bool` The specified version meets the requirement inclusively (i.e. ``>=``) as opposed to strictly greater than (default: `True`). version_path : `str` A dotted attribute path to follow in the module for the version. Defaults to just ``'__version__'``, which should work for most Python modules. Examples -------- >>> import astropy >>> minversion(astropy, '0.4.4') True """ if isinstance(module, types.ModuleType): module_name = module.__name__ elif isinstance(module, string_types): module_name = module try: module = resolve_name(module_name) except ImportError: return False else: raise ValueError('module argument must be an actual imported ' 'module, or the import name of the module; ' 'got {0!r}'.format(module)) if '.' not in version_path: have_version = getattr(module, version_path) else: have_version = resolve_name('.'.join([module.__name__, version_path])) try: from pkg_resources import parse_version except ImportError: from distutils.version import LooseVersion as parse_version if inclusive: return parse_version(have_version) >= parse_version(version) else: return parse_version(have_version) > parse_version(version) # Copy of the classproperty decorator from astropy.utils.decorators class classproperty(property): """ Similar to `property`, but allows class-level properties. That is, a property whose getter is like a `classmethod`. The wrapped method may explicitly use the `classmethod` decorator (which must become before this decorator), or the `classmethod` may be omitted (it is implicit through use of this decorator). .. note:: classproperty only works for *read-only* properties. It does not currently allow writeable/deleteable properties, due to subtleties of how Python descriptors work. In order to implement such properties on a class a metaclass for that class must be implemented. Parameters ---------- fget : callable The function that computes the value of this property (in particular, the function when this is used as a decorator) a la `property`. doc : str, optional The docstring for the property--by default inherited from the getter function. lazy : bool, optional If True, caches the value returned by the first call to the getter function, so that it is only called once (used for lazy evaluation of an attribute). This is analogous to `lazyproperty`. The ``lazy`` argument can also be used when `classproperty` is used as a decorator (see the third example below). When used in the decorator syntax this *must* be passed in as a keyword argument. Examples -------- :: >>> class Foo(object): ... _bar_internal = 1 ... @classproperty ... def bar(cls): ... return cls._bar_internal + 1 ... >>> Foo.bar 2 >>> foo_instance = Foo() >>> foo_instance.bar 2 >>> foo_instance._bar_internal = 2 >>> foo_instance.bar # Ignores instance attributes 2 As previously noted, a `classproperty` is limited to implementing read-only attributes:: >>> class Foo(object): ... _bar_internal = 1 ... @classproperty ... def bar(cls): ... return cls._bar_internal ... @bar.setter ... def bar(cls, value): ... cls._bar_internal = value ... Traceback (most recent call last): ... 
NotImplementedError: classproperty can only be read-only; use a metaclass to implement modifiable class-level properties When the ``lazy`` option is used, the getter is only called once:: >>> class Foo(object): ... @classproperty(lazy=True) ... def bar(cls): ... print("Performing complicated calculation") ... return 1 ... >>> Foo.bar Performing complicated calculation 1 >>> Foo.bar 1 If a subclass inherits a lazy `classproperty` the property is still re-evaluated for the subclass:: >>> class FooSub(Foo): ... pass ... >>> FooSub.bar Performing complicated calculation 1 >>> FooSub.bar 1 """ def __new__(cls, fget=None, doc=None, lazy=False): if fget is None: # Being used as a decorator--return a wrapper that implements # decorator syntax def wrapper(func): return cls(func, lazy=lazy) return wrapper return super(classproperty, cls).__new__(cls) def __init__(self, fget, doc=None, lazy=False): self._lazy = lazy if lazy: self._cache = {} fget = self._wrap_fget(fget) super(classproperty, self).__init__(fget=fget, doc=doc) # There is a buglet in Python where self.__doc__ doesn't # get set properly on instances of property subclasses if # the doc argument was used rather than taking the docstring # from fget if doc is not None: self.__doc__ = doc def __get__(self, obj, objtype=None): if self._lazy and objtype in self._cache: return self._cache[objtype] if objtype is not None: # The base property.__get__ will just return self here; # instead we pass objtype through to the original wrapped # function (which takes the class as its sole argument) val = self.fget.__wrapped__(objtype) else: val = super(classproperty, self).__get__(obj, objtype=objtype) if self._lazy: if objtype is None: objtype = obj.__class__ self._cache[objtype] = val return val def getter(self, fget): return super(classproperty, self).getter(self._wrap_fget(fget)) def setter(self, fset): raise NotImplementedError( "classproperty can only be read-only; use a metaclass to " "implement modifiable class-level properties") def deleter(self, fdel): raise NotImplementedError( "classproperty can only be read-only; use a metaclass to " "implement modifiable class-level properties") @staticmethod def _wrap_fget(orig_fget): if isinstance(orig_fget, classmethod): orig_fget = orig_fget.__func__ # Using stock functools.wraps instead of the fancier version # found later in this module, which is overkill for this purpose @functools.wraps(orig_fget) def fget(obj): return orig_fget(obj.__class__) # Set the __wrapped__ attribute manually for support on Python 2 fget.__wrapped__ = orig_fget return fget def find_data_files(package, pattern): """ Include files matching ``pattern`` inside ``package``. Parameters ---------- package : str The package inside which to look for data files pattern : str Pattern (glob-style) to match for the data files (e.g. ``*.dat``). This supports the Python 3.5 ``**``recursive syntax. For example, ``**/*.fits`` matches all files ending with ``.fits`` recursively. Only one instance of ``**`` can be included in the pattern. 
""" if sys.version_info[:2] >= (3, 5): return glob.glob(os.path.join(package, pattern), recursive=True) else: if '**' in pattern: start, end = pattern.split('**') if end.startswith(('/', os.sep)): end = end[1:] matches = glob.glob(os.path.join(package, start, end)) for root, dirs, files in os.walk(os.path.join(package, start)): for dirname in dirs: matches += glob.glob(os.path.join(root, dirname, end)) return matches else: return glob.glob(os.path.join(package, pattern)) pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/0000755000076500000240000000000013434104632023061 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/conf.py0000644000076500000240000002721713434074306024375 0ustar weaverstaff00000000000000# -*- coding: utf-8 -*- # Licensed under a 3-clause BSD style license - see LICENSE.rst # # Astropy shared Sphinx settings. These settings are shared between # astropy itself and affiliated packages. # # Note that not all possible configuration values are present in this file. # # All configuration values have a default; values that are commented out # serve to show the default. import os import sys import warnings from os import path import sphinx from distutils.version import LooseVersion # -- General configuration ---------------------------------------------------- # The version check in Sphinx itself can only compare the major and # minor parts of the version number, not the micro. To do a more # specific version check, call check_sphinx_version("x.y.z.") from # your project's conf.py needs_sphinx = '1.3' on_rtd = os.environ.get('READTHEDOCS', None) == 'True' def check_sphinx_version(expected_version): sphinx_version = LooseVersion(sphinx.__version__) expected_version = LooseVersion(expected_version) if sphinx_version < expected_version: raise RuntimeError( "At least Sphinx version {0} is required to build this " "documentation. Found {1}.".format( expected_version, sphinx_version)) # Configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = { 'python': ('https://docs.python.org/3/', (None, 'http://data.astropy.org/intersphinx/python3.inv')), 'pythonloc': ('http://docs.python.org/', path.abspath(path.join(path.dirname(__file__), 'local/python3_local_links.inv'))), 'numpy': ('https://docs.scipy.org/doc/numpy/', (None, 'http://data.astropy.org/intersphinx/numpy.inv')), 'scipy': ('https://docs.scipy.org/doc/scipy/reference/', (None, 'http://data.astropy.org/intersphinx/scipy.inv')), 'matplotlib': ('http://matplotlib.org/', (None, 'http://data.astropy.org/intersphinx/matplotlib.inv')), 'astropy': ('http://docs.astropy.org/en/stable/', None), 'h5py': ('http://docs.h5py.org/en/stable/', None)} if sys.version_info[0] == 2: intersphinx_mapping['python'] = ( 'https://docs.python.org/2/', (None, 'http://data.astropy.org/intersphinx/python2.inv')) intersphinx_mapping['pythonloc'] = ( 'http://docs.python.org/', path.abspath(path.join(path.dirname(__file__), 'local/python2_local_links.inv'))) # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # Add any paths that contain templates here, relative to this directory. # templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # The reST default role (used for this markup: `text`) to use for all # documents. Set to the "smart" one. 
default_role = 'obj' # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # This is added to the end of RST files - a good place to put substitutions to # be used globally. rst_epilog = """ .. _Astropy: http://astropy.org """ # A list of warning types to suppress arbitrary warning messages. We mean to # override directives in astropy_helpers.sphinx.ext.autodoc_enhancements, # thus need to ignore those warning. This can be removed once the patch gets # released in upstream Sphinx (https://github.com/sphinx-doc/sphinx/pull/1843). # Suppress the warnings requires Sphinx v1.4.2 suppress_warnings = ['app.add_directive', ] # -- Project information ------------------------------------------------------ # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. #pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Settings for extensions and extension options ---------------------------- # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'sphinx.ext.todo', 'sphinx.ext.coverage', 'sphinx.ext.inheritance_diagram', 'sphinx.ext.viewcode', 'astropy_helpers.extern.numpydoc', 'astropy_helpers.extern.automodapi.automodapi', 'astropy_helpers.extern.automodapi.smart_resolver', 'astropy_helpers.sphinx.ext.tocdepthfix', 'astropy_helpers.sphinx.ext.doctest', 'astropy_helpers.sphinx.ext.changelog_links'] if not on_rtd and LooseVersion(sphinx.__version__) < LooseVersion('1.4'): extensions.append('sphinx.ext.pngmath') else: extensions.append('sphinx.ext.mathjax') try: import matplotlib.sphinxext.plot_directive extensions += [matplotlib.sphinxext.plot_directive.__name__] # AttributeError is checked here in case matplotlib is installed but # Sphinx isn't. Note that this module is imported by the config file # generator, even if we're not building the docs. except (ImportError, AttributeError): warnings.warn( "matplotlib's plot_directive could not be imported. 
" + "Inline plots will not be included in the output") # Don't show summaries of the members in each class along with the # class' docstring numpydoc_show_class_members = False autosummary_generate = True automodapi_toctreedirnm = 'api' # Class documentation should contain *both* the class docstring and # the __init__ docstring autoclass_content = "both" # Render inheritance diagrams in SVG graphviz_output_format = "svg" graphviz_dot_args = [ '-Nfontsize=10', '-Nfontname=Helvetica Neue, Helvetica, Arial, sans-serif', '-Efontsize=10', '-Efontname=Helvetica Neue, Helvetica, Arial, sans-serif', '-Gfontsize=10', '-Gfontname=Helvetica Neue, Helvetica, Arial, sans-serif' ] # -- Options for HTML output ------------------------------------------------- # Add any paths that contain custom themes here, relative to this directory. html_theme_path = [path.abspath(path.join(path.dirname(__file__), 'themes'))] # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'bootstrap-astropy' # Custom sidebar templates, maps document names to template names. html_sidebars = { '**': ['localtoc.html'], 'search': [], 'genindex': [], 'py-modindex': [], } # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. # included in the bootstrap-astropy theme html_favicon = path.join(html_theme_path[0], html_theme, 'static', 'astropy_logo.ico') # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. html_last_updated_fmt = '%d %b %Y' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # -- Options for LaTeX output ------------------------------------------------ # The paper size ('letter' or 'a4'). #latex_paper_size = 'letter' # The font size ('10pt', '11pt' or '12pt'). #latex_font_size = '10pt' # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. latex_toplevel_sectioning = 'part' # If true, show page references after internal links. 
#latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False latex_elements = {} # Additional stuff for the LaTeX preamble. latex_elements['preamble'] = r""" % Use a more modern-looking monospace font \usepackage{inconsolata} % The enumitem package provides unlimited nesting of lists and enums. % Sphinx may use this in the future, in which case this can be removed. % See https://bitbucket.org/birkenfeld/sphinx/issue/777/latex-output-too-deeply-nested \usepackage{enumitem} \setlistdepth{15} % In the parameters section, place a newline after the Parameters % header. (This is stolen directly from Numpy's conf.py, since it % affects Numpy-style docstrings). \usepackage{expdlist} \let\latexdescription=\description \def\description{\latexdescription{}{} \breaklabel} % Support the superscript Unicode numbers used by the "unicode" units % formatter \DeclareUnicodeCharacter{2070}{\ensuremath{^0}} \DeclareUnicodeCharacter{00B9}{\ensuremath{^1}} \DeclareUnicodeCharacter{00B2}{\ensuremath{^2}} \DeclareUnicodeCharacter{00B3}{\ensuremath{^3}} \DeclareUnicodeCharacter{2074}{\ensuremath{^4}} \DeclareUnicodeCharacter{2075}{\ensuremath{^5}} \DeclareUnicodeCharacter{2076}{\ensuremath{^6}} \DeclareUnicodeCharacter{2077}{\ensuremath{^7}} \DeclareUnicodeCharacter{2078}{\ensuremath{^8}} \DeclareUnicodeCharacter{2079}{\ensuremath{^9}} \DeclareUnicodeCharacter{207B}{\ensuremath{^-}} \DeclareUnicodeCharacter{00B0}{\ensuremath{^{\circ}}} \DeclareUnicodeCharacter{2032}{\ensuremath{^{\prime}}} \DeclareUnicodeCharacter{2033}{\ensuremath{^{\prime\prime}}} % Make the "warning" and "notes" sections use a sans-serif font to % make them stand out more. \renewenvironment{notice}[2]{ \def\py@noticetype{#1} \csname py@noticestart@#1\endcsname \textsf{\textbf{#2}} }{\csname py@noticeend@\py@noticetype\endcsname} """ # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # -- Options for the linkcheck builder ---------------------------------------- # A timeout value, in seconds, for the linkcheck builder linkcheck_timeout = 60 pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/0000755000076500000240000000000013434104632023661 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/changelog_links.py0000644000076500000240000000554313434074306027375 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This sphinx extension makes the issue numbers in the changelog into links to GitHub issues. """ from __future__ import print_function import re from docutils.nodes import Text, reference BLOCK_PATTERN = re.compile('\[#.+\]', flags=re.DOTALL) ISSUE_PATTERN = re.compile('#[0-9]+') def process_changelog_links(app, doctree, docname): for rex in app.changelog_links_rexes: if rex.match(docname): break else: # if the doc doesn't match any of the changelog regexes, don't process return app.info('[changelog_links] Adding changelog links to "{0}"'.format(docname)) for item in doctree.traverse(): if not isinstance(item, Text): continue # We build a new list of items to replace the current item. If # a link is found, we need to use a 'reference' item. children = [] # First cycle through blocks of issues (delimited by []) then # iterate inside each one to find the individual issues. 
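# For example, a changelog entry such as "Fixed foo [#123, #456]" yields one
# BLOCK_PATTERN match, "[#123, #456]", inside which ISSUE_PATTERN finds
# "#123" and "#456"; each issue number is replaced below by a docutils
# reference node whose target is github_issues_url plus the number with the
# leading "#" stripped.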
prev_block_end = 0 for block in BLOCK_PATTERN.finditer(item): block_start, block_end = block.start(), block.end() children.append(Text(item[prev_block_end:block_start])) block = item[block_start:block_end] prev_end = 0 for m in ISSUE_PATTERN.finditer(block): start, end = m.start(), m.end() children.append(Text(block[prev_end:start])) issue_number = block[start:end] refuri = app.config.github_issues_url + issue_number[1:] children.append(reference(text=issue_number, name=issue_number, refuri=refuri)) prev_end = end prev_block_end = block_end # If no issues were found, this adds the whole item, # otherwise it adds the remaining text. children.append(Text(block[prev_end:block_end])) # If no blocks were found, this adds the whole item, otherwise # it adds the remaining text. children.append(Text(item[prev_block_end:])) # Replace item by the new list of items we have generated, # which may contain links. item.parent.replace(item, children) def setup_patterns_rexes(app): app.changelog_links_rexes = [re.compile(pat) for pat in app.config.changelog_links_docpattern] def setup(app): app.connect('doctree-resolved', process_changelog_links) app.connect('builder-inited', setup_patterns_rexes) app.add_config_value('github_issues_url', None, True) app.add_config_value('changelog_links_docpattern', ['.*changelog.*', 'whatsnew/.*'], True) return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/edit_on_github.py0000644000076500000240000001346413434074306027232 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This extension makes it easy to edit documentation on github. It adds links associated with each docstring that go to the corresponding view source page on Github. From there, the user can push the "Edit" button, edit the docstring, and submit a pull request. It has the following configuration options (to be set in the project's ``conf.py``): * ``edit_on_github_project`` The name of the github project, in the form "username/projectname". * ``edit_on_github_branch`` The name of the branch to edit. If this is a released version, this should be a git tag referring to that version. For a dev version, it often makes sense for it to be "master". It may also be a git hash. * ``edit_on_github_source_root`` The location within the source tree of the root of the Python package. Defaults to "lib". * ``edit_on_github_doc_root`` The location within the source tree of the root of the documentation source. Defaults to "doc", but it may make sense to set it to "doc/source" if the project uses a separate source directory. * ``edit_on_github_docstring_message`` The phrase displayed in the links to edit a docstring. Defaults to "[edit on github]". * ``edit_on_github_page_message`` The phrase displayed in the links to edit a RST page. Defaults to "[edit this page on github]". * ``edit_on_github_help_message`` The phrase displayed as a tooltip on the edit links. Defaults to "Push the Edit button on the next page" * ``edit_on_github_skip_regex`` When the path to the .rst file matches this regular expression, no "edit this page on github" link will be added. Defaults to ``"_.*"``. """ import inspect import os import re import sys from docutils import nodes from sphinx import addnodes def import_object(modname, name): """ Import the object given by *modname* and *name* and return it. If not found, or the import fails, returns None. 
""" try: __import__(modname) mod = sys.modules[modname] obj = mod for part in name.split('.'): obj = getattr(obj, part) return obj except: return None def get_url_base(app): return 'http://github.com/%s/tree/%s/' % ( app.config.edit_on_github_project, app.config.edit_on_github_branch) def doctree_read(app, doctree): # Get the configuration parameters if app.config.edit_on_github_project == 'REQUIRED': raise ValueError( "The edit_on_github_project configuration variable must be " "provided in the conf.py") source_root = app.config.edit_on_github_source_root url = get_url_base(app) docstring_message = app.config.edit_on_github_docstring_message # Handle the docstring-editing links for objnode in doctree.traverse(addnodes.desc): if objnode.get('domain') != 'py': continue names = set() for signode in objnode: if not isinstance(signode, addnodes.desc_signature): continue modname = signode.get('module') if not modname: continue fullname = signode.get('fullname') if fullname in names: # only one link per name, please continue names.add(fullname) obj = import_object(modname, fullname) anchor = None if obj is not None: try: lines, lineno = inspect.getsourcelines(obj) except: pass else: anchor = '#L%d' % lineno if anchor: real_modname = inspect.getmodule(obj).__name__ path = '%s%s%s.py%s' % ( url, source_root, real_modname.replace('.', '/'), anchor) onlynode = addnodes.only(expr='html') onlynode += nodes.reference( reftitle=app.config.edit_on_github_help_message, refuri=path) onlynode[0] += nodes.inline( '', '', nodes.raw('', ' ', format='html'), nodes.Text(docstring_message), classes=['edit-on-github', 'viewcode-link']) signode += onlynode def html_page_context(app, pagename, templatename, context, doctree): if (templatename == 'page.html' and not re.match(app.config.edit_on_github_skip_regex, pagename)): doc_root = app.config.edit_on_github_doc_root if doc_root != '' and not doc_root.endswith('/'): doc_root += '/' doc_path = os.path.relpath(doctree.get('source'), app.builder.srcdir) url = get_url_base(app) page_message = app.config.edit_on_github_page_message context['edit_on_github'] = url + doc_root + doc_path context['edit_on_github_page_message'] = page_message def setup(app): app.add_config_value('edit_on_github_project', 'REQUIRED', True) app.add_config_value('edit_on_github_branch', 'master', True) app.add_config_value('edit_on_github_source_root', 'lib', True) app.add_config_value('edit_on_github_doc_root', 'doc', True) app.add_config_value('edit_on_github_docstring_message', '[edit on github]', True) app.add_config_value('edit_on_github_page_message', 'Edit This Page on Github', True) app.add_config_value('edit_on_github_help_message', 'Push the Edit button on the next page', True) app.add_config_value('edit_on_github_skip_regex', '_.*', True) app.connect('doctree-read', doctree_read) app.connect('html-page-context', html_page_context) return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/tests/0000755000076500000240000000000013434104632025023 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/tests/__init__.py0000644000076500000240000000000013434074306027126 0ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/__init__.py0000644000076500000240000000010213434074306025767 0ustar weaverstaff00000000000000from __future__ import division, absolute_import, print_function 
pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/tocdepthfix.py0000644000076500000240000000137013434074306026561 0ustar weaverstaff00000000000000from sphinx import addnodes def fix_toc_entries(app, doctree): # Get the docname; I don't know why this isn't just passed in to the # callback # This seems a bit unreliable as it's undocumented, but it's not "private" # either: docname = app.builder.env.temp_data['docname'] if app.builder.env.metadata[docname].get('tocdepth', 0) != 0: # We need to reprocess any TOC nodes in the doctree and make sure all # the files listed in any TOCs are noted for treenode in doctree.traverse(addnodes.toctree): app.builder.env.note_toctree(docname, treenode) def setup(app): app.connect('doctree-read', fix_toc_entries) return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/ext/doctest.py0000644000076500000240000000364113434074306025710 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This is a set of three directives that allow us to insert metadata about doctests into the .rst files so the testing framework knows which tests to skip. This is quite different from the doctest extension in Sphinx itself, which actually does something. For astropy, all of the testing is centrally managed from py.test and Sphinx is not used for running tests. """ import re from docutils.nodes import literal_block from docutils.parsers.rst import Directive class DoctestSkipDirective(Directive): has_content = True def run(self): # Check if there is any valid argument, and skip it. Currently only # 'win32' is supported in astropy.tests.pytest_plugins. if re.match('win32', self.content[0]): self.content = self.content[2:] code = '\n'.join(self.content) return [literal_block(code, code)] class DoctestOmitDirective(Directive): has_content = True def run(self): # Simply do not add any content when this directive is encountered return [] class DoctestRequiresDirective(DoctestSkipDirective): # This is silly, but we really support an unbounded number of # optional arguments optional_arguments = 64 def setup(app): app.add_directive('doctest-requires', DoctestRequiresDirective) app.add_directive('doctest-skip', DoctestSkipDirective) app.add_directive('doctest-skip-all', DoctestSkipDirective) app.add_directive('doctest', DoctestSkipDirective) # Code blocks that use this directive will not appear in the generated # documentation. This is intended to hide boilerplate code that is only # useful for testing documentation using doctest, but does not actually # belong in the documentation itself. app.add_directive('testsetup', DoctestOmitDirective) return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/__init__.py0000644000076500000240000000066513434074306025205 0ustar weaverstaff00000000000000""" This package contains utilities and extensions for the Astropy sphinx documentation. In particular, the `astropy.sphinx.conf` should be imported by the sphinx ``conf.py`` file for affiliated packages that wish to make use of the Astropy documentation format. Note that some sphinx extensions which are bundled as-is (numpydoc and sphinx-automodapi) are included in astropy_helpers.extern rather than astropy_helpers.sphinx.ext. 
""" pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/local/0000755000076500000240000000000013434104632024153 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/local/python2_local_links.inv0000644000076500000240000000106213434074306030651 0ustar weaverstaff00000000000000# Sphinx inventory version 2 # Project: Python # Version: 2.7 and 3.5 # The remainder of this file should be compressed using zlib. x=O0@w Z!nU bw+1rpKïIiQeI˽w8g"Wf ʬxK%lS(ϭ1 k&Qrp)ɐ.Bi۠3H]a)_ZI>dH, _M_"撠bvIzЀ82b;!I,=Wh_'l!Q%^B#Ô }inuD#e³\:{tu;/wxy. !nX{0BzoH /LxA&UXS{⮸5ߣ\RBiJF?pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/0000755000076500000240000000000013434104632024346 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/0000755000076500000240000000000013434104632030062 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/globaltoc.html0000644000076500000240000000011113434074306032713 0ustar weaverstaff00000000000000

Table of Contents

{{ toctree(maxdepth=-1, titles_only=true) }} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/layout.html0000644000076500000240000000655113434074306032300 0ustar weaverstaff00000000000000{% extends "basic/layout.html" %} {# Collapsible sidebar script from default/layout.html in Sphinx #} {% set script_files = script_files + ['_static/sidebar.js'] %} {# Add the google webfonts needed for the logo #} {% block extrahead %} {% if not embedded %}{% endif %} {% endblock %} {% block header %}
{{ theme_logotext1 }}{{ theme_logotext2 }}{{ theme_logotext3 }}
  • Index
  • Modules
  • {% block sidebarsearch %} {% include "searchbox.html" %} {% endblock %}
{% endblock %} {% block relbar1 %} {% endblock %} {# Silence the bottom relbar. #} {% block relbar2 %}{% endblock %} {%- block footer %}

{%- if edit_on_github %} {{ edit_on_github_page_message }}   {%- endif %} {%- if show_source and has_source and sourcename %} {{ _('Page Source') }} {%- endif %}   Back to Top

{%- if show_copyright %} {%- if hasdoc('copyright') %} {% trans path=pathto('copyright'), copyright=copyright|e %}© Copyright {{ copyright }}.{% endtrans %}
{%- else %} {% trans copyright=copyright|e %}© Copyright {{ copyright }}.{% endtrans %}
{%- endif %} {%- endif %} {%- if show_sphinx %} {% trans sphinx_version=sphinx_version|e %}Created using Sphinx {{ sphinx_version }}.{% endtrans %}   {%- endif %} {%- if last_updated %} {% trans last_updated=last_updated|e %}Last built {{ last_updated }}.{% endtrans %}
{%- endif %}

{%- endblock %} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/theme.conf0000644000076500000240000000030013434074306032030 0ustar weaverstaff00000000000000# AstroPy theme based on Twitter Bootstrap CSS [theme] inherit = basic stylesheet = bootstrap-astropy.css pygments_style = sphinx [options] logotext1 = astro logotext2 = py logotext3 = :docs pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/0000755000076500000240000000000013434104632031351 5ustar weaverstaff00000000000000././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/bootstrap-astropy.csspydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/bootstrap-astropy.0000644000076500000240000002744313434074306035104 0ustar weaverstaff00000000000000/*! * Bootstrap v1.4.0 * * Copyright 2011 Twitter, Inc * Licensed under the Apache License v2.0 * http://www.apache.org/licenses/LICENSE-2.0 * * Heavily modified by Kyle Barbary for the AstroPy Project for use with Sphinx. */ @import url("basic.css"); body { background-color: #ffffff; margin: 0; font-family: "Helvetica Neue", Helvetica, Arial, sans-serif; font-size: 13px; font-weight: normal; line-height: 18px; color: #404040; } /* Hyperlinks ----------------------------------------------------------------*/ a { color: #0069d6; text-decoration: none; line-height: inherit; font-weight: inherit; } a:hover { color: #00438a; text-decoration: underline; } /* Typography ----------------------------------------------------------------*/ h1,h2,h3,h4,h5,h6 { color: #404040; margin: 0.7em 0 0 0; line-height: 1.5em; } h1 { font-size: 24px; margin: 0; } h2 { font-size: 21px; line-height: 1.2em; margin: 1em 0 0.5em 0; border-bottom: 1px solid #404040; } h3 { font-size: 18px; } h4 { font-size: 16px; } h5 { font-size: 14px; } h6 { font-size: 13px; text-transform: uppercase; } p { font-size: 13px; font-weight: normal; line-height: 18px; margin-top: 0px; margin-bottom: 9px; } ul, ol { margin-left: 0; padding: 0 0 0 25px; } ul ul, ul ol, ol ol, ol ul { margin-bottom: 0; } ul { list-style: disc; } ol { list-style: decimal; } li { line-height: 18px; color: #404040; } ul.unstyled { list-style: none; margin-left: 0; } dl { margin-bottom: 18px; } dl dt, dl dd { line-height: 18px; } dl dd { margin-left: 9px; } hr { margin: 20px 0 19px; border: 0; border-bottom: 1px solid #eee; } strong { font-style: inherit; font-weight: bold; } em { font-style: italic; font-weight: inherit; line-height: inherit; } .muted { color: #bfbfbf; } address { display: block; line-height: 18px; margin-bottom: 18px; } code, pre { padding: 0 3px 2px; font-family: monospace; -webkit-border-radius: 3px; -moz-border-radius: 3px; border-radius: 3px; } tt { font-family: monospace; } code { padding: 1px 3px; } pre { display: block; padding: 8.5px; margin: 0 0 18px; line-height: 18px; border: 1px solid #ddd; border: 1px solid rgba(0, 0, 0, 0.12); -webkit-border-radius: 3px; -moz-border-radius: 3px; border-radius: 3px; white-space: pre; word-wrap: break-word; } img { margin: 9px 0; } /* format inline code with a rounded box */ tt, code { margin: 0 2px; padding: 0 5px; border: 1px solid #ddd; border: 1px solid rgba(0, 0, 0, 0.12); border-radius: 3px; } code.xref, a code { margin: 0; padding: 0 1px 0 1px; background-color: none; border: none; } /* all code has same box background color, even in headers */ h1 tt, h2 tt, h3 tt, h4 tt, h5 tt, h6 tt, h1 code, h2 
code, h3 code, h4 code, h5 code, h6 code, pre, code, tt { background-color: #f8f8f8; } /* override box for links & other sphinx-specifc stuff */ tt.xref, a tt, tt.descname, tt.descclassname { padding: 0 1px 0 1px; border: none; } /* override box for related bar at the top of the page */ .related tt { border: none; padding: 0 1px 0 1px; background-color: transparent; font-weight: bold; } th { background-color: #dddddd; } .viewcode-back { font-family: sans-serif; } div.viewcode-block:target { background-color: #f4debf; border-top: 1px solid #ac9; border-bottom: 1px solid #ac9; } table.docutils { border-spacing: 5px; border-collapse: separate; } /* Topbar --------------------------------------------------------------------*/ div.topbar { height: 40px; position: absolute; top: 0; left: 0; right: 0; z-index: 10000; padding: 0px 10px; background-color: #222; background-color: #222222; background-repeat: repeat-x; background-image: -khtml-gradient(linear, left top, left bottom, from(#333333), to(#222222)); background-image: -moz-linear-gradient(top, #333333, #222222); background-image: -ms-linear-gradient(top, #333333, #222222); background-image: -webkit-gradient(linear, left top, left bottom, color-stop(0%, #333333), color-stop(100%, #222222)); background-image: -webkit-linear-gradient(top, #333333, #222222); background-image: -o-linear-gradient(top, #333333, #222222); background-image: linear-gradient(top, #333333, #222222); filter: progid:DXImageTransform.Microsoft.gradient(startColorstr='#333333', endColorstr='#222222', GradientType=0); overflow: auto; } div.topbar a.brand { font-family: 'Source Sans Pro', sans-serif; font-size: 26px; color: #ffffff; font-weight: 600; text-decoration: none; float: left; display: block; height: 32px; padding: 8px 12px 0px 45px; margin-left: -10px; background: transparent url("astropy_logo_32.png") no-repeat 10px 4px; background-image: url("astropy_logo.svg"), none; background-size: 32px 32px; } #logotext1 { } #logotext2 { font-weight:200; color: #ff5000; } #logotext3 { font-weight:200; } div.topbar .brand:hover, div.topbar ul li a.homelink:hover { background-color: #333; background-color: rgba(255, 255, 255, 0.05); } div.topbar ul { font-size: 110%; list-style: none; margin: 0; padding: 0 0 0 10px; float: right; color: #bfbfbf; text-align: center; text-decoration: none; height: 100%; } div.topbar ul li { float: left; display: inline; height: 30px; margin: 5px; padding: 0px; } div.topbar ul li a { color: #bfbfbf; text-decoration: none; padding: 5px; display: block; height: auto; text-align: center; vertical-align: middle; border-radius: 4px; } div.topbar ul li a:hover { color: #ffffff; text-decoration: none; } div.topbar ul li a.homelink { width: 112px; display: block; height: 20px; padding: 5px 0px; background: transparent url("astropy_linkout_20.png") no-repeat 10px 5px; background-image: url("astropy_linkout.svg"), none; background-size: 91px 20px; } div.topbar form { text-align: left; margin: 0 0 0 5px; position: relative; filter: alpha(opacity=100); -khtml-opacity: 1; -moz-opacity: 1; opacity: 1; } div.topbar input { background-color: #444; background-color: rgba(255, 255, 255, 0.3); font-family: "Helvetica Neue", Helvetica, Arial, sans-serif; font-size: normal; font-weight: 13px; line-height: 1; padding: 4px 9px; color: #ffffff; color: rgba(255, 255, 255, 0.75); border: 1px solid #111; -webkit-border-radius: 4px; -moz-border-radius: 4px; border-radius: 4px; -webkit-box-shadow: inset 0 1px 2px rgba(0, 0, 0, 0.1), 0 1px 0px rgba(255, 255, 255, 0.25); 
-moz-box-shadow: inset 0 1px 2px rgba(0, 0, 0, 0.1), 0 1px 0px rgba(255, 255, 255, 0.25); box-shadow: inset 0 1px 2px rgba(0, 0, 0, 0.1), 0 1px 0px rgba(255, 255, 255, 0.25); -webkit-transition: none; -moz-transition: none; -ms-transition: none; -o-transition: none; transition: none; } div.topbar input:-moz-placeholder { color: #e6e6e6; } div.topbar input::-webkit-input-placeholder { color: #e6e6e6; } div.topbar input:hover { background-color: #bfbfbf; background-color: rgba(255, 255, 255, 0.5); color: #ffffff; } div.topbar input:focus, div.topbar input.focused { outline: 0; background-color: #ffffff; color: #404040; text-shadow: 0 1px 0 #ffffff; border: 0; padding: 5px 10px; -webkit-box-shadow: 0 0 3px rgba(0, 0, 0, 0.15); -moz-box-shadow: 0 0 3px rgba(0, 0, 0, 0.15); box-shadow: 0 0 3px rgba(0, 0, 0, 0.15); } /* Relation bar (breadcrumbs, prev, next) ------------------------------------*/ div.related { height: 21px; width: auto; margin: 0 10px; position: absolute; top: 42px; clear: both; left: 0; right: 0; z-index: 10000; font-size: 100%; vertical-align: middle; background-color: #fff; border-bottom: 1px solid #bbb; } div.related ul { padding: 0; margin: 0; } /* Footer --------------------------------------------------------------------*/ footer { display: block; margin: 10px 10px 0px; padding: 10px 0 0 0; border-top: 1px solid #bbb; } .pull-right { float: right; width: 30em; text-align: right; } /* Sphinx sidebar ------------------------------------------------------------*/ div.sphinxsidebar { font-size: inherit; border-radius: 3px; background-color: #eee; border: 1px solid #bbb; word-wrap: break-word; /* overflow-wrap is the canonical name for word-wrap in the CSS3 text draft. We include it here mainly for future-proofing. */ overflow-wrap: break-word; } div.sphinxsidebarwrapper { padding: 0px 0px 0px 5px; } div.sphinxsidebar h3 { font-family: 'Trebuchet MS', sans-serif; font-size: 1.4em; font-weight: normal; margin: 5px 0px 0px 5px; padding: 0; line-height: 1.6em; } div.sphinxsidebar h4 { font-family: 'Trebuchet MS', sans-serif; font-size: 1.3em; font-weight: normal; margin: 5px 0 0 0; padding: 0; } div.sphinxsidebar p { } div.sphinxsidebar p.topless { margin: 5px 10px 10px 10px; } div.sphinxsidebar ul { margin: 0px 0px 0px 5px; padding: 0; } div.sphinxsidebar ul ul { margin-left: 15px; list-style-type: disc; } /* If showing the global TOC (toctree), color the current page differently */ div.sphinxsidebar a.current { color: #404040; } div.sphinxsidebar a.current:hover { color: #404040; } /* document, documentwrapper, body, bodywrapper ----------------------------- */ div.document { margin-top: 72px; margin-left: 10px; margin-right: 10px; } div.documentwrapper { float: left; width: 100%; } div.body { background-color: #ffffff; padding: 0 0 0px 20px; } div.bodywrapper { margin: 0 0 0 230px; max-width: 55em; } /* Header links ------------------------------------------------------------- */ a.headerlink { font-size: 0.8em; padding: 0 4px 0 4px; text-decoration: none; } a.headerlink:hover { background-color: #0069d6; color: white; text-docoration: none; } /* Admonitions and warnings ------------------------------------------------- */ /* Shared by admonitions and warnings */ div.admonition, div.warning { padding: 0px; border-radius: 3px; -moz-border-radius: 3px; -webkit-border-radius: 3px; } div.admonition p, div.warning p { margin: 0.5em 1em 0.5em 1em; padding: 0; } div.admonition pre, div.warning pre { margin: 0.4em 1em 0.4em 1em; } div.admonition p.admonition-title, div.warning 
p.admonition-title { margin: 0; padding: 0.1em 0 0.1em 0.5em; color: white; font-weight: bold; font-size: 1.1em; } div.admonition ul, div.admonition ol, div.warning ul, div.warning ol { margin: 0.1em 0.5em 0.5em 3em; padding: 0; } /* Admonitions only */ div.admonition { border: 1px solid #609060; background-color: #e9ffe9; } div.admonition p.admonition-title { background-color: #70A070; } /* Warnings only */ div.warning { border: 1px solid #900000; background-color: #ffe9e9; } div.warning p.admonition-title { background-color: #b04040; } /* Figures ------------------------------------------------------------------ */ .figure.align-center { clear: none; } /* This is a div for containing multiple figures side-by-side, for use with * .. container:: figures */ div.figures { border: 1px solid #CCCCCC; background-color: #F8F8F8; margin: 1em; text-align: center; } div.figures .figure { clear: none; float: none; display: inline-block; border: none; margin-left: 0.5em; margin-right: 0.5em; } .field-list th { white-space: nowrap; } table.field-list { border-spacing: 0px; margin-left: 1px; border-left: 5px solid rgb(238, 238, 238) !important; } table.field-list th.field-name { display: inline-block; padding: 1px 8px 1px 5px; white-space: nowrap; background-color: rgb(238, 238, 238); border-radius: 0 3px 3px 0; -webkit-border-radius: 0 3px 3px 0; } pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_logo.svg0000644000076500000240000001103213434074306034614 0ustar weaverstaff00000000000000 pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/sidebar.js0000644000076500000240000001155313434074306033331 0ustar weaverstaff00000000000000/* * sidebar.js * ~~~~~~~~~~ * * This script makes the Sphinx sidebar collapsible. * * .sphinxsidebar contains .sphinxsidebarwrapper. This script adds * in .sphixsidebar, after .sphinxsidebarwrapper, the #sidebarbutton * used to collapse and expand the sidebar. * * When the sidebar is collapsed the .sphinxsidebarwrapper is hidden * and the width of the sidebar and the margin-left of the document * are decreased. When the sidebar is expanded the opposite happens. * This script saves a per-browser/per-session cookie used to * remember the position of the sidebar among the pages. * Once the browser is closed the cookie is deleted and the position * reset to the default (expanded). * * :copyright: Copyright 2007-2011 by the Sphinx team, see AUTHORS. * :license: BSD, see LICENSE for details. * */ $(function() { // global elements used by the functions. 
// the 'sidebarbutton' element is defined as global after its // creation, in the add_sidebar_button function var bodywrapper = $('.bodywrapper'); var sidebar = $('.sphinxsidebar'); var sidebarwrapper = $('.sphinxsidebarwrapper'); // for some reason, the document has no sidebar; do not run into errors if (!sidebar.length) return; // original margin-left of the bodywrapper and width of the sidebar // with the sidebar expanded var bw_margin_expanded = bodywrapper.css('margin-left'); var ssb_width_expanded = sidebar.width(); // margin-left of the bodywrapper and width of the sidebar // with the sidebar collapsed var bw_margin_collapsed = 12; var ssb_width_collapsed = 12; // custom colors var dark_color = '#404040'; var light_color = '#505050'; function sidebar_is_collapsed() { return sidebarwrapper.is(':not(:visible)'); } function toggle_sidebar() { if (sidebar_is_collapsed()) expand_sidebar(); else collapse_sidebar(); } function collapse_sidebar() { sidebarwrapper.hide(); sidebar.css('width', ssb_width_collapsed); bodywrapper.css('margin-left', bw_margin_collapsed); sidebarbutton.css({ 'margin-left': '-1px', 'height': bodywrapper.height(), 'border-radius': '3px' }); sidebarbutton.find('span').text('»'); sidebarbutton.attr('title', _('Expand sidebar')); document.cookie = 'sidebar=collapsed'; } function expand_sidebar() { bodywrapper.css('margin-left', bw_margin_expanded); sidebar.css('width', ssb_width_expanded); sidebarwrapper.show(); sidebarbutton.css({ 'margin-left': ssb_width_expanded - 12, 'height': bodywrapper.height(), 'border-radius': '0px 3px 3px 0px' }); sidebarbutton.find('span').text('«'); sidebarbutton.attr('title', _('Collapse sidebar')); document.cookie = 'sidebar=expanded'; } function add_sidebar_button() { sidebarwrapper.css({ 'float': 'left', 'margin-right': '0', 'width': ssb_width_expanded - 18 }); // create the button sidebar.append('
<div id="sidebarbutton"><span>«</span></div>
'); var sidebarbutton = $('#sidebarbutton'); // find the height of the viewport to center the '<<' in the page var viewport_height; if (window.innerHeight) viewport_height = window.innerHeight; else viewport_height = $(window).height(); var sidebar_offset = sidebar.offset().top; var sidebar_height = Math.max(bodywrapper.height(), sidebar.height()); sidebarbutton.find('span').css({ 'font-family': '"Lucida Grande",Arial,sans-serif', 'display': 'block', 'top': Math.min(viewport_height/2, sidebar_height/2 + sidebar_offset) - 10, 'width': 12, 'position': 'fixed', 'text-align': 'center' }); sidebarbutton.click(toggle_sidebar); sidebarbutton.attr('title', _('Collapse sidebar')); sidebarbutton.css({ 'color': '#FFFFFF', 'background-color': light_color, 'border': '1px solid ' + light_color, 'border-radius': '0px 3px 3px 0px', 'font-size': '1.2em', 'cursor': 'pointer', 'height': sidebar_height, 'padding-top': '1px', 'margin': '-1px', 'margin-left': ssb_width_expanded - 12 }); sidebarbutton.hover( function () { $(this).css('background-color', dark_color); }, function () { $(this).css('background-color', light_color); } ); } function set_position_from_cookie() { if (!document.cookie) return; var items = document.cookie.split(';'); for(var k=0; k>>] button on the top-right corner of code samples to hide * the >>> and ... prompts and the output and thus make the code * copyable. */ var div = $('.highlight-python .highlight,' + '.highlight-python3 .highlight,' + '.highlight-default .highlight') var pre = div.find('pre'); // get the styles from the current theme pre.parent().parent().css('position', 'relative'); var hide_text = 'Hide the prompts and output'; var show_text = 'Show the prompts and output'; var border_width = pre.css('border-top-width'); var border_style = pre.css('border-top-style'); var border_color = pre.css('border-top-color'); var button_styles = { 'cursor':'pointer', 'position': 'absolute', 'top': '0', 'right': '0', 'border-color': border_color, 'border-style': border_style, 'border-width': border_width, 'color': border_color, 'text-size': '75%', 'font-family': 'monospace', 'padding-left': '0.2em', 'padding-right': '0.2em', 'border-radius': '0 3px 0 0' } // create and add the button to all the code blocks that contain >>> div.each(function(index) { var jthis = $(this); if (jthis.find('.gp').length > 0) { var button = $('>>>'); button.css(button_styles) button.attr('title', hide_text); button.data('hidden', 'false'); jthis.prepend(button); } // tracebacks (.gt) contain bare text elements that need to be // wrapped in a span to work with .nextUntil() (see later) jthis.find('pre:has(.gt)').contents().filter(function() { return ((this.nodeType == 3) && (this.data.trim().length > 0)); }).wrap(''); }); // define the behavior of the button when it's clicked $('.copybutton').click(function(e){ e.preventDefault(); var button = $(this); if (button.data('hidden') === 'false') { // hide the code output button.parent().find('.go, .gp, .gt').hide(); button.next('pre').find('.gt').nextUntil('.gp, .go').css('visibility', 'hidden'); button.css('text-decoration', 'line-through'); button.attr('title', show_text); button.data('hidden', 'true'); } else { // show the code output button.parent().find('.go, .gp, .gt').show(); button.next('pre').find('.gt').nextUntil('.gp, .go').css('visibility', 'visible'); button.css('text-decoration', 'none'); button.attr('title', hide_text); button.data('hidden', 'false'); } }); }); ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 
00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_linkout.svg [SVG image data omitted]
pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_linkout_20.png [binary PNG image data omitted]
pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/static/astropy_logo_32.png [binary PNG image data omitted]
pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/localtoc.html0000644000076500000240000000004213434074306032550 0ustar weaverstaff00000000000000
<h3>
Page Contents
</h3>
{{ toc }} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/themes/bootstrap-astropy/searchbox.html0000644000076500000240000000042013434074306032726 0ustar weaverstaff00000000000000{%- if pagename != "search" %}
{%- endif %} pydl-0.7.0/astropy_helpers/astropy_helpers/sphinx/setup_package.py0000644000076500000240000000044413434074306026254 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst def get_package_data(): # Install the theme files return { 'astropy_helpers.sphinx': [ 'local/*.inv', 'themes/bootstrap-astropy/*.*', 'themes/bootstrap-astropy/static/*.*']} pydl-0.7.0/astropy_helpers/astropy_helpers/commands/0000755000076500000240000000000013434104632023351 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/commands/build_ext.py0000644000076500000240000004651113434074306025715 0ustar weaverstaff00000000000000import errno import os import re import shlex import shutil import subprocess import sys import textwrap from distutils import log, ccompiler, sysconfig from distutils.core import Extension from distutils.ccompiler import get_default_compiler from setuptools.command.build_ext import build_ext as SetuptoolsBuildExt from ..utils import get_numpy_include_path, invalidate_caches, classproperty from ..version_helpers import get_pkg_version_module def should_build_with_cython(package, release=None): """Returns the previously used Cython version (or 'unknown' if not previously built) if Cython should be used to build extension modules from pyx files. If the ``release`` parameter is not specified an attempt is made to determine the release flag from `astropy.version`. """ try: version_module = __import__(package + '.cython_version', fromlist=['release', 'cython_version']) except ImportError: version_module = None if release is None and version_module is not None: try: release = version_module.release except AttributeError: pass try: cython_version = version_module.cython_version except AttributeError: cython_version = 'unknown' # Only build with Cython if, of course, Cython is installed, we're in a # development version (i.e. not release) or the Cython-generated source # files haven't been created yet (cython_version == 'unknown'). The latter # case can happen even when release is True if checking out a release tag # from the repository have_cython = False try: import Cython # noqa have_cython = True except ImportError: pass if have_cython and (not release or cython_version == 'unknown'): return cython_version else: return False _compiler_versions = {} def get_compiler_version(compiler): if compiler in _compiler_versions: return _compiler_versions[compiler] # Different flags to try to get the compiler version # TODO: It might be worth making this configurable to support # arbitrary odd compilers; though all bets may be off in such # cases anyway flags = ['--version', '--Version', '-version', '-Version', '-v', '-V'] def try_get_version(flag): process = subprocess.Popen( shlex.split(compiler, posix=('win' not in sys.platform)) + [flag], stdout=subprocess.PIPE, stderr=subprocess.PIPE) stdout, stderr = process.communicate() if process.returncode != 0: return 'unknown' output = stdout.strip().decode('latin-1') # Safest bet if not output: # Some compilers return their version info on stderr output = stderr.strip().decode('latin-1') if not output: output = 'unknown' return output for flag in flags: version = try_get_version(flag) if version != 'unknown': break # Cache results to speed up future calls _compiler_versions[compiler] = version return version # TODO: I think this can be reworked without having to create the class # programmatically. 
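# A minimal usage sketch of the two helpers above, assuming a hypothetical
# package name ('mypackage' is not part of this module).
# ``should_build_with_cython`` returns the previously recorded Cython version
# string (or 'unknown') when .pyx sources should be (re)cythonized, and False
# otherwise; ``get_compiler_version`` shells out to the named compiler with a
# series of candidate version flags and caches the stripped output.
#
#     if should_build_with_cython('mypackage', release=False):
#         print('Cython will be used to build .pyx extension modules')
#     print(get_compiler_version('gcc'))   # e.g. 'gcc (GCC) 9.4.0 ...'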
def generate_build_ext_command(packagename, release): """ Creates a custom 'build_ext' command that allows for manipulating some of the C extension options at build time. We use a function to build the class since the base class for build_ext may be different depending on certain build-time parameters (for example, we may use Cython's build_ext instead of the default version in distutils). Uses the default distutils.command.build_ext by default. """ class build_ext(SetuptoolsBuildExt, object): package_name = packagename is_release = release _user_options = SetuptoolsBuildExt.user_options[:] _boolean_options = SetuptoolsBuildExt.boolean_options[:] _help_options = SetuptoolsBuildExt.help_options[:] force_rebuild = False _broken_compiler_mapping = [ ('i686-apple-darwin[0-9]*-llvm-gcc-4.2', 'clang') ] # Warning: Spaghetti code ahead. # During setup.py, the setup_helpers module needs the ability to add # items to a command's user_options list. At this stage we don't know # whether or not we can build with Cython, and so don't know for sure # what base class will be used for build_ext; nevertheless we want to # be able to provide a list to add options into. # # Later, once setup() has been called we should have all build # dependencies included via setup_requires available. distutils needs # to be able to access the user_options as a *class* attribute before # the class has been initialized, but we do need to be able to # enumerate the options for the correct base class at that point @classproperty def user_options(cls): from distutils import core if core._setup_distribution is None: # We haven't gotten into setup() yet, and the Distribution has # not yet been initialized return cls._user_options return cls._final_class.user_options @classproperty def boolean_options(cls): # Similar to user_options above from distutils import core if core._setup_distribution is None: # We haven't gotten into setup() yet, and the Distribution has # not yet been initialized return cls._boolean_options return cls._final_class.boolean_options @classproperty def help_options(cls): # Similar to user_options above from distutils import core if core._setup_distribution is None: # We haven't gotten into setup() yet, and the Distribution has # not yet been initialized return cls._help_options return cls._final_class.help_options @classproperty(lazy=True) def _final_class(cls): """ Late determination of what the build_ext base class should be, depending on whether or not Cython is available. 
""" uses_cython = should_build_with_cython(cls.package_name, cls.is_release) if uses_cython: # We need to decide late on whether or not to use Cython's # build_ext (since Cython may not be available earlier in the # setup.py if it was brought in via setup_requires) try: from Cython.Distutils.old_build_ext import old_build_ext as base_cls except ImportError: from Cython.Distutils import build_ext as base_cls else: base_cls = SetuptoolsBuildExt # Create and return an instance of a new class based on this class # using one of the above possible base classes def merge_options(attr): base = getattr(base_cls, attr) ours = getattr(cls, '_' + attr) all_base = set(opt[0] for opt in base) return base + [opt for opt in ours if opt[0] not in all_base] boolean_options = (base_cls.boolean_options + [opt for opt in cls._boolean_options if opt not in base_cls.boolean_options]) members = dict(cls.__dict__) members.update({ 'user_options': merge_options('user_options'), 'help_options': merge_options('help_options'), 'boolean_options': boolean_options, 'uses_cython': uses_cython, }) # Update the base class for the original build_ext command build_ext.__bases__ = (base_cls, object) # Create a new class for the existing class, but now with the # appropriate base class depending on whether or not to use Cython. # Ensure that object is one of the bases to make a new-style class. return type(cls.__name__, (build_ext,), members) def __new__(cls, *args, **kwargs): # By the time the command is actually instantialized, the # Distribution instance for the build has been instantiated, which # means setup_requires has been processed--now we can determine # what base class we can use for the actual build, and return an # instance of a build_ext command that uses that base class (right # now the options being Cython.Distutils.build_ext, or the stock # setuptools build_ext) new_cls = super(build_ext, cls._final_class).__new__( cls._final_class) # Since the new cls is not a subclass of the original cls, we must # manually call its __init__ new_cls.__init__(*args, **kwargs) return new_cls def finalize_options(self): # Add a copy of the _compiler.so module as well, but only if there # are in fact C modules to compile (otherwise there's no reason to # include a record of the compiler used) # Note, self.extensions may not be set yet, but # self.distribution.ext_modules is where any extension modules # passed to setup() can be found self._adjust_compiler() extensions = self.distribution.ext_modules if extensions: build_py = self.get_finalized_command('build_py') package_dir = build_py.get_package_dir(packagename) src_path = os.path.relpath( os.path.join(os.path.dirname(__file__), 'src')) shutil.copy(os.path.join(src_path, 'compiler.c'), os.path.join(package_dir, '_compiler.c')) ext = Extension(self.package_name + '._compiler', [os.path.join(package_dir, '_compiler.c')]) extensions.insert(0, ext) super(build_ext, self).finalize_options() # Generate if self.uses_cython: try: from Cython import __version__ as cython_version except ImportError: # This shouldn't happen if we made it this far cython_version = None if (cython_version is not None and cython_version != self.uses_cython): self.force_rebuild = True # Update the used cython version self.uses_cython = cython_version # Regardless of the value of the '--force' option, force a rebuild # if the debug flag changed from the last build if self.force_rebuild: self.force = True def run(self): # For extensions that require 'numpy' in their include dirs, # replace 'numpy' with the 
actual paths np_include = None for extension in self.extensions: if 'numpy' in extension.include_dirs: if np_include is None: np_include = get_numpy_include_path() idx = extension.include_dirs.index('numpy') extension.include_dirs.insert(idx, np_include) extension.include_dirs.remove('numpy') self._check_cython_sources(extension) super(build_ext, self).run() # Update cython_version.py if building with Cython try: cython_version = get_pkg_version_module( packagename, fromlist=['cython_version'])[0] except (AttributeError, ImportError): cython_version = 'unknown' if self.uses_cython and self.uses_cython != cython_version: build_py = self.get_finalized_command('build_py') package_dir = build_py.get_package_dir(packagename) cython_py = os.path.join(package_dir, 'cython_version.py') with open(cython_py, 'w') as f: f.write('# Generated file; do not modify\n') f.write('cython_version = {0!r}\n'.format(self.uses_cython)) if os.path.isdir(self.build_lib): # The build/lib directory may not exist if the build_py # command was not previously run, which may sometimes be # the case self.copy_file(cython_py, os.path.join(self.build_lib, cython_py), preserve_mode=False) invalidate_caches() def _adjust_compiler(self): """ This function detects broken compilers and switches to another. If the environment variable CC is explicitly set, or a compiler is specified on the commandline, no override is performed -- the purpose here is to only override a default compiler. The specific compilers with problems are: * The default compiler in XCode-4.2, llvm-gcc-4.2, segfaults when compiling wcslib. The set of broken compilers can be updated by changing the compiler_mapping variable. It is a list of 2-tuples where the first in the pair is a regular expression matching the version of the broken compiler, and the second is the compiler to change to. """ if 'CC' in os.environ: # Check that CC is not set to llvm-gcc-4.2 c_compiler = os.environ['CC'] try: version = get_compiler_version(c_compiler) except OSError: msg = textwrap.dedent( """ The C compiler set by the CC environment variable: {compiler:s} cannot be found or executed. """.format(compiler=c_compiler)) log.warn(msg) sys.exit(1) for broken, fixed in self._broken_compiler_mapping: if re.match(broken, version): msg = textwrap.dedent( """Compiler specified by CC environment variable ({compiler:s}:{version:s}) will fail to compile {pkg:s}. Please set CC={fixed:s} and try again. You can do this, for example, by running: CC={fixed:s} python setup.py where is the command you ran. """.format(compiler=c_compiler, version=version, pkg=self.package_name, fixed=fixed)) log.warn(msg) sys.exit(1) # If C compiler is set via CC, and isn't broken, we are good to go. We # should definitely not try accessing the compiler specified by # ``sysconfig.get_config_var('CC')`` lower down, because this may fail # if the compiler used to compile Python is missing (and maybe this is # why the user is setting CC). For example, the official Python 2.7.3 # MacOS X binary was compiled with gcc-4.2, which is no longer available # in XCode 4. return if self.compiler is not None: # At this point, self.compiler will be set only if a compiler # was specified in the command-line or via setup.cfg, in which # case we don't do anything return compiler_type = ccompiler.get_default_compiler() if compiler_type == 'unix': # We have to get the compiler this way, as this is the one that is # used if os.environ['CC'] is not set. It is actually read in from # the Python Makefile. 
Note that this is not necessarily the same # compiler as returned by ccompiler.new_compiler() c_compiler = sysconfig.get_config_var('CC') try: version = get_compiler_version(c_compiler) except OSError: msg = textwrap.dedent( """ The C compiler used to compile Python {compiler:s}, and which is normally used to compile C extensions, is not available. You can explicitly specify which compiler to use by setting the CC environment variable, for example: CC=gcc python setup.py or if you are using MacOS X, you can try: CC=clang python setup.py """.format(compiler=c_compiler)) log.warn(msg) sys.exit(1) for broken, fixed in self._broken_compiler_mapping: if re.match(broken, version): os.environ['CC'] = fixed break def _check_cython_sources(self, extension): """ Where relevant, make sure that the .c files associated with .pyx modules are present (if building without Cython installed). """ # Determine the compiler we'll be using if self.compiler is None: compiler = get_default_compiler() else: compiler = self.compiler # Replace .pyx with C-equivalents, unless c files are missing for jdx, src in enumerate(extension.sources): base, ext = os.path.splitext(src) pyxfn = base + '.pyx' cfn = base + '.c' cppfn = base + '.cpp' if not os.path.isfile(pyxfn): continue if self.uses_cython: extension.sources[jdx] = pyxfn else: if os.path.isfile(cfn): extension.sources[jdx] = cfn elif os.path.isfile(cppfn): extension.sources[jdx] = cppfn else: msg = ( 'Could not find C/C++ file {0}.(c/cpp) for Cython ' 'file {1} when building extension {2}. Cython ' 'must be installed to build from a git ' 'checkout.'.format(base, pyxfn, extension.name)) raise IOError(errno.ENOENT, msg, cfn) # Current versions of Cython use deprecated Numpy API features # the use of which produces a few warnings when compiling. # These additional flags should squelch those warnings. # TODO: Feel free to remove this if/when a Cython update # removes use of the deprecated Numpy API if compiler == 'unix': extension.extra_compile_args.extend([ '-Wp,-w', '-Wno-unused-function']) return build_ext pydl-0.7.0/astropy_helpers/astropy_helpers/commands/register.py0000644000076500000240000000454713434074306025565 0ustar weaverstaff00000000000000from setuptools.command.register import register as SetuptoolsRegister class AstropyRegister(SetuptoolsRegister): """Extends the built in 'register' command to support a ``--hidden`` option to make the registered version hidden on PyPI by default. The result of this is that when a version is registered as "hidden" it can still be downloaded from PyPI, but it does not show up in the list of actively supported versions under http://pypi.python.org/pypi/astropy, and is not set as the most recent version. Although this can always be set through the web interface it may be more convenient to be able to specify via the 'register' command. Hidden may also be considered a safer default when running the 'register' command, though this command uses distutils' normal behavior if the ``--hidden`` option is omitted. 
""" user_options = SetuptoolsRegister.user_options + [ ('hidden', None, 'mark this release as hidden on PyPI by default') ] boolean_options = SetuptoolsRegister.boolean_options + ['hidden'] def initialize_options(self): SetuptoolsRegister.initialize_options(self) self.hidden = False def build_post_data(self, action): data = SetuptoolsRegister.build_post_data(self, action) if action == 'submit' and self.hidden: data['_pypi_hidden'] = '1' return data def _set_config(self): # The original register command is buggy--if you use .pypirc with a # server-login section *at all* the repository you specify with the -r # option will be overwritten with either the repository in .pypirc or # with the default, # If you do not have a .pypirc using the -r option will just crash. # Way to go distutils # If we don't set self.repository back to a default value _set_config # can crash if there was a user-supplied value for this option; don't # worry, we'll get the real value back afterwards self.repository = 'pypi' SetuptoolsRegister._set_config(self) options = self.distribution.get_option_dict('register') if 'repository' in options: source, value = options['repository'] # Really anything that came from setup.cfg or the command line # should override whatever was in .pypirc self.repository = value pydl-0.7.0/astropy_helpers/astropy_helpers/commands/install_lib.py0000644000076500000240000000100013434074306026212 0ustar weaverstaff00000000000000from setuptools.command.install_lib import install_lib as SetuptoolsInstallLib from ..utils import _get_platlib_dir class AstropyInstallLib(SetuptoolsInstallLib): user_options = SetuptoolsInstallLib.user_options[:] boolean_options = SetuptoolsInstallLib.boolean_options[:] def finalize_options(self): build_cmd = self.get_finalized_command('build') platlib_dir = _get_platlib_dir(build_cmd) self.build_dir = platlib_dir SetuptoolsInstallLib.finalize_options(self) pydl-0.7.0/astropy_helpers/astropy_helpers/commands/build_py.py0000644000076500000240000000265613434074306025547 0ustar weaverstaff00000000000000from setuptools.command.build_py import build_py as SetuptoolsBuildPy from ..utils import _get_platlib_dir class AstropyBuildPy(SetuptoolsBuildPy): user_options = SetuptoolsBuildPy.user_options[:] boolean_options = SetuptoolsBuildPy.boolean_options[:] def finalize_options(self): # Update build_lib settings from the build command to always put # build files in platform-specific subdirectories of build/, even # for projects with only pure-Python source (this is desirable # specifically for support of multiple Python version). 
build_cmd = self.get_finalized_command('build') platlib_dir = _get_platlib_dir(build_cmd) build_cmd.build_purelib = platlib_dir build_cmd.build_lib = platlib_dir self.build_lib = platlib_dir SetuptoolsBuildPy.finalize_options(self) def run_2to3(self, files, doctests=False): # Filter the files to exclude things that shouldn't be 2to3'd skip_2to3 = self.distribution.skip_2to3 filtered_files = [] for filename in files: for package in skip_2to3: if filename[len(self.build_lib) + 1:].startswith(package): break else: filtered_files.append(filename) SetuptoolsBuildPy.run_2to3(self, filtered_files, doctests) def run(self): # first run the normal build_py SetuptoolsBuildPy.run(self) pydl-0.7.0/astropy_helpers/astropy_helpers/commands/__init__.py0000644000076500000240000000000012664672030025456 0ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/commands/test.py0000644000076500000240000000252413434074306024711 0ustar weaverstaff00000000000000""" Different implementations of the ``./setup.py test`` command depending on what's locally available. If Astropy v1.1.0.dev or later is available it should be possible to import AstropyTest from ``astropy.tests.command``. If ``astropy`` can be imported but not ``astropy.tests.command`` (i.e. an older version of Astropy), we can use the backwards-compat implementation of the command. If Astropy can't be imported at all then there is a skeleton implementation that allows users to at least discover the ``./setup.py test`` command and learn that they need Astropy to run it. """ # Previously these except statements caught only ImportErrors, but there are # some other obscure exceptional conditions that can occur when importing # astropy.tests (at least on older versions) that can cause these imports to # fail try: import astropy # noqa try: from astropy.tests.command import AstropyTest except Exception: from ._test_compat import AstropyTest except Exception: # No astropy at all--provide the dummy implementation from ._dummy import _DummyCommand class AstropyTest(_DummyCommand): command_name = 'test' description = 'Run the tests for this package' error_msg = ( "The 'test' command requires the astropy package to be " "installed and importable.") pydl-0.7.0/astropy_helpers/astropy_helpers/commands/_dummy.py0000644000076500000240000000557413434074306025234 0ustar weaverstaff00000000000000""" Provides a base class for a 'dummy' setup.py command that has no functionality (probably due to a missing requirement). This dummy command can raise an exception when it is run, explaining to the user what dependencies must be met to use this command. The reason this is at all tricky is that we want the command to be able to provide this message even when the user passes arguments to the command. If we don't know ahead of time what arguments the command can take, this is difficult, because distutils does not allow unknown arguments to be passed to a setup.py command. This hacks around that restriction to provide a useful error message even when a user passes arguments to the dummy implementation of a command. Use this like: try: from some_dependency import SetupCommand except ImportError: from ._dummy import _DummyCommand class SetupCommand(_DummyCommand): description = \ 'Implementation of SetupCommand from some_dependency; ' 'some_dependency must be installed to run this command' # This is the message that will be raised when a user tries to # run this command--define it as a class attribute. 
error_msg = \ "The 'setup_command' command requires the some_dependency " "package to be installed and importable." """ import sys from setuptools import Command from distutils.errors import DistutilsArgError from textwrap import dedent class _DummyCommandMeta(type): """ Causes an exception to be raised on accessing attributes of a command class so that if ``./setup.py command_name`` is run with additional command-line options we can provide a useful error message instead of the default that tells users the options are unrecognized. """ def __init__(cls, name, bases, members): if bases == (Command, object): # This is the _DummyCommand base class, presumably return if not hasattr(cls, 'description'): raise TypeError( "_DummyCommand subclass must have a 'description' " "attribute.") if not hasattr(cls, 'error_msg'): raise TypeError( "_DummyCommand subclass must have an 'error_msg' " "attribute.") def __getattribute__(cls, attr): if attr in ('description', 'error_msg'): # Allow cls.description to work so that `./setup.py # --help-commands` still works return super(_DummyCommandMeta, cls).__getattribute__(attr) raise DistutilsArgError(cls.error_msg) if sys.version_info[0] < 3: exec(dedent(""" class _DummyCommand(Command, object): __metaclass__ = _DummyCommandMeta """)) else: exec(dedent(""" class _DummyCommand(Command, object, metaclass=_DummyCommandMeta): pass """)) pydl-0.7.0/astropy_helpers/astropy_helpers/commands/build_sphinx.py0000644000076500000240000002526713434074306026433 0ustar weaverstaff00000000000000from __future__ import print_function import inspect import os import pkgutil import re import shutil import subprocess import sys import textwrap import warnings from distutils import log from distutils.cmd import DistutilsOptionError import sphinx from sphinx.setup_command import BuildDoc as SphinxBuildDoc from ..utils import minversion, AstropyDeprecationWarning PY3 = sys.version_info[0] >= 3 class AstropyBuildDocs(SphinxBuildDoc): """ A version of the ``build_docs`` command that uses the version of Astropy that is built by the setup ``build`` command, rather than whatever is installed on the system. To build docs against the installed version, run ``make html`` in the ``astropy/docs`` directory. This also automatically creates the docs/_static directories--this is needed because GitHub won't create the _static dir because it has no tracked files. """ description = 'Build Sphinx documentation for Astropy environment' user_options = SphinxBuildDoc.user_options[:] user_options.append( ('warnings-returncode', 'w', 'Parses the sphinx output and sets the return code to 1 if there ' 'are any warnings. 
Note that this will cause the sphinx log to ' 'only update when it completes, rather than continuously as is ' 'normally the case.')) user_options.append( ('clean-docs', 'l', 'Completely clean previous builds, including ' 'automodapi-generated files before building new ones')) user_options.append( ('no-intersphinx', 'n', 'Skip intersphinx, even if conf.py says to use it')) user_options.append( ('open-docs-in-browser', 'o', 'Open the docs in a browser (using the webbrowser module) if the ' 'build finishes successfully.')) boolean_options = SphinxBuildDoc.boolean_options[:] boolean_options.append('warnings-returncode') boolean_options.append('clean-docs') boolean_options.append('no-intersphinx') boolean_options.append('open-docs-in-browser') _self_iden_rex = re.compile(r"self\.([^\d\W][\w]+)", re.UNICODE) def initialize_options(self): SphinxBuildDoc.initialize_options(self) self.clean_docs = False self.no_intersphinx = False self.open_docs_in_browser = False self.warnings_returncode = False def finalize_options(self): # We need to make sure this is set before finalize_options otherwise # the default is set to build/sphinx. We also need the absolute path # as in some Sphinx versions, the path is otherwise interpreted as # docs/docs/_build. if self.build_dir is None: self.build_dir = os.path.abspath('docs/_build') else: self.build_dir = os.path.abspath(self.build_dir) SphinxBuildDoc.finalize_options(self) # Clear out previous sphinx builds, if requested if self.clean_docs: dirstorm = [os.path.join(self.source_dir, 'api'), os.path.join(self.source_dir, 'generated')] if self.build_dir is None: dirstorm.append('docs/_build') else: dirstorm.append(self.build_dir) for d in dirstorm: if os.path.isdir(d): log.info('Cleaning directory ' + d) shutil.rmtree(d) else: log.info('Not cleaning directory ' + d + ' because ' 'not present or not a directory') def run(self): # TODO: Break this method up into a few more subroutines and # document them better import webbrowser if PY3: from urllib.request import pathname2url else: from urllib import pathname2url # This is used at the very end of `run` to decide if sys.exit should # be called. If it's None, it won't be. retcode = None # If possible, create the _static dir if self.build_dir is not None: # the _static dir should be in the same place as the _build dir # for Astropy basedir, subdir = os.path.split(self.build_dir) if subdir == '': # the path has a trailing /... basedir, subdir = os.path.split(basedir) staticdir = os.path.join(basedir, '_static') if os.path.isfile(staticdir): raise DistutilsOptionError( 'Attempted to build_docs in a location where' + staticdir + 'is a file. Must be a directory.') self.mkpath(staticdir) # Now make sure Astropy is built and determine where it was built build_cmd = self.reinitialize_command('build') build_cmd.inplace = 0 self.run_command('build') build_cmd = self.get_finalized_command('build') build_cmd_path = os.path.abspath(build_cmd.build_lib) ah_importer = pkgutil.get_importer('astropy_helpers') if ah_importer is None: ah_path = '.' else: ah_path = os.path.abspath(ah_importer.path) # Now generate the source for and spawn a new process that runs the # command. 
This is needed to get the correct imports for the built # version runlines, runlineno = inspect.getsourcelines(SphinxBuildDoc.run) subproccode = textwrap.dedent(""" from sphinx.setup_command import * os.chdir({srcdir!r}) sys.path.insert(0, {build_cmd_path!r}) sys.path.insert(0, {ah_path!r}) """).format(build_cmd_path=build_cmd_path, ah_path=ah_path, srcdir=self.source_dir) # runlines[1:] removes 'def run(self)' on the first line subproccode += textwrap.dedent(''.join(runlines[1:])) # All "self.foo" in the subprocess code needs to be replaced by the # values taken from the current self in *this* process subproccode = self._self_iden_rex.split(subproccode) for i in range(1, len(subproccode), 2): iden = subproccode[i] val = getattr(self, iden) if iden.endswith('_dir'): # Directories should be absolute, because the `chdir` call # in the new process moves to a different directory subproccode[i] = repr(os.path.abspath(val)) else: subproccode[i] = repr(val) subproccode = ''.join(subproccode) optcode = textwrap.dedent(""" class Namespace(object): pass self = Namespace() self.pdb = {pdb!r} self.verbosity = {verbosity!r} self.traceback = {traceback!r} """).format(pdb=getattr(self, 'pdb', False), verbosity=getattr(self, 'verbosity', 0), traceback=getattr(self, 'traceback', False)) subproccode = optcode + subproccode # This is a quick gross hack, but it ensures that the code grabbed from # SphinxBuildDoc.run will work in Python 2 if it uses the print # function if minversion(sphinx, '1.3'): subproccode = 'from __future__ import print_function' + subproccode if self.no_intersphinx: # the confoverrides variable in sphinx.setup_command.BuildDoc can # be used to override the conf.py ... but this could well break # if future versions of sphinx change the internals of BuildDoc, # so remain vigilant! subproccode = subproccode.replace( 'confoverrides = {}', 'confoverrides = {\'intersphinx_mapping\':{}}') log.debug('Starting subprocess of {0} with python code:\n{1}\n' '[CODE END])'.format(sys.executable, subproccode)) # To return the number of warnings, we need to capture stdout. This # prevents a continuous updating at the terminal, but there's no # apparent way around this. if self.warnings_returncode: proc = subprocess.Popen([sys.executable, '-c', subproccode], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) retcode = 1 with proc.stdout: for line in iter(proc.stdout.readline, b''): line = line.strip(b'\r\n') print(line.decode('utf-8')) if 'build succeeded.' == line.decode('utf-8'): retcode = 0 # Poll to set proc.retcode proc.wait() if retcode != 0: if os.environ.get('TRAVIS', None) == 'true': # this means we are in the travis build, so customize # the message appropriately. msg = ('The build_docs travis build FAILED ' 'because sphinx issued documentation ' 'warnings (scroll up to see the warnings).') else: # standard failure message msg = ('build_docs returning a non-zero exit ' 'code because sphinx issued documentation ' 'warnings.') log.warn(msg) else: proc = subprocess.Popen([sys.executable], stdin=subprocess.PIPE) proc.communicate(subproccode.encode('utf-8')) if proc.returncode == 0: if self.open_docs_in_browser: if self.builder == 'html': absdir = os.path.abspath(self.builder_target_dir) index_path = os.path.join(absdir, 'index.html') fileurl = 'file://' + pathname2url(index_path) webbrowser.open(fileurl) else: log.warn('open-docs-in-browser option was given, but ' 'the builder is not html! 
Ignoring.') else: log.warn('Sphinx Documentation subprocess failed with return ' 'code ' + str(proc.returncode)) retcode = proc.returncode if retcode is not None: # this is potentially dangerous in that there might be something # after the call to `setup` in `setup.py`, and exiting here will # prevent that from running. But there's no other apparent way # to signal what the return code should be. sys.exit(retcode) class AstropyBuildSphinx(AstropyBuildDocs): # pragma: no cover description = 'deprecated alias to the build_docs command' def run(self): warnings.warn( 'The "build_sphinx" command is now deprecated. Use' '"build_docs" instead.', AstropyDeprecationWarning) AstropyBuildDocs.run(self) pydl-0.7.0/astropy_helpers/astropy_helpers/commands/_test_compat.py0000644000076500000240000002671213434074306026420 0ustar weaverstaff00000000000000""" Old implementation of ``./setup.py test`` command. This has been moved to astropy.tests as of Astropy v1.1.0, but a copy of the implementation is kept here for backwards compatibility. """ from __future__ import absolute_import, unicode_literals import inspect import os import shutil import subprocess import sys import tempfile from setuptools import Command from ..compat import _fix_user_options PY3 = sys.version_info[0] == 3 class AstropyTest(Command, object): description = 'Run the tests for this package' user_options = [ ('package=', 'P', "The name of a specific package to test, e.g. 'io.fits' or 'utils'. " "If nothing is specified, all default tests are run."), ('test-path=', 't', 'Specify a test location by path. If a relative path to a .py file, ' 'it is relative to the built package, so e.g., a leading "astropy/" ' 'is necessary. If a relative path to a .rst file, it is relative to ' 'the directory *below* the --docs-path directory, so a leading ' '"docs/" is usually necessary. May also be an absolute path.'), ('verbose-results', 'V', 'Turn on verbose output from pytest.'), ('plugins=', 'p', 'Plugins to enable when running pytest.'), ('pastebin=', 'b', "Enable pytest pastebin output. Either 'all' or 'failed'."), ('args=', 'a', 'Additional arguments to be passed to pytest.'), ('remote-data', 'R', 'Run tests that download remote data.'), ('pep8', '8', 'Enable PEP8 checking and disable regular tests. ' 'Requires the pytest-pep8 plugin.'), ('pdb', 'd', 'Start the interactive Python debugger on errors.'), ('coverage', 'c', 'Create a coverage report. Requires the coverage package.'), ('open-files', 'o', 'Fail if any tests leave files open. Requires the ' 'psutil package.'), ('parallel=', 'j', 'Run the tests in parallel on the specified number of ' 'CPUs. If negative, all the cores on the machine will be ' 'used. Requires the pytest-xdist plugin.'), ('docs-path=', None, 'The path to the documentation .rst files. If not provided, and ' 'the current directory contains a directory called "docs", that ' 'will be used.'), ('skip-docs', None, "Don't test the documentation .rst files."), ('repeat=', None, 'How many times to repeat each test (can be used to check for ' 'sporadic failures).'), ('temp-root=', None, 'The root directory in which to create the temporary testing files. ' 'If unspecified the system default is used (e.g. 
/tmp) as explained ' 'in the documentation for tempfile.mkstemp.') ] user_options = _fix_user_options(user_options) package_name = '' def initialize_options(self): self.package = None self.test_path = None self.verbose_results = False self.plugins = None self.pastebin = None self.args = None self.remote_data = False self.pep8 = False self.pdb = False self.coverage = False self.open_files = False self.parallel = 0 self.docs_path = None self.skip_docs = False self.repeat = None self.temp_root = None def finalize_options(self): # Normally we would validate the options here, but that's handled in # run_tests pass # Most of the test runner arguments have the same name as attributes on # this command class, with one exception (for now) _test_runner_arg_attr_map = { 'verbose': 'verbose_results' } def generate_testing_command(self): """ Build a Python script to run the tests. """ cmd_pre = '' # Commands to run before the test function cmd_post = '' # Commands to run after the test function if self.coverage: pre, post = self._generate_coverage_commands() cmd_pre += pre cmd_post += post def get_attr(arg): attr = self._test_runner_arg_attr_map.get(arg, arg) return getattr(self, attr) test_args = filter(lambda arg: hasattr(self, arg), self._get_test_runner_args()) test_args = ', '.join('{0}={1!r}'.format(arg, get_attr(arg)) for arg in test_args) if PY3: set_flag = "import builtins; builtins._ASTROPY_TEST_ = True" else: set_flag = "import __builtin__; __builtin__._ASTROPY_TEST_ = True" cmd = ('{cmd_pre}{0}; import {1.package_name}, sys; result = ' '{1.package_name}.test({test_args}); {cmd_post}' 'sys.exit(result)') return cmd.format(set_flag, self, cmd_pre=cmd_pre, cmd_post=cmd_post, test_args=test_args) def _validate_required_deps(self): """ This method checks that any required modules are installed before running the tests. """ try: import astropy # noqa except ImportError: raise ImportError( "The 'test' command requires the astropy package to be " "installed and importable.") def run(self): """ Run the tests! """ # Ensure there is a doc path if self.docs_path is None: if os.path.exists('docs'): self.docs_path = os.path.abspath('docs') # Build a testing install of the package self._build_temp_install() # Ensure all required packages are installed self._validate_required_deps() # Run everything in a try: finally: so that the tmp dir gets deleted. try: # Construct this modules testing command cmd = self.generate_testing_command() # Run the tests in a subprocess--this is necessary since # new extension modules may have appeared, and this is the # easiest way to set up a new environment # On Python 3.x prior to 3.3, the creation of .pyc files # is not atomic. py.test jumps through some hoops to make # this work by parsing import statements and carefully # importing files atomically. However, it can't detect # when __import__ is used, so its carefulness still fails. # The solution here (admittedly a bit of a hack), is to # turn off the generation of .pyc files altogether by # passing the `-B` switch to `python`. This does mean # that each core will have to compile .py file to bytecode # itself, rather than getting lucky and borrowing the work # already done by another core. Compilation is an # insignificant fraction of total testing time, though, so # it's probably not worth worrying about. 
retcode = subprocess.call([sys.executable, '-B', '-c', cmd], cwd=self.testing_path, close_fds=False) finally: # Remove temporary directory shutil.rmtree(self.tmp_dir) raise SystemExit(retcode) def _build_temp_install(self): """ Build the package and copy the build to a temporary directory for the purposes of testing this avoids creating pyc and __pycache__ directories inside the build directory """ self.reinitialize_command('build', inplace=True) self.run_command('build') build_cmd = self.get_finalized_command('build') new_path = os.path.abspath(build_cmd.build_lib) # On OSX the default path for temp files is under /var, but in most # cases on OSX /var is actually a symlink to /private/var; ensure we # dereference that link, because py.test is very sensitive to relative # paths... tmp_dir = tempfile.mkdtemp(prefix=self.package_name + '-test-', dir=self.temp_root) self.tmp_dir = os.path.realpath(tmp_dir) self.testing_path = os.path.join(self.tmp_dir, os.path.basename(new_path)) shutil.copytree(new_path, self.testing_path) new_docs_path = os.path.join(self.tmp_dir, os.path.basename(self.docs_path)) shutil.copytree(self.docs_path, new_docs_path) self.docs_path = new_docs_path shutil.copy('setup.cfg', self.tmp_dir) def _generate_coverage_commands(self): """ This method creates the post and pre commands if coverage is to be generated """ if self.parallel != 0: raise ValueError( "--coverage can not be used with --parallel") try: import coverage # noqa except ImportError: raise ImportError( "--coverage requires that the coverage package is " "installed.") # Don't use get_pkg_data_filename here, because it # requires importing astropy.config and thus screwing # up coverage results for those packages. coveragerc = os.path.join( self.testing_path, self.package_name, 'tests', 'coveragerc') # We create a coveragerc that is specific to the version # of Python we're running, so that we can mark branches # as being specifically for Python 2 or Python 3 with open(coveragerc, 'r') as fd: coveragerc_content = fd.read() if PY3: ignore_python_version = '2' else: ignore_python_version = '3' coveragerc_content = coveragerc_content.replace( "{ignore_python_version}", ignore_python_version).replace( "{packagename}", self.package_name) tmp_coveragerc = os.path.join(self.tmp_dir, 'coveragerc') with open(tmp_coveragerc, 'wb') as tmp: tmp.write(coveragerc_content.encode('utf-8')) cmd_pre = ( 'import coverage; ' 'cov = coverage.coverage(data_file="{0}", config_file="{1}"); ' 'cov.start();'.format( os.path.abspath(".coverage"), tmp_coveragerc)) cmd_post = ( 'cov.stop(); ' 'from astropy.tests.helper import _save_coverage; ' '_save_coverage(cov, result, "{0}", "{1}");'.format( os.path.abspath('.'), self.testing_path)) return cmd_pre, cmd_post def _get_test_runner_args(self): """ A hack to determine what arguments are supported by the package's test() function. In the future there should be a more straightforward API to determine this (really it should be determined by the ``TestRunner`` class for whatever version of Astropy is in use). 
""" if PY3: import builtins builtins._ASTROPY_TEST_ = True else: import __builtin__ __builtin__._ASTROPY_TEST_ = True try: pkg = __import__(self.package_name) if not hasattr(pkg, 'test'): raise ImportError( 'package {0} does not have a {0}.test() function as ' 'required by the Astropy test runner'.format(self.package_name)) argspec = inspect.getargspec(pkg.test) return argspec.args finally: if PY3: del builtins._ASTROPY_TEST_ else: del __builtin__._ASTROPY_TEST_ pydl-0.7.0/astropy_helpers/astropy_helpers/commands/install.py0000644000076500000240000000074613434074306025404 0ustar weaverstaff00000000000000from setuptools.command.install import install as SetuptoolsInstall from ..utils import _get_platlib_dir class AstropyInstall(SetuptoolsInstall): user_options = SetuptoolsInstall.user_options[:] boolean_options = SetuptoolsInstall.boolean_options[:] def finalize_options(self): build_cmd = self.get_finalized_command('build') platlib_dir = _get_platlib_dir(build_cmd) self.build_lib = platlib_dir SetuptoolsInstall.finalize_options(self) pydl-0.7.0/astropy_helpers/astropy_helpers/commands/setup_package.py0000644000076500000240000000017013434074306026540 0ustar weaverstaff00000000000000from os.path import join def get_package_data(): return {'astropy_helpers.commands': [join('src', 'compiler.c')]} pydl-0.7.0/astropy_helpers/astropy_helpers/commands/src/0000755000076500000240000000000013434104632024140 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/commands/src/compiler.c0000644000076500000240000000573113434074306026130 0ustar weaverstaff00000000000000#include /*************************************************************************** * Macros for determining the compiler version. * * These are borrowed from boost, and majorly abridged to include only * the compilers we care about. ***************************************************************************/ #ifndef PY3K #if PY_MAJOR_VERSION >= 3 #define PY3K 1 #else #define PY3K 0 #endif #endif #define STRINGIZE(X) DO_STRINGIZE(X) #define DO_STRINGIZE(X) #X #if defined __clang__ /* Clang C++ emulates GCC, so it has to appear early. 
*/ # define COMPILER "Clang version " __clang_version__ #elif defined(__INTEL_COMPILER) || defined(__ICL) || defined(__ICC) || defined(__ECC) /* Intel */ # if defined(__INTEL_COMPILER) # define INTEL_VERSION __INTEL_COMPILER # elif defined(__ICL) # define INTEL_VERSION __ICL # elif defined(__ICC) # define INTEL_VERSION __ICC # elif defined(__ECC) # define INTEL_VERSION __ECC # endif # define COMPILER "Intel C compiler version " STRINGIZE(INTEL_VERSION) #elif defined(__GNUC__) /* gcc */ # define COMPILER "GCC version " __VERSION__ #elif defined(__SUNPRO_CC) /* Sun Workshop Compiler */ # define COMPILER "Sun compiler version " STRINGIZE(__SUNPRO_CC) #elif defined(_MSC_VER) /* Microsoft Visual C/C++ Must be last since other compilers define _MSC_VER for compatibility as well */ # if _MSC_VER < 1200 # define COMPILER_VERSION 5.0 # elif _MSC_VER < 1300 # define COMPILER_VERSION 6.0 # elif _MSC_VER == 1300 # define COMPILER_VERSION 7.0 # elif _MSC_VER == 1310 # define COMPILER_VERSION 7.1 # elif _MSC_VER == 1400 # define COMPILER_VERSION 8.0 # elif _MSC_VER == 1500 # define COMPILER_VERSION 9.0 # elif _MSC_VER == 1600 # define COMPILER_VERSION 10.0 # else # define COMPILER_VERSION _MSC_VER # endif # define COMPILER "Microsoft Visual C++ version " STRINGIZE(COMPILER_VERSION) #else /* Fallback */ # define COMPILER "Unknown compiler" #endif /*************************************************************************** * Module-level ***************************************************************************/ struct module_state { /* The Sun compiler can't handle empty structs */ #if defined(__SUNPRO_C) || defined(_MSC_VER) int _dummy; #endif }; #if PY3K static struct PyModuleDef moduledef = { PyModuleDef_HEAD_INIT, "_compiler", NULL, sizeof(struct module_state), NULL, NULL, NULL, NULL, NULL }; #define INITERROR return NULL PyMODINIT_FUNC PyInit__compiler(void) #else #define INITERROR return PyMODINIT_FUNC init_compiler(void) #endif { PyObject* m; #if PY3K m = PyModule_Create(&moduledef); #else m = Py_InitModule3("_compiler", NULL, NULL); #endif if (m == NULL) INITERROR; PyModule_AddStringConstant(m, "compiler", COMPILER); #if PY3K return m; #endif } pydl-0.7.0/astropy_helpers/astropy_helpers/test_helpers.py0000644000076500000240000000100313434074306024621 0ustar weaverstaff00000000000000from __future__ import (absolute_import, division, print_function, unicode_literals) import warnings from .commands.test import AstropyTest # noqa # Leaving this module here for now, but really it needn't exist # (and it's doubtful that any code depends on it anymore) warnings.warn('The astropy_helpers.test_helpers module is deprecated as ' 'of version 1.1.0; the AstropyTest command can be found in ' 'astropy_helpers.commands.test.', DeprecationWarning) pydl-0.7.0/astropy_helpers/astropy_helpers/extern/0000755000076500000240000000000013434104632023055 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/extern/__init__.py0000644000076500000240000000111513434074306025170 0ustar weaverstaff00000000000000# The ``astropy_helpers.extern`` sub-module includes modules developed elsewhere # that are bundled here for convenience. At the moment, this consists of the # following two sphinx extensions: # # * `numpydoc `_, a Sphinx extension # developed as part of the Numpy project. This is used to parse docstrings # in Numpy format # # * `sphinx-automodapi `_, a Sphinx # developed as part of the Astropy project. This used to be developed directly # in ``astropy-helpers`` but is now a standalone package. 
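#
# A minimal sketch of how these bundled extensions can be wired into a Sphinx
# build, assuming the module paths below (they follow the package layout
# described above, but are an assumption rather than a documented API):
#
#     extensions = [
#         'astropy_helpers.extern.numpydoc',
#         'astropy_helpers.extern.automodapi.automodapi',
#     ]
#
# In practice an affiliated package usually pulls these in through a shared
# Sphinx configuration rather than listing them by hand.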
pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/0000755000076500000240000000000013434104632025217 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/automodsumm.py0000644000076500000240000006541213434074306030157 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This directive will produce an "autosummary"-style table for public attributes of a specified module. See the `sphinx.ext.autosummary`_ extension for details on this process. The main difference from the `autosummary`_ directive is that `autosummary`_ requires manually inputting all attributes that appear in the table, while this captures the entries automatically. This directive requires a single argument that must be a module or package. It also accepts any options supported by the `autosummary`_ directive- see `sphinx.ext.autosummary`_ for details. It also accepts some additional options: * ``:classes-only:`` If present, the autosummary table will only contain entries for classes. This cannot be used at the same time with ``:functions-only:`` or ``:variables-only:``. * ``:functions-only:`` If present, the autosummary table will only contain entries for functions. This cannot be used at the same time with ``:classes-only:`` or ``:variables-only:``. * ``:variables-only:`` If present, the autosummary table will only contain entries for variables (everything except functions and classes). This cannot be used at the same time with ``:classes-only:`` or ``:functions-only:``. * ``:skip: obj1, [obj2, obj3, ...]`` If present, specifies that the listed objects should be skipped and not have their documentation generated, nor be included in the summary table. * ``:allowed-package-names: pkgormod1, [pkgormod2, pkgormod3, ...]`` Specifies the packages that functions/classes documented here are allowed to be from, as comma-separated list of package names. If not given, only objects that are actually in a subpackage of the package currently being documented are included. * ``:inherited-members:`` or ``:no-inherited-members:`` The global sphinx configuration option ``automodsumm_inherited_members`` decides if members that a class inherits from a base class are included in the generated documentation. The flags ``:inherited-members:`` or ``:no-inherited-members:`` allows overrriding this global setting. This extension also adds two sphinx configuration options: * ``automodsumm_writereprocessed`` Should be a bool, and if ``True``, will cause `automodsumm`_ to write files with any ``automodsumm`` sections replaced with the content Sphinx processes after ``automodsumm`` has run. The output files are not actually used by sphinx, so this option is only for figuring out the cause of sphinx warnings or other debugging. Defaults to ``False``. * ``automodsumm_inherited_members`` Should be a bool and if ``True``, will cause `automodsumm`_ to document class members that are inherited from a base class. This value can be overriden for any particular automodsumm directive by including the ``:inherited-members:`` or ``:no-inherited-members:`` options. Defaults to ``False``. .. _sphinx.ext.autosummary: http://sphinx-doc.org/latest/ext/autosummary.html .. _autosummary: http://sphinx-doc.org/latest/ext/autosummary.html#directive-autosummary .. _automod-diagram: automod-diagram directive ========================= This directive will produce an inheritance diagram like that of the `sphinx.ext.inheritance_diagram`_ extension. 
This directive requires a single argument that must be a module or package. It accepts no options. .. note:: Like 'inheritance-diagram', 'automod-diagram' requires `graphviz `_ to generate the inheritance diagram. .. _sphinx.ext.inheritance_diagram: http://sphinx-doc.org/latest/ext/inheritance.html """ import abc import inspect import os import re import io from distutils.version import LooseVersion from sphinx import __version__ from sphinx.ext.autosummary import Autosummary from sphinx.ext.inheritance_diagram import InheritanceDiagram from docutils.parsers.rst.directives import flag from .utils import find_mod_objs, cleanup_whitespace __all__ = ['Automoddiagram', 'Automodsumm', 'automodsumm_to_autosummary_lines', 'generate_automodsumm_docs', 'process_automodsumm_generation'] SPHINX_LT_16 = LooseVersion(__version__) < LooseVersion('1.6') SPHINX_LT_17 = LooseVersion(__version__) < LooseVersion('1.7') def _str_list_converter(argument): """ A directive option conversion function that converts the option into a list of strings. Used for 'skip' option. """ if argument is None: return [] else: return [s.strip() for s in argument.split(',')] class Automodsumm(Autosummary): required_arguments = 1 optional_arguments = 0 final_argument_whitespace = False has_content = False option_spec = dict(Autosummary.option_spec) option_spec['functions-only'] = flag option_spec['classes-only'] = flag option_spec['variables-only'] = flag option_spec['skip'] = _str_list_converter option_spec['allowed-package-names'] = _str_list_converter option_spec['inherited-members'] = flag option_spec['no-inherited-members'] = flag def run(self): env = self.state.document.settings.env modname = self.arguments[0] self.warnings = [] nodelist = [] try: localnames, fqns, objs = find_mod_objs(modname) except ImportError: self.warnings = [] self.warn("Couldn't import module " + modname) return self.warnings try: # set self.content to trick the autosummary internals. # Be sure to respect functions-only and classes-only. funconly = 'functions-only' in self.options clsonly = 'classes-only' in self.options varonly = 'variables-only' in self.options if [clsonly, funconly, varonly].count(True) > 1: self.warning('more than one of functions-only, classes-only, ' 'or variables-only defined. Ignoring.') clsonly = funconly = varonly = False skipnames = [] if 'skip' in self.options: option_skipnames = set(self.options['skip']) for lnm in localnames: if lnm in option_skipnames: option_skipnames.remove(lnm) skipnames.append(lnm) if len(option_skipnames) > 0: self.warn('Tried to skip objects {objs} in module {mod}, ' 'but they were not present. Ignoring.' .format(objs=option_skipnames, mod=modname)) if funconly: cont = [] for nm, obj in zip(localnames, objs): if nm not in skipnames and inspect.isroutine(obj): cont.append(nm) elif clsonly: cont = [] for nm, obj in zip(localnames, objs): if nm not in skipnames and inspect.isclass(obj): cont.append(nm) elif varonly: cont = [] for nm, obj in zip(localnames, objs): if nm not in skipnames and not (inspect.isclass(obj) or inspect.isroutine(obj)): cont.append(nm) else: cont = [nm for nm in localnames if nm not in skipnames] self.content = cont # for some reason, even though ``currentmodule`` is substituted in, # sphinx doesn't necessarily recognize this fact. 
So we just force # it internally, and that seems to fix things env.temp_data['py:module'] = modname env.ref_context['py:module'] = modname # can't use super because Sphinx/docutils has trouble return # super(Autosummary,self).run() nodelist.extend(Autosummary.run(self)) return self.warnings + nodelist finally: # has_content = False for the Automodsumm self.content = [] def get_items(self, names): self.genopt['imported-members'] = True return Autosummary.get_items(self, names) # <-------------------automod-diagram stuff-----------------------------------> class Automoddiagram(InheritanceDiagram): option_spec = dict(InheritanceDiagram.option_spec) option_spec['allowed-package-names'] = _str_list_converter option_spec['skip'] = _str_list_converter def run(self): try: ols = self.options.get('allowed-package-names', []) ols = True if len(ols) == 0 else ols # if none are given, assume only local nms, objs = find_mod_objs(self.arguments[0], onlylocals=ols)[1:] except ImportError: self.warnings = [] self.warn("Couldn't import module " + self.arguments[0]) return self.warnings # Check if some classes should be skipped skip = self.options.get('skip', []) clsnms = [] for n, o in zip(nms, objs): if n.split('.')[-1] in skip: continue if inspect.isclass(o): clsnms.append(n) oldargs = self.arguments try: if len(clsnms) > 0: self.arguments = [' '.join(clsnms)] return InheritanceDiagram.run(self) finally: self.arguments = oldargs # <---------------------automodsumm generation stuff--------------------------> def process_automodsumm_generation(app): env = app.builder.env filestosearch = [] for docname in env.found_docs: filename = env.doc2path(docname) if os.path.isfile(filename): filestosearch.append(docname + os.path.splitext(filename)[1]) liness = [] for sfn in filestosearch: lines = automodsumm_to_autosummary_lines(sfn, app) liness.append(lines) if app.config.automodsumm_writereprocessed: if lines: # empty list means no automodsumm entry is in the file outfn = os.path.join(app.srcdir, sfn) + '.automodsumm' with open(outfn, 'w') as f: for l in lines: f.write(l) f.write('\n') for sfn, lines in zip(filestosearch, liness): suffix = os.path.splitext(sfn)[1] if len(lines) > 0: generate_automodsumm_docs( lines, sfn, app=app, builder=app.builder, suffix=suffix, base_path=app.srcdir, inherited_members=app.config.automodsumm_inherited_members) # _automodsummrex = re.compile(r'^(\s*)\.\. automodsumm::\s*([A-Za-z0-9_.]+)\s*' # r'\n\1(\s*)(\S|$)', re.MULTILINE) _lineendrex = r'(?:\n|$)' _hdrex = r'^\n?(\s*)\.\. automodsumm::\s*(\S+)\s*' + _lineendrex _oprex1 = r'(?:\1(\s+)\S.*' + _lineendrex + ')' _oprex2 = r'(?:\1\4\S.*' + _lineendrex + ')' _automodsummrex = re.compile(_hdrex + '(' + _oprex1 + '?' + _oprex2 + '*)', re.MULTILINE) def automodsumm_to_autosummary_lines(fn, app): """ Generates lines from a file with an "automodsumm" entry suitable for feeding into "autosummary". Searches the provided file for `automodsumm` directives and returns a list of lines specifying the `autosummary` commands for the modules requested. This does *not* return the whole file contents - just an autosummary section in place of any :automodsumm: entries. Note that any options given for `automodsumm` are also included in the generated `autosummary` section. Parameters ---------- fn : str The name of the file to search for `automodsumm` entries. 
app : sphinx.application.Application The sphinx Application object Returns ------- lines : list of str Lines for all `automodsumm` entries with the entries replaced by `autosummary` and the module's members added. """ fullfn = os.path.join(app.builder.env.srcdir, fn) with io.open(fullfn, encoding='utf8') as fr: # Note: we use __name__ here instead of just writing the module name in # case this extension is bundled into another package from . import automodapi try: extensions = app.extensions except AttributeError: # Sphinx <1.6 extensions = app._extensions if automodapi.__name__ in extensions: # Must do the automodapi on the source to get the automodsumm # that might be in there docname = os.path.splitext(fn)[0] filestr = automodapi.automodapi_replace(fr.read(), app, True, docname, False) else: filestr = fr.read() spl = _automodsummrex.split(filestr) # 0th entry is the stuff before the first automodsumm line indent1s = spl[1::5] mods = spl[2::5] opssecs = spl[3::5] indent2s = spl[4::5] remainders = spl[5::5] # only grab automodsumm sections and convert them to autosummary with the # entries for all the public objects newlines = [] # loop over all automodsumms in this document for i, (i1, i2, modnm, ops, rem) in enumerate(zip(indent1s, indent2s, mods, opssecs, remainders)): allindent = i1 + (' ' if i2 is None else i2) # filter out functions-only, classes-only, and ariables-only # options if present. oplines = ops.split('\n') toskip = [] allowedpkgnms = [] funcsonly = clssonly = varsonly = False for i, ln in reversed(list(enumerate(oplines))): if ':functions-only:' in ln: funcsonly = True del oplines[i] if ':classes-only:' in ln: clssonly = True del oplines[i] if ':variables-only:' in ln: varsonly = True del oplines[i] if ':skip:' in ln: toskip.extend(_str_list_converter(ln.replace(':skip:', ''))) del oplines[i] if ':allowed-package-names:' in ln: allowedpkgnms.extend(_str_list_converter(ln.replace(':allowed-package-names:', ''))) del oplines[i] if [funcsonly, clssonly, varsonly].count(True) > 1: msg = ('Defined more than one of functions-only, classes-only, ' 'and variables-only. Skipping this directive.') lnnum = sum([spl[j].count('\n') for j in range(i * 5 + 1)]) app.warn('[automodsumm]' + msg, (fn, lnnum)) continue # Use the currentmodule directive so we can just put the local names # in the autosummary table. Note that this doesn't always seem to # actually "take" in Sphinx's eyes, so in `Automodsumm.run`, we have to # force it internally, as well. newlines.extend([i1 + '.. currentmodule:: ' + modnm, '', '.. autosummary::']) newlines.extend(oplines) ols = True if len(allowedpkgnms) == 0 else allowedpkgnms for nm, fqn, obj in zip(*find_mod_objs(modnm, onlylocals=ols)): if nm in toskip: continue if funcsonly and not inspect.isroutine(obj): continue if clssonly and not inspect.isclass(obj): continue if varsonly and (inspect.isclass(obj) or inspect.isroutine(obj)): continue newlines.append(allindent + nm) # add one newline at the end of the autosummary block newlines.append('') return newlines def generate_automodsumm_docs(lines, srcfn, app=None, suffix='.rst', base_path=None, builder=None, template_dir=None, inherited_members=False): """ This function is adapted from `sphinx.ext.autosummary.generate.generate_autosummmary_docs` to generate source for the automodsumm directives that should be autosummarized. Unlike generate_autosummary_docs, this function is called one file at a time. 
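    A minimal illustrative call, with hypothetical file names (within this package the
    function is normally driven by `process_automodsumm_generation` on the Sphinx
    ``builder-inited`` event)::

        lines = automodsumm_to_autosummary_lines('index.rst', app)
        generate_automodsumm_docs(lines, 'index.rst', app=app, builder=app.builder,
                                  suffix='.rst', base_path=app.srcdir,
                                  inherited_members=app.config.automodsumm_inherited_members)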
""" from sphinx.jinja2glue import BuiltinTemplateLoader from sphinx.ext.autosummary import import_by_name, get_documenter from sphinx.util.osutil import ensuredir from sphinx.util.inspect import safe_getattr from jinja2 import FileSystemLoader, TemplateNotFound from jinja2.sandbox import SandboxedEnvironment from .utils import find_autosummary_in_lines_for_automodsumm as find_autosummary_in_lines if SPHINX_LT_16: info = app.info warn = app.warn else: from sphinx.util import logging logger = logging.getLogger(__name__) info = logger.info warn = logger.warning # info('[automodsumm] generating automodsumm for: ' + srcfn) # Create our own templating environment - here we use Astropy's # templates rather than the default autosummary templates, in order to # allow docstrings to be shown for methods. template_dirs = [os.path.join(os.path.dirname(__file__), 'templates'), os.path.join(base_path, '_templates')] if builder is not None: # allow the user to override the templates template_loader = BuiltinTemplateLoader() template_loader.init(builder, dirs=template_dirs) else: if template_dir: template_dirs.insert(0, template_dir) template_loader = FileSystemLoader(template_dirs) template_env = SandboxedEnvironment(loader=template_loader) # read # items = find_autosummary_in_files(sources) items = find_autosummary_in_lines(lines, filename=srcfn) if len(items) > 0: msg = '[automodsumm] {1}: found {0} automodsumm entries to generate' info(msg.format(len(items), srcfn)) # gennms = [item[0] for item in items] # if len(gennms) > 20: # gennms = gennms[:10] + ['...'] + gennms[-10:] # info('[automodsumm] generating autosummary for: ' + ', '.join(gennms)) # remove possible duplicates items = list(set(items)) # keep track of new files new_files = [] # write for name, path, template_name, inherited_mem in sorted(items): if path is None: # The corresponding autosummary:: directive did not have # a :toctree: option continue path = os.path.abspath(os.path.join(base_path, path)) ensuredir(path) try: import_by_name_values = import_by_name(name) except ImportError as e: warn('[automodsumm] failed to import %r: %s' % (name, e)) continue # if block to accommodate Sphinx's v1.2.2 and v1.2.3 respectively if len(import_by_name_values) == 3: name, obj, parent = import_by_name_values elif len(import_by_name_values) == 4: name, obj, parent, module_name = import_by_name_values fn = os.path.join(path, name + suffix) # skip it if it exists if os.path.isfile(fn): continue new_files.append(fn) f = open(fn, 'w') try: if SPHINX_LT_17: doc = get_documenter(obj, parent) else: doc = get_documenter(app, obj, parent) if template_name is not None: template = template_env.get_template(template_name) else: tmplstr = 'autosummary_core/%s.rst' try: template = template_env.get_template(tmplstr % doc.objtype) except TemplateNotFound: template = template_env.get_template(tmplstr % 'base') def get_members_mod(obj, typ, include_public=[]): """ typ = None -> all """ items = [] for name in dir(obj): try: if SPHINX_LT_17: documenter = get_documenter(safe_getattr(obj, name), obj) else: documenter = get_documenter(app, safe_getattr(obj, name), obj) except AttributeError: continue if typ is None or documenter.objtype == typ: items.append(name) public = [x for x in items if x in include_public or not x.startswith('_')] return public, items def get_members_class(obj, typ, include_public=[], include_base=False): """ typ = None -> all include_base -> include attrs that are from a base class """ items = [] # using dir gets all of the attributes, including the 
elements # from the base class, otherwise use __slots__ or __dict__ if include_base: names = dir(obj) else: # Classes deriving from an ABC using the `abc` module will # have an empty `__slots__` attribute in Python 3, unless # other slots were declared along the inheritance chain. If # the ABC-derived class has empty slots, we'll use the # class `__dict__` instead. declares_slots = ( hasattr(obj, '__slots__') and not (type(obj) is abc.ABCMeta and len(obj.__slots__) == 0) ) if declares_slots: names = tuple(getattr(obj, '__slots__')) else: names = getattr(obj, '__dict__').keys() for name in names: try: if SPHINX_LT_17: documenter = get_documenter(safe_getattr(obj, name), obj) else: documenter = get_documenter(app, safe_getattr(obj, name), obj) except AttributeError: continue if typ is None or documenter.objtype == typ: items.append(name) public = [x for x in items if x in include_public or not x.startswith('_')] return public, items ns = {} if doc.objtype == 'module': ns['members'] = get_members_mod(obj, None) ns['functions'], ns['all_functions'] = \ get_members_mod(obj, 'function') ns['classes'], ns['all_classes'] = \ get_members_mod(obj, 'class') ns['exceptions'], ns['all_exceptions'] = \ get_members_mod(obj, 'exception') elif doc.objtype == 'class': if inherited_mem is not None: # option set in this specifc directive include_base = inherited_mem else: # use default value include_base = inherited_members api_class_methods = ['__init__', '__call__'] ns['members'] = get_members_class(obj, None, include_base=include_base) ns['methods'], ns['all_methods'] = \ get_members_class(obj, 'method', api_class_methods, include_base=include_base) ns['attributes'], ns['all_attributes'] = \ get_members_class(obj, 'attribute', include_base=include_base) ns['methods'].sort() ns['attributes'].sort() parts = name.split('.') if doc.objtype in ('method', 'attribute'): mod_name = '.'.join(parts[:-2]) cls_name = parts[-2] obj_name = '.'.join(parts[-2:]) ns['class'] = cls_name else: mod_name, obj_name = '.'.join(parts[:-1]), parts[-1] ns['fullname'] = name ns['module'] = mod_name ns['objname'] = obj_name ns['name'] = parts[-1] ns['objtype'] = doc.objtype ns['underline'] = len(obj_name) * '=' # We now check whether a file for reference footnotes exists for # the module being documented. We first check if the # current module is a file or a directory, as this will give a # different path for the reference file. For example, if # documenting astropy.wcs then the reference file is at # ../wcs/references.txt, while if we are documenting # astropy.config.logging_helper (which is at # astropy/config/logging_helper.py) then the reference file is set # to ../config/references.txt if '.' in mod_name: mod_name_dir = mod_name.replace('.', '/').split('/', 1)[1] else: mod_name_dir = mod_name if not os.path.isdir(os.path.join(base_path, mod_name_dir)) \ and os.path.isdir(os.path.join(base_path, mod_name_dir.rsplit('/', 1)[0])): mod_name_dir = mod_name_dir.rsplit('/', 1)[0] # We then have to check whether it exists, and if so, we pass it # to the template. 
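            # Worked example of the fallback above (hypothetical paths): for
            # mod_name = 'astropy.config.logging_helper', mod_name_dir starts out as
            # 'config/logging_helper'; since that path is not a directory under
            # base_path, it falls back to 'config', and the file checked below is
            # <base_path>/config/references.txt.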
if os.path.exists(os.path.join(base_path, mod_name_dir, 'references.txt')): # An important subtlety here is that the path we pass in has # to be relative to the file being generated, so we have to # figure out the right number of '..'s ndirsback = path.replace(base_path, '').count('/') ref_file_rel_segments = ['..'] * ndirsback ref_file_rel_segments.append(mod_name_dir) ref_file_rel_segments.append('references.txt') ns['referencefile'] = os.path.join(*ref_file_rel_segments) rendered = template.render(**ns) f.write(cleanup_whitespace(rendered)) finally: f.close() def setup(app): # need autodoc fixes # Note: we use __name__ here instead of just writing the module name in # case this extension is bundled into another package from . import autodoc_enhancements app.setup_extension(autodoc_enhancements.__name__) # need inheritance-diagram for automod-diagram app.setup_extension('sphinx.ext.inheritance_diagram') app.add_directive('automod-diagram', Automoddiagram) app.add_directive('automodsumm', Automodsumm) app.connect('builder-inited', process_automodsumm_generation) app.add_config_value('automodsumm_writereprocessed', False, True) app.add_config_value('automodsumm_inherited_members', False, 'env') return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/automodapi.py0000644000076500000240000003677313434074306027757 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This directive takes a single argument that must be a module or package. It will produce a block of documentation that includes the docstring for the package, an :ref:`automodsumm` directive, and an :ref:`automod-diagram` if there are any classes in the module. If only the main docstring of the module/package is desired in the documentation, use `automodule`_ instead of `automodapi`_. It accepts the following options: * ``:include-all-objects:`` If present, include not just functions and classes, but all objects. This includes variables, for which a possible docstring after the variable definition will be shown. * ``:no-inheritance-diagram:`` If present, the inheritance diagram will not be shown even if the module/package has classes. * ``:skip: str`` This option results in the specified object being skipped, that is the object will *not* be included in the generated documentation. This option may appear any number of times to skip multiple objects. * ``:no-main-docstr:`` If present, the docstring for the module/package will not be generated. The function and class tables will still be used, however. * ``:headings: str`` Specifies the characters (in one string) used as the heading levels used for the generated section. This must have at least 2 characters (any after 2 will be ignored). This also *must* match the rest of the documentation on this page for sphinx to be happy. Defaults to "-^", which matches the convention used for Python's documentation, assuming the automodapi call is inside a top-level section (which usually uses '='). * ``:no-heading:`` If specified do not create a top level heading for the section. That is, do not create a title heading with text like "packagename Package". The actual docstring for the package/module will still be shown, though, unless ``:no-main-docstr:`` is given. * ``:allowed-package-names: str`` Specifies the packages that functions/classes documented here are allowed to be from, as comma-separated list of package names. 
If not given, only objects that are actually in a subpackage of the package currently being documented are included. * ``:inherited-members:`` / ``:no-inherited-members:`` The global sphinx configuration option ``automodsumm_inherited_members`` decides if members that a class inherits from a base class are included in the generated documentation. The option ``:inherited-members:`` or ``:no-inherited-members:`` allows the user to overrride the global setting. This extension also adds three sphinx configuration options: * ``automodapi_toctreedirnm`` This must be a string that specifies the name of the directory the automodsumm generated documentation ends up in. This directory path should be relative to the documentation root (e.g., same place as ``index.rst``). Defaults to ``'api'``. * ``automodapi_writereprocessed`` Should be a bool, and if `True`, will cause `automodapi`_ to write files with any `automodapi`_ sections replaced with the content Sphinx processes after `automodapi`_ has run. The output files are not actually used by sphinx, so this option is only for figuring out the cause of sphinx warnings or other debugging. Defaults to `False`. * ``automodsumm_inherited_members`` Should be a bool and if ``True`` members that a class inherits from a base class are included in the generated documentation. Defaults to ``False``. .. _automodule: http://sphinx-doc.org/latest/ext/autodoc.html?highlight=automodule#directive-automodule """ # Implementation note: # The 'automodapi' directive is not actually implemented as a docutils # directive. Instead, this extension searches for the 'automodapi' text in # all sphinx documents, and replaces it where necessary from a template built # into this extension. This is necessary because automodsumm (and autosummary) # use the "builder-inited" event, which comes before the directives are # actually built. import inspect import io import os import re import sys from .utils import find_mod_objs if sys.version_info[0] == 3: text_type = str else: text_type = unicode automod_templ_modheader = """ {modname} {pkgormod} {modhds}{pkgormodhds} {automoduleline} """ automod_templ_classes = """ Classes {clshds} .. automodsumm:: {modname} :classes-only: {clsfuncoptions} """ automod_templ_funcs = """ Functions {funchds} .. automodsumm:: {modname} :functions-only: {clsfuncoptions} """ automod_templ_vars = """ Variables {otherhds} .. automodsumm:: {modname} :variables-only: {clsfuncoptions} """ automod_templ_inh = """ Class Inheritance Diagram {clsinhsechds} .. automod-diagram:: {modname} :private-bases: :parts: 1 {allowedpkgnms} {skip} """ _automodapirex = re.compile(r'^(?:\.\.\s+automodapi::\s*)([A-Za-z0-9_.]+)' r'\s*$((?:\n\s+:[a-zA-Z_\-]+:.*$)*)', flags=re.MULTILINE) # the last group of the above regex is intended to go into finall with the below _automodapiargsrex = re.compile(r':([a-zA-Z_\-]+):(.*)$', flags=re.MULTILINE) def automodapi_replace(sourcestr, app, dotoctree=True, docname=None, warnings=True): """ Replaces `sourcestr`'s entries of ".. automdapi::" with the automodapi template form based on provided options. This is used with the sphinx event 'source-read' to replace `automodapi`_ entries before sphinx actually processes them, as automodsumm needs the code to be present to generate stub documentation. Parameters ---------- sourcestr : str The string with sphinx source to be checked for automodapi replacement. app : `sphinx.application.Application` The sphinx application. dotoctree : bool If `True`, a ":toctree:" option will be added in the ".. 
automodsumm::" sections of the template, pointing to the appropriate "generated" directory based on the Astropy convention (e.g. in ``docs/api``) docname : str The name of the file for this `sourcestr` (if known - if not, it can be `None`). If not provided and `dotoctree` is `True`, the generated files may end up in the wrong place. warnings : bool If `False`, all warnings that would normally be issued are silenced. Returns ------- newstr :str The string with automodapi entries replaced with the correct sphinx markup. """ spl = _automodapirex.split(sourcestr) if len(spl) > 1: # automodsumm is in this document # Use app.srcdir because api folder should be inside source folder not # at folder where sphinx is run. if dotoctree: toctreestr = ':toctree: ' api_dir = os.path.join(app.srcdir, app.config.automodapi_toctreedirnm) if docname is None: doc_path = '.' else: doc_path = os.path.join(app.srcdir, docname) toctreestr += os.path.relpath(api_dir, os.path.dirname(doc_path)) else: toctreestr = '' newstrs = [spl[0]] for grp in range(len(spl) // 3): modnm = spl[grp * 3 + 1] # find where this is in the document for warnings if docname is None: location = None else: location = (docname, spl[0].count('\n')) # initialize default options toskip = [] inhdiag = maindocstr = top_head = True hds = '-^' allowedpkgnms = [] allowothers = False # look for actual options unknownops = [] inherited_members = None for opname, args in _automodapiargsrex.findall(spl[grp * 3 + 2]): if opname == 'skip': toskip.append(args.strip()) elif opname == 'no-inheritance-diagram': inhdiag = False elif opname == 'no-main-docstr': maindocstr = False elif opname == 'headings': hds = args elif opname == 'no-heading': top_head = False elif opname == 'allowed-package-names': allowedpkgnms.append(args.strip()) elif opname == 'inherited-members': inherited_members = True elif opname == 'no-inherited-members': inherited_members = False elif opname == 'include-all-objects': allowothers = True else: unknownops.append(opname) # join all the allowedpkgnms if len(allowedpkgnms) == 0: allowedpkgnms = '' onlylocals = True else: allowedpkgnms = ':allowed-package-names: ' + ','.join(allowedpkgnms) onlylocals = allowedpkgnms # get the two heading chars if len(hds) < 2: msg = 'Not enough headings (got {0}, need 2), using default -^' if warnings: app.warn(msg.format(len(hds)), location) hds = '-^' h1, h2 = hds.lstrip()[:2] # tell sphinx that the remaining args are invalid. if len(unknownops) > 0 and app is not None: opsstrs = ','.join(unknownops) msg = 'Found additional options ' + opsstrs + ' in automodapi.' if warnings: app.warn(msg, location) ispkg, hascls, hasfuncs, hasother = _mod_info( modnm, toskip, onlylocals=onlylocals) # add automodule directive only if no-main-docstr isn't present if maindocstr: automodline = '.. automodule:: {modname}'.format(modname=modnm) else: automodline = '' if top_head: newstrs.append(automod_templ_modheader.format( modname=modnm, modhds=h1 * len(modnm), pkgormod='Package' if ispkg else 'Module', pkgormodhds=h1 * (8 if ispkg else 7), automoduleline=automodline)) # noqa else: newstrs.append(automod_templ_modheader.format( modname='', modhds='', pkgormod='', pkgormodhds='', automoduleline=automodline)) # construct the options for the class/function sections # start out indented at 4 spaces, but need to keep the indentation. 
clsfuncoptions = [] if toctreestr: clsfuncoptions.append(toctreestr) if toskip: clsfuncoptions.append(':skip: ' + ','.join(toskip)) if allowedpkgnms: clsfuncoptions.append(allowedpkgnms) if hascls: # This makes no sense unless there are classes. if inherited_members is True: clsfuncoptions.append(':inherited-members:') if inherited_members is False: clsfuncoptions.append(':no-inherited-members:') clsfuncoptionstr = '\n '.join(clsfuncoptions) if hasfuncs: newstrs.append(automod_templ_funcs.format( modname=modnm, funchds=h2 * 9, clsfuncoptions=clsfuncoptionstr)) if hascls: newstrs.append(automod_templ_classes.format( modname=modnm, clshds=h2 * 7, clsfuncoptions=clsfuncoptionstr)) if allowothers and hasother: newstrs.append(automod_templ_vars.format( modname=modnm, otherhds=h2 * 9, clsfuncoptions=clsfuncoptionstr)) if inhdiag and hascls: # add inheritance diagram if any classes are in the module if toskip: clsskip = ':skip: ' + ','.join(toskip) else: clsskip = '' diagram_entry = automod_templ_inh.format( modname=modnm, clsinhsechds=h2 * 25, allowedpkgnms=allowedpkgnms, skip=clsskip) diagram_entry = diagram_entry.replace(' \n', '') newstrs.append(diagram_entry) newstrs.append(spl[grp * 3 + 3]) newsourcestr = ''.join(newstrs) if app.config.automodapi_writereprocessed: # sometimes they are unicode, sometimes not, depending on how # sphinx has processed things if isinstance(newsourcestr, text_type): ustr = newsourcestr else: ustr = newsourcestr.decode(app.config.source_encoding) if docname is None: with io.open(os.path.join(app.srcdir, 'unknown.automodapi'), 'a', encoding='utf8') as f: f.write(u'\n**NEW DOC**\n\n') f.write(ustr) else: env = app.builder.env # Determine the filename associated with this doc (specifically # the extension) filename = docname + os.path.splitext(env.doc2path(docname))[1] filename += '.automodapi' with io.open(os.path.join(app.srcdir, filename), 'w', encoding='utf8') as f: f.write(ustr) return newsourcestr else: return sourcestr def _mod_info(modname, toskip=[], onlylocals=True): """ Determines if a module is a module or a package and whether or not it has classes or functions. """ hascls = hasfunc = hasother = False for localnm, fqnm, obj in zip(*find_mod_objs(modname, onlylocals=onlylocals)): if localnm not in toskip: hascls = hascls or inspect.isclass(obj) hasfunc = hasfunc or inspect.isroutine(obj) hasother = hasother or (not inspect.isclass(obj) and not inspect.isroutine(obj)) if hascls and hasfunc and hasother: break # find_mod_objs has already imported modname # TODO: There is probably a cleaner way to do this, though this is pretty # reliable for all Python versions for most cases that we care about. pkg = sys.modules[modname] ispkg = (hasattr(pkg, '__file__') and isinstance(pkg.__file__, str) and os.path.split(pkg.__file__)[1].startswith('__init__.py')) return ispkg, hascls, hasfunc, hasother def process_automodapi(app, docname, source): source[0] = automodapi_replace(source[0], app, True, docname) def setup(app): app.setup_extension('sphinx.ext.autosummary') # Note: we use __name__ here instead of just writing the module name in # case this extension is bundled into another package from . 
import automodsumm app.setup_extension(automodsumm.__name__) app.connect('source-read', process_automodapi) app.add_config_value('automodapi_toctreedirnm', 'api', True) app.add_config_value('automodapi_writereprocessed', False, True) return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/__init__.py0000644000076500000240000000002413434074306027330 0ustar weaverstaff00000000000000__version__ = '0.9' pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/smart_resolver.py0000644000076500000240000001002013434074306030635 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ The classes in the astropy docs are documented by their API location, which is not necessarily where they are defined in the source. This causes a problem when certain automated features of the doc build, such as the inheritance diagrams or the `Bases` list of a class reference a class by its canonical location rather than its "user" location. In the `autodoc-process-docstring` event, a mapping from the actual name to the API name is maintained. Later, in the `missing-reference` event, unresolved references are looked up in this dictionary and corrected if possible. """ from docutils.nodes import literal, reference def process_docstring(app, what, name, obj, options, lines): if isinstance(obj, type): env = app.env if not hasattr(env, 'class_name_mapping'): env.class_name_mapping = {} mapping = env.class_name_mapping mapping[obj.__module__ + '.' + obj.__name__] = name def merge_mapping(app, env, docnames, env_other): if not hasattr(env_other, 'class_name_mapping'): return if not hasattr(env, 'class_name_mapping'): env.class_name_mapping = {} env.class_name_mapping.update(env_other.class_name_mapping) def missing_reference_handler(app, env, node, contnode): if not hasattr(env, 'class_name_mapping'): env.class_name_mapping = {} mapping = env.class_name_mapping reftype = node['reftype'] reftarget = node['reftarget'] if reftype in ('obj', 'class', 'exc', 'meth'): reftarget = node['reftarget'] suffix = '' if reftarget not in mapping: if '.' in reftarget: front, suffix = reftarget.rsplit('.', 1) else: suffix = reftarget if suffix.startswith('_') and not suffix.startswith('__'): # If this is a reference to a hidden class or method, # we can't link to it, but we don't want to have a # nitpick warning. return node[0].deepcopy() if reftype in ('obj', 'meth') and '.' in reftarget: if front in mapping: reftarget = front suffix = '.' + suffix if (reftype in ('class', ) and '.' in reftarget and reftarget not in mapping): if '.' in front: reftarget, _ = front.rsplit('.', 1) suffix = '.' 
+ suffix reftarget = reftarget + suffix prefix = reftarget.rsplit('.')[0] inventory = env.intersphinx_named_inventory if (reftarget not in mapping and prefix in inventory): if reftarget in inventory[prefix]['py:class']: newtarget = inventory[prefix]['py:class'][reftarget][2] if not node['refexplicit'] and \ '~' not in node.rawsource: contnode = literal(text=reftarget) newnode = reference('', '', internal=True) newnode['reftitle'] = reftarget newnode['refuri'] = newtarget newnode.append(contnode) return newnode if reftarget in mapping: newtarget = mapping[reftarget] + suffix if not node['refexplicit'] and '~' not in node.rawsource: contnode = literal(text=newtarget) newnode = env.domains['py'].resolve_xref( env, node['refdoc'], app.builder, 'class', newtarget, node, contnode) if newnode is not None: newnode['reftitle'] = reftarget return newnode def setup(app): app.connect('autodoc-process-docstring', process_docstring) app.connect('missing-reference', missing_reference_handler) app.connect('env-merge-info', merge_mapping) return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/autodoc_enhancements.py0000644000076500000240000001242613434074306031770 0ustar weaverstaff00000000000000""" Miscellaneous enhancements to help autodoc along. """ import inspect import sys import types import sphinx from distutils.version import LooseVersion from sphinx.ext.autodoc import AttributeDocumenter, ModuleDocumenter from sphinx.util.inspect import isdescriptor if sys.version_info[0] == 3: class_types = (type,) else: class_types = (type, types.ClassType) SPHINX_LT_15 = (LooseVersion(sphinx.__version__) < LooseVersion('1.5')) MethodDescriptorType = type(type.__subclasses__) # See # https://github.com/astropy/astropy-helpers/issues/116#issuecomment-71254836 # for further background on this. def type_object_attrgetter(obj, attr, *defargs): """ This implements an improved attrgetter for type objects (i.e. classes) that can handle class attributes that are implemented as properties on a metaclass. Normally `getattr` on a class with a `property` (say, "foo"), would return the `property` object itself. However, if the class has a metaclass which *also* defines a `property` named "foo", ``getattr(cls, 'foo')`` will find the "foo" property on the metaclass and resolve it. For the purposes of autodoc we just want to document the "foo" property defined on the class, not on the metaclass. For example:: >>> class Meta(type): ... @property ... def foo(cls): ... return 'foo' ... >>> class MyClass(metaclass=Meta): ... @property ... def foo(self): ... \"\"\"Docstring for MyClass.foo property.\"\"\" ... return 'myfoo' ... >>> getattr(MyClass, 'foo') 'foo' >>> type_object_attrgetter(MyClass, 'foo') >>> type_object_attrgetter(MyClass, 'foo').__doc__ 'Docstring for MyClass.foo property.' The last line of the example shows the desired behavior for the purposes of autodoc. """ for base in obj.__mro__: if attr in base.__dict__: if isinstance(base.__dict__[attr], property): # Note, this should only be used for properties--for any other # type of descriptor (classmethod, for example) this can mess # up existing expectations of what getattr(cls, ...) 
returns return base.__dict__[attr] break return getattr(obj, attr, *defargs) if SPHINX_LT_15: # Provided to work around a bug in Sphinx # See https://github.com/sphinx-doc/sphinx/pull/1843 class AttributeDocumenter(AttributeDocumenter): @classmethod def can_document_member(cls, member, membername, isattr, parent): non_attr_types = cls.method_types + class_types + \ (MethodDescriptorType,) isdatadesc = isdescriptor(member) and not \ isinstance(member, non_attr_types) and not \ type(member).__name__ == "instancemethod" # That last condition addresses an obscure case of C-defined # methods using a deprecated type in Python 3, that is not # otherwise exported anywhere by Python return isdatadesc or (not isinstance(parent, ModuleDocumenter) and not inspect.isroutine(member) and not isinstance(member, class_types)) def setup(app): # Must have the autodoc extension set up first so we can override it app.setup_extension('sphinx.ext.autodoc') # Need to import this too since it re-registers all the documenter types # =_= import sphinx.ext.autosummary.generate app.add_autodoc_attrgetter(type, type_object_attrgetter) if sphinx.version_info < (1, 4, 2): # this is a really ugly hack to supress a warning that sphinx 1.4 # generates when overriding an existing directive (which is *desired* # behavior here). As of sphinx v1.4.2, this has been fixed: # https://github.com/sphinx-doc/sphinx/issues/2451 # But we leave it in for 1.4.0/1.4.1 . But if the "needs_sphinx" is # eventually updated to >= 1.4.2, this should be removed entirely (in # favor of the line in the "else" clause) _oldwarn = app._warning _oldwarncount = app._warncount try: try: # *this* is in a try/finally because we don't want to force six as # a real dependency. In sphinx 1.4, six is a prerequisite, so # there's no issue. But in older sphinxes this may not be true... # but the inderlying warning is absent anyway so we let it slide. from six import StringIO app._warning = StringIO() except ImportError: pass app.add_autodocumenter(AttributeDocumenter) finally: app._warning = _oldwarn app._warncount = _oldwarncount else: suppress_warnigns_orig = app.config.suppress_warnings[:] if 'app.add_directive' not in app.config.suppress_warnings: app.config.suppress_warnings.append('app.add_directive') try: app.add_autodocumenter(AttributeDocumenter) finally: app.config.suppress_warnings = suppress_warnigns_orig return {'parallel_read_safe': True, 'parallel_write_safe': True} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/utils.py0000644000076500000240000001574413434074306026750 0ustar weaverstaff00000000000000import inspect import sys import re import os from warnings import warn from sphinx.ext.autosummary.generate import find_autosummary_in_docstring if sys.version_info[0] >= 3: def iteritems(dictionary): return dictionary.items() else: def iteritems(dictionary): return dictionary.iteritems() # We use \n instead of os.linesep because even on Windows, the generated files # use \n as the newline character. SPACE_NEWLINE = ' \n' SINGLE_NEWLINE = '\n' DOUBLE_NEWLINE = '\n\n' TRIPLE_NEWLINE = '\n\n\n' def cleanup_whitespace(text): """ Make sure there are never more than two consecutive newlines, and that there are no trailing whitespaces. 
""" # Get rid of overall leading/trailing whitespace text = text.strip() + '\n' # Get rid of trailing whitespace on each line while SPACE_NEWLINE in text: text = text.replace(SPACE_NEWLINE, SINGLE_NEWLINE) # Avoid too many consecutive newlines while TRIPLE_NEWLINE in text: text = text.replace(TRIPLE_NEWLINE, DOUBLE_NEWLINE) return text def find_mod_objs(modname, onlylocals=False): """ Returns all the public attributes of a module referenced by name. .. note:: The returned list *not* include subpackages or modules of `modname`,nor does it include private attributes (those that beginwith '_' or are not in `__all__`). Parameters ---------- modname : str The name of the module to search. onlylocals : bool If True, only attributes that are either members of `modname` OR one of its modules or subpackages will be included. Returns ------- localnames : list of str A list of the names of the attributes as they are named in the module `modname` . fqnames : list of str A list of the full qualified names of the attributes (e.g., ``astropy.utils.misc.find_mod_objs``). For attributes that are simple variables, this is based on the local name, but for functions or classes it can be different if they are actually defined elsewhere and just referenced in `modname`. objs : list of objects A list of the actual attributes themselves (in the same order as the other arguments) """ __import__(modname) mod = sys.modules[modname] if hasattr(mod, '__all__'): pkgitems = [(k, mod.__dict__[k]) for k in mod.__all__] else: pkgitems = [(k, mod.__dict__[k]) for k in dir(mod) if k[0] != '_'] # filter out modules and pull the names and objs out ismodule = inspect.ismodule localnames = [k for k, v in pkgitems if not ismodule(v)] objs = [v for k, v in pkgitems if not ismodule(v)] # fully qualified names can be determined from the object's module fqnames = [] for obj, lnm in zip(objs, localnames): if hasattr(obj, '__module__') and hasattr(obj, '__name__'): fqnames.append(obj.__module__ + '.' + obj.__name__) else: fqnames.append(modname + '.' + lnm) if onlylocals: valids = [fqn.startswith(modname) for fqn in fqnames] localnames = [e for i, e in enumerate(localnames) if valids[i]] fqnames = [e for i, e in enumerate(fqnames) if valids[i]] objs = [e for i, e in enumerate(objs) if valids[i]] return localnames, fqnames, objs def find_autosummary_in_lines_for_automodsumm(lines, module=None, filename=None): """Find out what items appear in autosummary:: directives in the given lines. Returns a list of (name, toctree, template, inherited_members) where *name* is a name of an object and *toctree* the :toctree: path of the corresponding autosummary directive (relative to the root of the file name), *template* the value of the :template: option, and *inherited_members* is the value of the :inherited-members: option. *toctree*, *template*, and *inherited_members* are ``None`` if the directive does not have the corresponding options set. .. note:: This is a slightly modified version of ``sphinx.ext.autosummary.generate.find_autosummary_in_lines`` which recognizes the ``inherited-members`` option. 
""" autosummary_re = re.compile(r'^(\s*)\.\.\s+autosummary::\s*') automodule_re = re.compile( r'^\s*\.\.\s+automodule::\s*([A-Za-z0-9_.]+)\s*$') module_re = re.compile( r'^\s*\.\.\s+(current)?module::\s*([a-zA-Z0-9_.]+)\s*$') autosummary_item_re = re.compile(r'^\s+(~?[_a-zA-Z][a-zA-Z0-9_.]*)\s*.*?') toctree_arg_re = re.compile(r'^\s+:toctree:\s*(.*?)\s*$') template_arg_re = re.compile(r'^\s+:template:\s*(.*?)\s*$') inherited_members_arg_re = re.compile(r'^\s+:inherited-members:\s*$') no_inherited_members_arg_re = re.compile(r'^\s+:no-inherited-members:\s*$') documented = [] toctree = None template = None inherited_members = None current_module = module in_autosummary = False base_indent = "" for line in lines: if in_autosummary: m = toctree_arg_re.match(line) if m: toctree = m.group(1) if filename: toctree = os.path.join(os.path.dirname(filename), toctree) continue m = template_arg_re.match(line) if m: template = m.group(1).strip() continue m = inherited_members_arg_re.match(line) if m: inherited_members = True continue m = no_inherited_members_arg_re.match(line) if m: inherited_members = False continue if line.strip().startswith(':'): warn(line) continue # skip options m = autosummary_item_re.match(line) if m: name = m.group(1).strip() if name.startswith('~'): name = name[1:] if current_module and \ not name.startswith(current_module + '.'): name = "%s.%s" % (current_module, name) documented.append((name, toctree, template, inherited_members)) continue if not line.strip() or line.startswith(base_indent + " "): continue in_autosummary = False m = autosummary_re.match(line) if m: in_autosummary = True base_indent = m.group(1) toctree = None template = None inherited_members = None continue m = automodule_re.search(line) if m: current_module = m.group(1).strip() # recurse into the automodule docstring documented.extend(find_autosummary_in_docstring( current_module, filename=filename)) continue m = module_re.match(line) if m: current_module = m.group(2) continue return documented pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/templates/0000755000076500000240000000000013434104632027215 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/templates/autosummary_core/0000755000076500000240000000000013434104632032613 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/templates/autosummary_core/class.rst0000644000076500000240000000221113434074306034452 0ustar weaverstaff00000000000000{% if referencefile %} .. include:: {{ referencefile }} {% endif %} {{ objname }} {{ underline }} .. currentmodule:: {{ module }} .. autoclass:: {{ objname }} :show-inheritance: {% if '__init__' in methods %} {% set caught_result = methods.remove('__init__') %} {% endif %} {% block attributes_summary %} {% if attributes %} .. rubric:: Attributes Summary .. autosummary:: {% for item in attributes %} ~{{ name }}.{{ item }} {%- endfor %} {% endif %} {% endblock %} {% block methods_summary %} {% if methods %} .. rubric:: Methods Summary .. autosummary:: {% for item in methods %} ~{{ name }}.{{ item }} {%- endfor %} {% endif %} {% endblock %} {% block attributes_documentation %} {% if attributes %} .. rubric:: Attributes Documentation {% for item in attributes %} .. autoattribute:: {{ item }} {%- endfor %} {% endif %} {% endblock %} {% block methods_documentation %} {% if methods %} .. rubric:: Methods Documentation {% for item in methods %} .. 
automethod:: {{ item }} {%- endfor %} {% endif %} {% endblock %} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/templates/autosummary_core/base.rst0000644000076500000240000000025213434074306034262 0ustar weaverstaff00000000000000{% if referencefile %} .. include:: {{ referencefile }} {% endif %} {{ objname }} {{ underline }} .. currentmodule:: {{ module }} .. auto{{ objtype }}:: {{ objname }} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/automodapi/templates/autosummary_core/module.rst0000644000076500000240000000127713434074306034645 0ustar weaverstaff00000000000000{% if referencefile %} .. include:: {{ referencefile }} {% endif %} {{ objname }} {{ underline }} .. automodule:: {{ fullname }} {% block functions %} {% if functions %} .. rubric:: Functions .. autosummary:: {% for item in functions %} {{ item }} {%- endfor %} {% endif %} {% endblock %} {% block classes %} {% if classes %} .. rubric:: Classes .. autosummary:: {% for item in classes %} {{ item }} {%- endfor %} {% endif %} {% endblock %} {% block exceptions %} {% if exceptions %} .. rubric:: Exceptions .. autosummary:: {% for item in exceptions %} {{ item }} {%- endfor %} {% endif %} {% endblock %} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/0000755000076500000240000000000013434104632024713 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/__init__.py0000644000076500000240000000030213434074306027023 0ustar weaverstaff00000000000000from __future__ import division, absolute_import, print_function __version__ = '0.8.0' def setup(app, *args, **kwargs): from .numpydoc import setup return setup(app, *args, **kwargs) pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/docscrape_sphinx.py0000644000076500000240000003547013434074306030636 0ustar weaverstaff00000000000000from __future__ import division, absolute_import, print_function import sys import re import inspect import textwrap import pydoc import collections import os from jinja2 import FileSystemLoader from jinja2.sandbox import SandboxedEnvironment import sphinx from sphinx.jinja2glue import BuiltinTemplateLoader from .docscrape import NumpyDocString, FunctionDoc, ClassDoc if sys.version_info[0] >= 3: sixu = lambda s: s else: sixu = lambda s: unicode(s, 'unicode_escape') IMPORT_MATPLOTLIB_RE = r'\b(import +matplotlib|from +matplotlib +import)\b' class SphinxDocString(NumpyDocString): def __init__(self, docstring, config={}): NumpyDocString.__init__(self, docstring, config=config) self.load_config(config) def load_config(self, config): self.use_plots = config.get('use_plots', False) self.use_blockquotes = config.get('use_blockquotes', False) self.class_members_toctree = config.get('class_members_toctree', True) self.template = config.get('template', None) if self.template is None: template_dirs = [os.path.join(os.path.dirname(__file__), 'templates')] template_loader = FileSystemLoader(template_dirs) template_env = SandboxedEnvironment(loader=template_loader) self.template = template_env.get_template('numpydoc_docstring.rst') # string conversion routines def _str_header(self, name, symbol='`'): return ['.. 
rubric:: ' + name, ''] def _str_field_list(self, name): return [':' + name + ':'] def _str_indent(self, doc, indent=4): out = [] for line in doc: out += [' '*indent + line] return out def _str_signature(self): return [''] if self['Signature']: return ['``%s``' % self['Signature']] + [''] else: return [''] def _str_summary(self): return self['Summary'] + [''] def _str_extended_summary(self): return self['Extended Summary'] + [''] def _str_returns(self, name='Returns'): typed_fmt = '**%s** : %s' untyped_fmt = '**%s**' out = [] if self[name]: out += self._str_field_list(name) out += [''] for param, param_type, desc in self[name]: if param_type: out += self._str_indent([typed_fmt % (param.strip(), param_type)]) else: out += self._str_indent([untyped_fmt % param.strip()]) if desc and self.use_blockquotes: out += [''] elif not desc: desc = ['..'] out += self._str_indent(desc, 8) out += [''] return out def _process_param(self, param, desc, fake_autosummary): """Determine how to display a parameter Emulates autosummary behavior if fake_autosummary Parameters ---------- param : str The name of the parameter desc : list of str The parameter description as given in the docstring. This is ignored when autosummary logic applies. fake_autosummary : bool If True, autosummary-style behaviour will apply for params that are attributes of the class and have a docstring. Returns ------- display_param : str The marked up parameter name for display. This may include a link to the corresponding attribute's own documentation. desc : list of str A list of description lines. This may be identical to the input ``desc``, if ``autosum is None`` or ``param`` is not a class attribute, or it will be a summary of the class attribute's docstring. Notes ----- This does not have the autosummary functionality to display a method's signature, and hence is not used to format methods. It may be complicated to incorporate autosummary's signature mangling, as it relies on Sphinx's plugin mechanism. """ param = param.strip() # XXX: If changing the following, please check the rendering when param # ends with '_', e.g. 'word_' # See https://github.com/numpy/numpydoc/pull/144 display_param = '**%s**' % param if not fake_autosummary: return display_param, desc param_obj = getattr(self._obj, param, None) if not (callable(param_obj) or isinstance(param_obj, property) or inspect.isgetsetdescriptor(param_obj)): param_obj = None obj_doc = pydoc.getdoc(param_obj) if not (param_obj and obj_doc): return display_param, desc prefix = getattr(self, '_name', '') if prefix: autosum_prefix = '~%s.' % prefix link_prefix = '%s.' % prefix else: autosum_prefix = '' link_prefix = '' # Referenced object has a docstring display_param = ':obj:`%s <%s%s>`' % (param, link_prefix, param) if obj_doc: # Overwrite desc. Take summary logic of autosummary desc = re.split('\n\s*\n', obj_doc.strip(), 1)[0] # XXX: Should this have DOTALL? # It does not in autosummary m = re.search(r"^([A-Z].*?\.)(?:\s|$)", ' '.join(desc.split())) if m: desc = m.group(1).strip() else: desc = desc.partition('\n')[0] desc = desc.split('\n') return display_param, desc def _str_param_list(self, name, fake_autosummary=False): """Generate RST for a listing of parameters or similar Parameter names are displayed as bold text, and descriptions are in blockquotes. Descriptions may therefore contain block markup as well. Parameters ---------- name : str Section name (e.g. 
Parameters) fake_autosummary : bool When True, the parameter names may correspond to attributes of the object beign documented, usually ``property`` instances on a class. In this case, names will be linked to fuller descriptions. Returns ------- rst : list of str """ out = [] if self[name]: out += self._str_field_list(name) out += [''] for param, param_type, desc in self[name]: display_param, desc = self._process_param(param, desc, fake_autosummary) if param_type: out += self._str_indent(['%s : %s' % (display_param, param_type)]) else: out += self._str_indent([display_param]) if desc and self.use_blockquotes: out += [''] elif not desc: # empty definition desc = ['..'] out += self._str_indent(desc, 8) out += [''] return out @property def _obj(self): if hasattr(self, '_cls'): return self._cls elif hasattr(self, '_f'): return self._f return None def _str_member_list(self, name): """ Generate a member listing, autosummary:: table where possible, and a table where not. """ out = [] if self[name]: out += ['.. rubric:: %s' % name, ''] prefix = getattr(self, '_name', '') if prefix: prefix = '~%s.' % prefix autosum = [] others = [] for param, param_type, desc in self[name]: param = param.strip() # Check if the referenced member can have a docstring or not param_obj = getattr(self._obj, param, None) if not (callable(param_obj) or isinstance(param_obj, property) or inspect.isdatadescriptor(param_obj)): param_obj = None if param_obj and pydoc.getdoc(param_obj): # Referenced object has a docstring autosum += [" %s%s" % (prefix, param)] else: others.append((param, param_type, desc)) if autosum: out += ['.. autosummary::'] if self.class_members_toctree: out += [' :toctree:'] out += [''] + autosum if others: maxlen_0 = max(3, max([len(x[0]) + 4 for x in others])) hdr = sixu("=") * maxlen_0 + sixu(" ") + sixu("=") * 10 fmt = sixu('%%%ds %%s ') % (maxlen_0,) out += ['', '', hdr] for param, param_type, desc in others: desc = sixu(" ").join(x.strip() for x in desc).strip() if param_type: desc = "(%s) %s" % (param_type, desc) out += [fmt % ("**" + param.strip() + "**", desc)] out += [hdr] out += [''] return out def _str_section(self, name): out = [] if self[name]: out += self._str_header(name) content = textwrap.dedent("\n".join(self[name])).split("\n") out += content out += [''] return out def _str_see_also(self, func_role): out = [] if self['See Also']: see_also = super(SphinxDocString, self)._str_see_also(func_role) out = ['.. seealso::', ''] out += self._str_indent(see_also[2:]) return out def _str_warnings(self): out = [] if self['Warnings']: out = ['.. warning::', ''] out += self._str_indent(self['Warnings']) out += [''] return out def _str_index(self): idx = self['index'] out = [] if len(idx) == 0: return out out += ['.. index:: %s' % idx.get('default', '')] for section, references in idx.items(): if section == 'default': continue elif section == 'refguide': out += [' single: %s' % (', '.join(references))] else: out += [' %s: %s' % (section, ','.join(references))] out += [''] return out def _str_references(self): out = [] if self['References']: out += self._str_header('References') if isinstance(self['References'], str): self['References'] = [self['References']] out.extend(self['References']) out += [''] # Latex collects all references to a separate bibliography, # so we need to insert links to it if sphinx.__version__ >= "0.6": out += ['.. only:: latex', ''] else: out += ['.. latexonly::', ''] items = [] for line in self['References']: m = re.match(r'.. 
\[([a-z0-9._-]+)\]', line, re.I) if m: items.append(m.group(1)) out += [' ' + ", ".join(["[%s]_" % item for item in items]), ''] return out def _str_examples(self): examples_str = "\n".join(self['Examples']) if (self.use_plots and re.search(IMPORT_MATPLOTLIB_RE, examples_str) and 'plot::' not in examples_str): out = [] out += self._str_header('Examples') out += ['.. plot::', ''] out += self._str_indent(self['Examples']) out += [''] return out else: return self._str_section('Examples') def __str__(self, indent=0, func_role="obj"): ns = { 'signature': self._str_signature(), 'index': self._str_index(), 'summary': self._str_summary(), 'extended_summary': self._str_extended_summary(), 'parameters': self._str_param_list('Parameters'), 'returns': self._str_returns('Returns'), 'yields': self._str_returns('Yields'), 'other_parameters': self._str_param_list('Other Parameters'), 'raises': self._str_param_list('Raises'), 'warns': self._str_param_list('Warns'), 'warnings': self._str_warnings(), 'see_also': self._str_see_also(func_role), 'notes': self._str_section('Notes'), 'references': self._str_references(), 'examples': self._str_examples(), 'attributes': self._str_param_list('Attributes', fake_autosummary=True), 'methods': self._str_member_list('Methods'), } ns = dict((k, '\n'.join(v)) for k, v in ns.items()) rendered = self.template.render(**ns) return '\n'.join(self._str_indent(rendered.split('\n'), indent)) class SphinxFunctionDoc(SphinxDocString, FunctionDoc): def __init__(self, obj, doc=None, config={}): self.load_config(config) FunctionDoc.__init__(self, obj, doc=doc, config=config) class SphinxClassDoc(SphinxDocString, ClassDoc): def __init__(self, obj, doc=None, func_doc=None, config={}): self.load_config(config) ClassDoc.__init__(self, obj, doc=doc, func_doc=None, config=config) class SphinxObjDoc(SphinxDocString): def __init__(self, obj, doc=None, config={}): self._f = obj self.load_config(config) SphinxDocString.__init__(self, doc, config=config) def get_doc_object(obj, what=None, doc=None, config={}, builder=None): if what is None: if inspect.isclass(obj): what = 'class' elif inspect.ismodule(obj): what = 'module' elif isinstance(obj, collections.Callable): what = 'function' else: what = 'object' template_dirs = [os.path.join(os.path.dirname(__file__), 'templates')] if builder is not None: template_loader = BuiltinTemplateLoader() template_loader.init(builder, dirs=template_dirs) else: template_loader = FileSystemLoader(template_dirs) template_env = SandboxedEnvironment(loader=template_loader) config['template'] = template_env.get_template('numpydoc_docstring.rst') if what == 'class': return SphinxClassDoc(obj, func_doc=SphinxFunctionDoc, doc=doc, config=config) elif what in ('function', 'method'): return SphinxFunctionDoc(obj, doc=doc, config=config) else: if doc is None: doc = pydoc.getdoc(obj) return SphinxObjDoc(obj, doc, config=config) pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/numpydoc.py0000644000076500000240000002620313434074306027132 0ustar weaverstaff00000000000000""" ======== numpydoc ======== Sphinx extension that handles docstrings in the Numpy standard format. [1] It will: - Convert Parameters etc. sections to field lists. - Convert See Also section to a See also entry. - Renumber references. - Extract the signature from the docstring, if it can't be determined otherwise. .. 
[1] https://github.com/numpy/numpy/blob/master/doc/HOWTO_DOCUMENT.rst.txt """ from __future__ import division, absolute_import, print_function import sys import re import pydoc import inspect import collections import hashlib from docutils.nodes import citation, Text import sphinx from sphinx.addnodes import pending_xref, desc_content if sphinx.__version__ < '1.0.1': raise RuntimeError("Sphinx 1.0.1 or newer is required") from .docscrape_sphinx import get_doc_object, SphinxDocString from . import __version__ if sys.version_info[0] >= 3: sixu = lambda s: s else: sixu = lambda s: unicode(s, 'unicode_escape') HASH_LEN = 12 def rename_references(app, what, name, obj, options, lines): # decorate reference numbers so that there are no duplicates # these are later undecorated in the doctree, in relabel_references references = set() for line in lines: line = line.strip() m = re.match(sixu('^.. \\[(%s)\\]') % app.config.numpydoc_citation_re, line, re.I) if m: references.add(m.group(1)) if references: # we use a hash to mangle the reference name to avoid invalid names sha = hashlib.sha256() sha.update(name.encode('utf8')) prefix = 'R' + sha.hexdigest()[:HASH_LEN] for r in references: new_r = prefix + '-' + r for i, line in enumerate(lines): lines[i] = lines[i].replace(sixu('[%s]_') % r, sixu('[%s]_') % new_r) lines[i] = lines[i].replace(sixu('.. [%s]') % r, sixu('.. [%s]') % new_r) def _ascend(node, cls): while node and not isinstance(node, cls): node = node.parent return node def relabel_references(app, doc): # Change 'hash-ref' to 'ref' in label text for citation_node in doc.traverse(citation): if _ascend(citation_node, desc_content) is None: # no desc node in ancestry -> not in a docstring # XXX: should we also somehow check it's in a References section? continue label_node = citation_node[0] prefix, _, new_label = label_node[0].astext().partition('-') assert len(prefix) == HASH_LEN + 1 new_text = Text(new_label) label_node.replace(label_node[0], new_text) for id in citation_node['backrefs']: ref = doc.ids[id] ref_text = ref[0] # Sphinx has created pending_xref nodes with [reftext] text. def matching_pending_xref(node): return (isinstance(node, pending_xref) and node[0].astext() == '[%s]' % ref_text) for xref_node in ref.parent.traverse(matching_pending_xref): xref_node.replace(xref_node[0], Text('[%s]' % new_text)) ref.replace(ref_text, new_text.copy()) DEDUPLICATION_TAG = ' !! processed by numpydoc !!' def mangle_docstrings(app, what, name, obj, options, lines): if DEDUPLICATION_TAG in lines: return cfg = {'use_plots': app.config.numpydoc_use_plots, 'use_blockquotes': app.config.numpydoc_use_blockquotes, 'show_class_members': app.config.numpydoc_show_class_members, 'show_inherited_class_members': app.config.numpydoc_show_inherited_class_members, 'class_members_toctree': app.config.numpydoc_class_members_toctree} u_NL = sixu('\n') if what == 'module': # Strip top title pattern = '^\\s*[#*=]{4,}\\n[a-z0-9 -]+\\n[#*=]{4,}\\s*' title_re = re.compile(sixu(pattern), re.I | re.S) lines[:] = title_re.sub(sixu(''), u_NL.join(lines)).split(u_NL) else: doc = get_doc_object(obj, what, u_NL.join(lines), config=cfg, builder=app.builder) if sys.version_info[0] >= 3: doc = str(doc) else: doc = unicode(doc) lines[:] = doc.split(u_NL) if (app.config.numpydoc_edit_link and hasattr(obj, '__name__') and obj.__name__): if hasattr(obj, '__module__'): v = dict(full_name=sixu("%s.%s") % (obj.__module__, obj.__name__)) else: v = dict(full_name=obj.__name__) lines += [sixu(''), sixu('.. 
htmlonly::'), sixu('')] lines += [sixu(' %s') % x for x in (app.config.numpydoc_edit_link % v).split("\n")] # call function to replace reference numbers so that there are no # duplicates rename_references(app, what, name, obj, options, lines) lines += ['..', DEDUPLICATION_TAG] def mangle_signature(app, what, name, obj, options, sig, retann): # Do not try to inspect classes that don't define `__init__` if (inspect.isclass(obj) and (not hasattr(obj, '__init__') or 'initializes x; see ' in pydoc.getdoc(obj.__init__))): return '', '' if not (isinstance(obj, collections.Callable) or hasattr(obj, '__argspec_is_invalid_')): return if not hasattr(obj, '__doc__'): return doc = SphinxDocString(pydoc.getdoc(obj)) sig = doc['Signature'] or getattr(obj, '__text_signature__', None) if sig: sig = re.sub(sixu("^[^(]*"), sixu(""), sig) return sig, sixu('') def setup(app, get_doc_object_=get_doc_object): if not hasattr(app, 'add_config_value'): return # probably called by nose, better bail out global get_doc_object get_doc_object = get_doc_object_ app.connect('autodoc-process-docstring', mangle_docstrings) app.connect('autodoc-process-signature', mangle_signature) app.connect('doctree-read', relabel_references) app.add_config_value('numpydoc_edit_link', None, False) app.add_config_value('numpydoc_use_plots', None, False) app.add_config_value('numpydoc_use_blockquotes', None, False) app.add_config_value('numpydoc_show_class_members', True, True) app.add_config_value('numpydoc_show_inherited_class_members', True, True) app.add_config_value('numpydoc_class_members_toctree', True, True) app.add_config_value('numpydoc_citation_re', '[a-z0-9_.-]+', True) # Extra mangling domains app.add_domain(NumpyPythonDomain) app.add_domain(NumpyCDomain) app.setup_extension('sphinx.ext.autosummary') metadata = {'version': __version__, 'parallel_read_safe': True} return metadata # ------------------------------------------------------------------------------ # Docstring-mangling domains # ------------------------------------------------------------------------------ from docutils.statemachine import ViewList from sphinx.domains.c import CDomain from sphinx.domains.python import PythonDomain class ManglingDomainBase(object): directive_mangling_map = {} def __init__(self, *a, **kw): super(ManglingDomainBase, self).__init__(*a, **kw) self.wrap_mangling_directives() def wrap_mangling_directives(self): for name, objtype in list(self.directive_mangling_map.items()): self.directives[name] = wrap_mangling_directive( self.directives[name], objtype) class NumpyPythonDomain(ManglingDomainBase, PythonDomain): name = 'np' directive_mangling_map = { 'function': 'function', 'class': 'class', 'exception': 'class', 'method': 'function', 'classmethod': 'function', 'staticmethod': 'function', 'attribute': 'attribute', } indices = [] class NumpyCDomain(ManglingDomainBase, CDomain): name = 'np-c' directive_mangling_map = { 'function': 'function', 'member': 'attribute', 'macro': 'function', 'type': 'class', 'var': 'object', } def match_items(lines, content_old): """Create items for mangled lines. This function tries to match the lines in ``lines`` with the items (source file references and line numbers) in ``content_old``. The ``mangle_docstrings`` function changes the actual docstrings, but doesn't keep track of where each line came from. The manging does many operations on the original lines, which are hard to track afterwards. Many of the line changes come from deleting or inserting blank lines. 
This function tries to match lines by ignoring blank lines. All other changes (such as inserting figures or changes in the references) are completely ignored, so the generated line numbers will be off if ``mangle_docstrings`` does anything non-trivial. This is a best-effort function and the real fix would be to make ``mangle_docstrings`` actually keep track of the ``items`` together with the ``lines``. Examples -------- >>> lines = ['', 'A', '', 'B', ' ', '', 'C', 'D'] >>> lines_old = ['a', '', '', 'b', '', 'c'] >>> items_old = [('file1.py', 0), ('file1.py', 1), ('file1.py', 2), ... ('file2.py', 0), ('file2.py', 1), ('file2.py', 2)] >>> content_old = ViewList(lines_old, items=items_old) >>> match_items(lines, content_old) # doctest: +NORMALIZE_WHITESPACE [('file1.py', 0), ('file1.py', 0), ('file2.py', 0), ('file2.py', 0), ('file2.py', 2), ('file2.py', 2), ('file2.py', 2), ('file2.py', 2)] >>> # first 2 ``lines`` are matched to 'a', second 2 to 'b', rest to 'c' >>> # actual content is completely ignored. Notes ----- The algorithm tries to match any line in ``lines`` with one in ``lines_old``. It skips over all empty lines in ``lines_old`` and assigns this line number to all lines in ``lines``, unless a non-empty line is found in ``lines`` in which case it goes to the next line in ``lines_old``. """ items_new = [] lines_old = content_old.data items_old = content_old.items j = 0 for i, line in enumerate(lines): # go to next non-empty line in old: # line.strip() checks whether the string is all whitespace while j < len(lines_old) - 1 and not lines_old[j].strip(): j += 1 items_new.append(items_old[j]) if line.strip() and j < len(lines_old) - 1: j += 1 assert(len(items_new) == len(lines)) return items_new def wrap_mangling_directive(base_directive, objtype): class directive(base_directive): def run(self): env = self.state.document.settings.env name = None if self.arguments: m = re.match(r'^(.*\s+)?(.*?)(\(.*)?', self.arguments[0]) name = m.group(2).strip() if not name: name = self.arguments[0] lines = list(self.content) mangle_docstrings(env.app, objtype, name, None, None, lines) if self.content: items = match_items(lines, self.content) self.content = ViewList(lines, items=items, parent=self.content.parent) return base_directive.run(self) return directive pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/docscrape.py0000644000076500000240000004540613434074306027245 0ustar weaverstaff00000000000000"""Extract reference documentation from the NumPy source tree. """ from __future__ import division, absolute_import, print_function import inspect import textwrap import re import pydoc from warnings import warn import collections import copy import sys def strip_blank_lines(l): "Remove leading and trailing blank lines from a list of lines" while l and not l[0].strip(): del l[0] while l and not l[-1].strip(): del l[-1] return l class Reader(object): """A line-based string reader. """ def __init__(self, data): """ Parameters ---------- data : str String with lines separated by '\n'. 
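        Examples
        --------
        A minimal sketch of driving the reader by hand; it relies only on
        the methods defined on this class (``read`` consumes one line at a
        time, ``read_to_next_empty_line`` gathers lines up to the next
        blank line):

        >>> r = Reader(['a', 'b', '', 'c'])
        >>> r.read()
        'a'
        >>> r.read_to_next_empty_line()
        ['b']
        >>> r.eof()
        False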
""" if isinstance(data, list): self._str = data else: self._str = data.split('\n') # store string as list of lines self.reset() def __getitem__(self, n): return self._str[n] def reset(self): self._l = 0 # current line nr def read(self): if not self.eof(): out = self[self._l] self._l += 1 return out else: return '' def seek_next_non_empty_line(self): for l in self[self._l:]: if l.strip(): break else: self._l += 1 def eof(self): return self._l >= len(self._str) def read_to_condition(self, condition_func): start = self._l for line in self[start:]: if condition_func(line): return self[start:self._l] self._l += 1 if self.eof(): return self[start:self._l+1] return [] def read_to_next_empty_line(self): self.seek_next_non_empty_line() def is_empty(line): return not line.strip() return self.read_to_condition(is_empty) def read_to_next_unindented_line(self): def is_unindented(line): return (line.strip() and (len(line.lstrip()) == len(line))) return self.read_to_condition(is_unindented) def peek(self, n=0): if self._l + n < len(self._str): return self[self._l + n] else: return '' def is_empty(self): return not ''.join(self._str).strip() class ParseError(Exception): def __str__(self): message = self.args[0] if hasattr(self, 'docstring'): message = "%s in %r" % (message, self.docstring) return message class NumpyDocString(collections.Mapping): """Parses a numpydoc string to an abstract representation Instances define a mapping from section title to structured data. """ sections = { 'Signature': '', 'Summary': [''], 'Extended Summary': [], 'Parameters': [], 'Returns': [], 'Yields': [], 'Raises': [], 'Warns': [], 'Other Parameters': [], 'Attributes': [], 'Methods': [], 'See Also': [], 'Notes': [], 'Warnings': [], 'References': '', 'Examples': '', 'index': {} } def __init__(self, docstring, config={}): orig_docstring = docstring docstring = textwrap.dedent(docstring).split('\n') self._doc = Reader(docstring) self._parsed_data = copy.deepcopy(self.sections) try: self._parse() except ParseError as e: e.docstring = orig_docstring raise def __getitem__(self, key): return self._parsed_data[key] def __setitem__(self, key, val): if key not in self._parsed_data: self._error_location("Unknown section %s" % key, error=False) else: self._parsed_data[key] = val def __iter__(self): return iter(self._parsed_data) def __len__(self): return len(self._parsed_data) def _is_at_section(self): self._doc.seek_next_non_empty_line() if self._doc.eof(): return False l1 = self._doc.peek().strip() # e.g. Parameters if l1.startswith('.. 
index::'):             return True          l2 = self._doc.peek(1).strip()  # ---------- or ==========         return l2.startswith('-'*len(l1)) or l2.startswith('='*len(l1))      def _strip(self, doc):         i = 0         j = 0         for i, line in enumerate(doc):             if line.strip():                 break          for j, line in enumerate(doc[::-1]):             if line.strip():                 break          return doc[i:len(doc)-j]      def _read_to_next_section(self):         section = self._doc.read_to_next_empty_line()          while not self._is_at_section() and not self._doc.eof():             if not self._doc.peek(-1).strip():   # previous line was empty                 section += ['']              section += self._doc.read_to_next_empty_line()          return section      def _read_sections(self):         while not self._doc.eof():             data = self._read_to_next_section()             name = data[0].strip()              if name.startswith('..'):   # index section                 yield name, data[1:]             elif len(data) < 2:                 yield StopIteration             else:                 yield name, self._strip(data[2:])      def _parse_param_list(self, content):         r = Reader(content)         params = []         while not r.eof():             header = r.read().strip()             if ' : ' in header:                 arg_name, arg_type = header.split(' : ')[:2]             else:                 arg_name, arg_type = header, ''              desc = r.read_to_next_unindented_line()             desc = dedent_lines(desc)             desc = strip_blank_lines(desc)              params.append((arg_name, arg_type, desc))          return params      _name_rgx = re.compile(r"^\s*(:(?P<role>\w+):"                            r"`(?P<name>(?:~\w+\.)?[a-zA-Z0-9_.-]+)`|"                            r" (?P<name2>[a-zA-Z0-9_.-]+))\s*", re.X)      def _parse_see_also(self, content):         """         func_name : Descriptive text             continued text         another_func_name : Descriptive text             func_name1, func_name2, :meth:`func_name`, func_name3          """         items = []          def parse_item_name(text):             """Match ':role:`name`' or 'name'"""             m = self._name_rgx.match(text)             if m:                 g = m.groups()                 if g[1] is None:                     return g[3], None                 else:                     return g[2], g[1]             raise ParseError("%s is not an item name" % text)          def push_item(name, rest):             if not name:                 return             name, role = parse_item_name(name)             items.append((name, list(rest), role))             del rest[:]          current_func = None         rest = []          for line in content:             if not line.strip():                 continue              m = self._name_rgx.match(line)             if m and line[m.end():].strip().startswith(':'):                 push_item(current_func, rest)                 current_func, line = line[:m.end()], line[m.end():]                 rest = [line.split(':', 1)[1].strip()]                 if not rest[0]:                     rest = []             elif not line.startswith(' '):                 push_item(current_func, rest)                 current_func = None                 if ',' in line:                     for func in line.split(','):                         if func.strip():                             push_item(func, [])                 elif line.strip():                     current_func = line             elif current_func is not None:                 rest.append(line.strip())          push_item(current_func, rest)         return items      def _parse_index(self, section, content):         """         ..
index: default :refguide: something, else, and more """ def strip_each_in(lst): return [s.strip() for s in lst] out = {} section = section.split('::') if len(section) > 1: out['default'] = strip_each_in(section[1].split(','))[0] for line in content: line = line.split(':') if len(line) > 2: out[line[1]] = strip_each_in(line[2].split(',')) return out def _parse_summary(self): """Grab signature (if given) and summary""" if self._is_at_section(): return # If several signatures present, take the last one while True: summary = self._doc.read_to_next_empty_line() summary_str = " ".join([s.strip() for s in summary]).strip() if re.compile('^([\w., ]+=)?\s*[\w\.]+\(.*\)$').match(summary_str): self['Signature'] = summary_str if not self._is_at_section(): continue break if summary is not None: self['Summary'] = summary if not self._is_at_section(): self['Extended Summary'] = self._read_to_next_section() def _parse(self): self._doc.reset() self._parse_summary() sections = list(self._read_sections()) section_names = set([section for section, content in sections]) has_returns = 'Returns' in section_names has_yields = 'Yields' in section_names # We could do more tests, but we are not. Arbitrarily. if has_returns and has_yields: msg = 'Docstring contains both a Returns and Yields section.' raise ValueError(msg) for (section, content) in sections: if not section.startswith('..'): section = (s.capitalize() for s in section.split(' ')) section = ' '.join(section) if self.get(section): self._error_location("The section %s appears twice" % section) if section in ('Parameters', 'Returns', 'Yields', 'Raises', 'Warns', 'Other Parameters', 'Attributes', 'Methods'): self[section] = self._parse_param_list(content) elif section.startswith('.. index::'): self['index'] = self._parse_index(section, content) elif section == 'See Also': self['See Also'] = self._parse_see_also(content) else: self[section] = content def _error_location(self, msg, error=True): if hasattr(self, '_obj'): # we know where the docs came from: try: filename = inspect.getsourcefile(self._obj) except TypeError: filename = None msg = msg + (" in the docstring of %s in %s." 
% (self._obj, filename)) if error: raise ValueError(msg) else: warn(msg) # string conversion routines def _str_header(self, name, symbol='-'): return [name, len(name)*symbol] def _str_indent(self, doc, indent=4): out = [] for line in doc: out += [' '*indent + line] return out def _str_signature(self): if self['Signature']: return [self['Signature'].replace('*', '\*')] + [''] else: return [''] def _str_summary(self): if self['Summary']: return self['Summary'] + [''] else: return [] def _str_extended_summary(self): if self['Extended Summary']: return self['Extended Summary'] + [''] else: return [] def _str_param_list(self, name): out = [] if self[name]: out += self._str_header(name) for param, param_type, desc in self[name]: if param_type: out += ['%s : %s' % (param, param_type)] else: out += [param] if desc and ''.join(desc).strip(): out += self._str_indent(desc) out += [''] return out def _str_section(self, name): out = [] if self[name]: out += self._str_header(name) out += self[name] out += [''] return out def _str_see_also(self, func_role): if not self['See Also']: return [] out = [] out += self._str_header("See Also") last_had_desc = True for func, desc, role in self['See Also']: if role: link = ':%s:`%s`' % (role, func) elif func_role: link = ':%s:`%s`' % (func_role, func) else: link = "`%s`_" % func if desc or last_had_desc: out += [''] out += [link] else: out[-1] += ", %s" % link if desc: out += self._str_indent([' '.join(desc)]) last_had_desc = True else: last_had_desc = False out += [''] return out def _str_index(self): idx = self['index'] out = [] out += ['.. index:: %s' % idx.get('default', '')] for section, references in idx.items(): if section == 'default': continue out += [' :%s: %s' % (section, ', '.join(references))] return out def __str__(self, func_role=''): out = [] out += self._str_signature() out += self._str_summary() out += self._str_extended_summary() for param_list in ('Parameters', 'Returns', 'Yields', 'Other Parameters', 'Raises', 'Warns'): out += self._str_param_list(param_list) out += self._str_section('Warnings') out += self._str_see_also(func_role) for s in ('Notes', 'References', 'Examples'): out += self._str_section(s) for param_list in ('Attributes', 'Methods'): out += self._str_param_list(param_list) out += self._str_index() return '\n'.join(out) def indent(str, indent=4): indent_str = ' '*indent if str is None: return indent_str lines = str.split('\n') return '\n'.join(indent_str + l for l in lines) def dedent_lines(lines): """Deindent a list of lines maximally""" return textwrap.dedent("\n".join(lines)).split("\n") def header(text, style='-'): return text + '\n' + style*len(text) + '\n' class FunctionDoc(NumpyDocString): def __init__(self, func, role='func', doc=None, config={}): self._f = func self._role = role # e.g. 
"func" or "meth" if doc is None: if func is None: raise ValueError("No function or docstring given") doc = inspect.getdoc(func) or '' NumpyDocString.__init__(self, doc) if not self['Signature'] and func is not None: func, func_name = self.get_func() try: try: signature = str(inspect.signature(func)) except (AttributeError, ValueError): # try to read signature, backward compat for older Python if sys.version_info[0] >= 3: argspec = inspect.getfullargspec(func) else: argspec = inspect.getargspec(func) signature = inspect.formatargspec(*argspec) signature = '%s%s' % (func_name, signature.replace('*', '\*')) except TypeError: signature = '%s()' % func_name self['Signature'] = signature def get_func(self): func_name = getattr(self._f, '__name__', self.__class__.__name__) if inspect.isclass(self._f): func = getattr(self._f, '__call__', self._f.__init__) else: func = self._f return func, func_name def __str__(self): out = '' func, func_name = self.get_func() signature = self['Signature'].replace('*', '\*') roles = {'func': 'function', 'meth': 'method'} if self._role: if self._role not in roles: print("Warning: invalid role %s" % self._role) out += '.. %s:: %s\n \n\n' % (roles.get(self._role, ''), func_name) out += super(FunctionDoc, self).__str__(func_role=self._role) return out class ClassDoc(NumpyDocString): extra_public_methods = ['__call__'] def __init__(self, cls, doc=None, modulename='', func_doc=FunctionDoc, config={}): if not inspect.isclass(cls) and cls is not None: raise ValueError("Expected a class or None, but got %r" % cls) self._cls = cls self.show_inherited_members = config.get( 'show_inherited_class_members', True) if modulename and not modulename.endswith('.'): modulename += '.' self._mod = modulename if doc is None: if cls is None: raise ValueError("No class or documentation string given") doc = pydoc.getdoc(cls) NumpyDocString.__init__(self, doc) if config.get('show_class_members', True): def splitlines_x(s): if not s: return [] else: return s.splitlines() for field, items in [('Methods', self.methods), ('Attributes', self.properties)]: if not self[field]: doc_list = [] for name in sorted(items): try: doc_item = pydoc.getdoc(getattr(self._cls, name)) doc_list.append((name, '', splitlines_x(doc_item))) except AttributeError: pass # method doesn't exist self[field] = doc_list @property def methods(self): if self._cls is None: return [] return [name for name, func in inspect.getmembers(self._cls) if ((not name.startswith('_') or name in self.extra_public_methods) and isinstance(func, collections.Callable) and self._is_show_member(name))] @property def properties(self): if self._cls is None: return [] return [name for name, func in inspect.getmembers(self._cls) if (not name.startswith('_') and (func is None or isinstance(func, property) or inspect.isdatadescriptor(func)) and self._is_show_member(name))] def _is_show_member(self, name): if self.show_inherited_members: return True # show all class members if name not in self._cls.__dict__: return False # class member is inherited, we do not show it return True pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/templates/0000755000076500000240000000000013434104632026711 5ustar weaverstaff00000000000000pydl-0.7.0/astropy_helpers/astropy_helpers/extern/numpydoc/templates/numpydoc_docstring.rst0000644000076500000240000000032613434074306033362 0ustar weaverstaff00000000000000{{index}} {{summary}} {{extended_summary}} {{parameters}} {{returns}} {{yields}} {{other_parameters}} {{raises}} {{warns}} {{warnings}} {{see_also}} {{notes}} 
{{references}} {{examples}} {{attributes}} {{methods}} pydl-0.7.0/astropy_helpers/astropy_helpers/extern/setup_package.py0000644000076500000240000000027513434074306026252 0ustar weaverstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst def get_package_data(): return {'astropy_helpers.extern': ['automodapi/templates/*/*.rst', 'numpydoc/templates/*.rst']} pydl-0.7.0/astropy_helpers/README.rst0000644000076500000240000000503313434074306020021 0ustar weaverstaff00000000000000astropy-helpers =============== * Stable versions: https://pypi.org/project/astropy-helpers/ * Development version, issue tracker: https://github.com/astropy/astropy-helpers This project provides a Python package, ``astropy_helpers``, which includes many build, installation, and documentation-related tools used by the Astropy project, but packaged separately for use by other projects that wish to leverage this work. The motivation behind this package and details of its implementation are in the accepted `Astropy Proposal for Enhancement (APE) 4 `_. The ``astropy_helpers.extern`` sub-module includes modules developed elsewhere that are bundled here for convenience. At the moment, this consists of the following two sphinx extensions: * `numpydoc `_, a Sphinx extension developed as part of the Numpy project. This is used to parse docstrings in Numpy format * `sphinx-automodapi `_, a Sphinx extension developed as part of the Astropy project. This used to be developed directly in ``astropy-helpers`` but is now a standalone package. Issues with these sub-modules should be reported in their respective repositories, and we will regularly update the bundled versions to reflect the latest released versions. ``astropy_helpers`` includes a special "bootstrap" module called ``ah_bootstrap.py`` which is intended to be used by a project's setup.py in order to ensure that the ``astropy_helpers`` package is available for build/installation. This is similar to the ``ez_setup.py`` module that is shipped with some projects to bootstrap `setuptools `_. As described in APE4, the version numbers for ``astropy_helpers`` follow the corresponding major/minor version of the `astropy core package `_, but with an independent sequence of micro (bugfix) version numbers. Hence, the initial release is 0.4, in parallel with Astropy v0.4, which will be the first version of Astropy to use ``astropy-helpers``. For examples of how to implement ``astropy-helpers`` in a project, see the ``setup.py`` and ``setup.cfg`` files of the `Affiliated package template `_. .. image:: https://travis-ci.org/astropy/astropy-helpers.svg :target: https://travis-ci.org/astropy/astropy-helpers .. image:: https://coveralls.io/repos/astropy/astropy-helpers/badge.svg :target: https://coveralls.io/r/astropy/astropy-helpers pydl-0.7.0/astropy_helpers/CHANGES.rst0000644000076500000240000004227613434074306020146 0ustar weaverstaff00000000000000astropy-helpers Changelog ************************* 2.0.8 (2018-12-04) ------------------ - Fixed compatibility with Sphinx 1.8+. [#428] - Fixed error that occurs when installing a package in an environment where ``numpy`` is not already installed. [#404] - Updated bundled version of sphinx-automodapi to v0.9. [#422] - Updated bundled version of numpydoc to v0.8.0. [#423] 2.0.7 (2018-06-01) ------------------ - Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384] 2.0.6 (2018-02-24) ------------------ - Avoid deprecation warning due to ``exclude=`` keyword in ``setup.py``. 
[#379] 2.0.5 (2018-02-22) ------------------ - Fix segmentation faults that occurred when the astropy-helpers submodule was first initialized in packages that also contained Cython code. [#375] 2.0.4 (2018-02-09) ------------------ - Support dotted package names as namespace packages in generate_version_py. [#370] - Fix compatibility with setuptools 36.x and above. [#372] - Fix false negative in add_openmp_flags_if_available when measuring code coverage with gcc. [#374] 2.0.3 (2018-01-20) ------------------ - Make sure that astropy-helpers 3.x.x is not downloaded on Python 2. [#363] - The bundled version of sphinx-automodapi has been updated to v0.7. [#365] - Add --auto-use and --no-auto-use command-line flags to match the ``auto_use`` configuration option, and add an alias ``--use-system-astropy-helpers`` for ``--no-auto-use``. [#366] 2.0.2 (2017-10-13) ------------------ - Added new helper function add_openmp_flags_if_available that can add OpenMP compilation flags to a C/Cython extension if needed. [#346] - Update numpydoc to v0.7. [#343] - The function ``get_git_devstr`` now returns ``'0'`` instead of ``None`` when no git repository is present. This allows generation of development version strings that are in a format that ``setuptools`` expects (e.g. "1.1.3.dev0" instead of "1.1.3.dev"). [#330] - It is now possible to override generated timestamps to make builds reproducible by setting the ``SOURCE_DATE_EPOCH`` environment variable [#341] - Mark Sphinx extensions as parallel-safe. [#344] - Switch to using mathjax instead of imgmath for local builds. [#342] - Deprecate ``exclude`` parameter of various functions in setup_helpers since it could not work as intended. Add new function ``add_exclude_packages`` to provide intended behavior. [#331] - Allow custom Sphinx doctest extension to recognize and process standard doctest directives ``testsetup`` and ``doctest``. [#335] 2.0.1 (2017-07-28) ------------------ - Fix compatibility with Sphinx <1.5. [#326] 2.0 (2017-07-06) ---------------- - Add support for package that lies in a subdirectory. [#249] - Removing ``compat.subprocess``. [#298] - Python 3.3 is no longer supported. [#300] - The 'automodapi' Sphinx extension (and associated dependencies) has now been moved to a standalone package which can be found at https://github.com/astropy/sphinx-automodapi - this is now bundled in astropy-helpers under astropy_helpers.extern.automodapi for convenience. Version shipped with astropy-helpers is v0.6. [#278, #303, #309, #323] - The ``numpydoc`` Sphinx extension has now been moved to ``astropy_helpers.extern``. [#278] - Fix ``build_docs`` error catching, so it doesn't hide Sphinx errors. [#292] - Fix compatibility with Sphinx 1.6. [#318] - Updating ez_setup.py to the last version before it's removal. [#321] 1.3.1 (2017-03-18) ------------------ - Fixed the missing button to hide output in documentation code blocks. [#287] - Fixed bug when ``build_docs`` when running with the clean (-l) option. [#289] - Add alternative location for various intersphinx inventories to fall back to. [#293] 1.3 (2016-12-16) ---------------- - ``build_sphinx`` has been deprecated in favor of the ``build_docs`` command. [#246] - Force the use of Cython's old ``build_ext`` command. A new ``build_ext`` command was added in Cython 0.25, but it does not work with astropy-helpers currently. [#261] 1.2 (2016-06-18) ---------------- - Added sphinx configuration value ``automodsumm_inherited_members``. 
If ``True`` this will include members that are inherited from a base class in the generated API docs. Defaults to ``False`` which matches the previous behavior. [#215] - Fixed ``build_sphinx`` to recognize builds that succeeded but have output *after* the "build succeeded." statement. This only applies when ``--warnings-returncode`` is given (which is primarily relevant for Travis documentation builds). [#223] - Fixed ``build_sphinx`` the sphinx extensions to not output a spurious warning for sphinx versions > 1.4. [#229] - Add Python version dependent local sphinx inventories that contain otherwise missing references. [#216] - ``astropy_helpers`` now require Sphinx 1.3 or later. [#226] 1.1.2 (2016-03-9) ----------------- - The CSS for the sphinx documentation was altered to prevent some text overflow problems. [#217] 1.1.1 (2015-12-23) ------------------ - Fixed crash in build with ``AttributeError: cython_create_listing`` with older versions of setuptools. [#209, #210] 1.1 (2015-12-10) ---------------- - The original ``AstropyTest`` class in ``astropy_helpers``, which implements the ``setup.py test`` command, is deprecated in favor of moving the implementation of that command closer to the actual Astropy test runner in ``astropy.tests``. Now a dummy ``test`` command is provided solely for informing users that they need ``astropy`` installed to run the tests (however, the previous, now deprecated implementation is still provided and continues to work with older versions of Astropy). See the related issue for more details. [#184] - Added a useful new utility function to ``astropy_helpers.utils`` called ``find_data_files``. This is similar to the ``find_packages`` function in setuptools in that it can be used to search a package for data files (matching a pattern) that can be passed to the ``package_data`` argument for ``setup()``. See the docstring to ``astropy_helpers.utils.find_data_files`` for more details. [#42] - The ``astropy_helpers`` module now sets the global ``_ASTROPY_SETUP_`` flag upon import (from within a ``setup.py``) script, so it's not necessary to have this in the ``setup.py`` script explicitly. If in doubt though, there's no harm in setting it twice. Putting it in ``astropy_helpers`` just ensures that any other imports that occur during build will have this flag set. [#191] - It is now possible to use Cython as a ``setup_requires`` build requirement, and still build Cython extensions even if Cython wasn't available at the beginning of the build processes (that is, is automatically downloaded via setuptools' processing of ``setup_requires``). [#185] - Moves the ``adjust_compiler`` check into the ``build_ext`` command itself, so it's only used when actually building extension modules. This also deprecates the stand-alone ``adjust_compiler`` function. [#76] - When running the ``build_sphinx`` / ``build_docs`` command with the ``-w`` option, the output from Sphinx is streamed as it runs instead of silently buffering until the doc build is complete. [#197] 1.0.7 (unreleased) ------------------ - Fix missing import in ``astropy_helpers/utils.py``. [#196] 1.0.6 (2015-12-04) ------------------ - Fixed bug where running ``./setup.py build_sphinx`` could return successfully even when the build was not successful (and should have returned a non-zero error code). [#199] 1.0.5 (2015-10-02) ------------------ - Fixed a regression in the ``./setup.py test`` command that was introduced in v1.0.4. 
1.0.4 (2015-10-02) ------------------  - Fixed issue with the sphinx documentation css where the line numbers   for code blocks were not aligned with the code. [#179, #180]  - Fixed crash that could occur when trying to build Cython extension modules   when Cython isn't installed. Normally this still results in a failed build,   but was supposed to provide a useful error message rather than crash   outright (this was a regression introduced in v1.0.3). [#181]  - Fixed a crash that could occur on Python 3 when a working C compiler   isn't found. [#182]  - Quieted warnings about deprecated Numpy API in Cython extensions,   when building Cython extensions against Numpy >= 1.7. [#183, #186]  - Improved support for py.test >= 2.7--running the ``./setup.py test``   command now copies all doc pages into the temporary test directory as   well, so that all test files have a "common root directory". [#189, #190]  1.0.3 (2015-07-22) ------------------  - Added workaround for sphinx-doc/sphinx#1843, a bug in Sphinx which   prevented descriptor classes with a custom metaclass from being documented   correctly. [#158]  - Added an alias for the ``./setup.py build_sphinx`` command as   ``./setup.py build_docs`` which, to a new contributor, should hopefully be   less cryptic. [#161]  - The fonts in graphviz diagrams now match the font of the HTML content. [#169]  - When the documentation is built on readthedocs.org, MathJax will be   used for math rendering.  When built elsewhere, the "pngmath"   extension is still used for math rendering. [#170]  - Fix crash when importing astropy_helpers when running with ``python -OO`` [#171]  - The ``build`` and ``build_ext`` stages now correctly recognize the presence   of C++ files in Cython extensions (previously only vanilla C worked). [#173]  1.0.2 (2015-04-02) ------------------  - Various fixes enabling the astropy-helpers Sphinx build command and   Sphinx extensions to work with Sphinx 1.3. [#148]  - More improvements to the ability to handle multiple versions of   astropy-helpers being imported in the same Python interpreter session   in the (somewhat rare) case of nested installs. [#147]  - To better support high resolution displays, use SVG for the astropy   logo and linkout image, falling back to PNGs for browsers that   support it. [#150, #151]  - Improve ``setup_helpers.get_compiler_version`` to work with more compilers,   and to return more info.  This will help fix builds of Astropy on less   common compilers, like Sun C. [#153]  1.0.1 (2015-03-04) ------------------  - Released in concert with v0.4.8 to address the same issues.  0.4.8 (2015-03-04) ------------------  - Improved the ``ah_bootstrap`` script's ability to override existing   installations of astropy-helpers with new versions in the context of   installing multiple packages simultaneously within the same Python   interpreter (e.g. when one package has in its ``setup_requires`` another   package that uses a different version of astropy-helpers). [#144]  - Added a workaround to an issue in matplotlib that can, in rare cases, lead   to a crash when installing packages that import matplotlib at build time.   [#144]  1.0 (2015-02-17) ----------------  - Added new pre-/post-command hook points for ``setup.py`` commands.  Now   any package can define code to run before and/or after any ``setup.py``   command without having to manually subclass that command by adding   ``pre_<command_name>_hook`` and ``post_<command_name>_hook`` callables to   the package's ``setup_package.py`` module.  See the PR for more details.
[#112] - The following objects in the ``astropy_helpers.setup_helpers`` module have been relocated: - ``get_dummy_distribution``, ``get_distutils_*``, ``get_compiler_option``, ``add_command_option``, ``is_distutils_display_option`` -> ``astropy_helpers.distutils_helpers`` - ``should_build_with_cython``, ``generate_build_ext_command`` -> ``astropy_helpers.commands.build_ext`` - ``AstropyBuildPy`` -> ``astropy_helpers.commands.build_py`` - ``AstropyBuildSphinx`` -> ``astropy_helpers.commands.build_sphinx`` - ``AstropyInstall`` -> ``astropy_helpers.commands.install`` - ``AstropyInstallLib`` -> ``astropy_helpers.commands.install_lib`` - ``AstropyRegister`` -> ``astropy_helpers.commands.register`` - ``get_pkg_version_module`` -> ``astropy_helpers.version_helpers`` - ``write_if_different``, ``import_file``, ``get_numpy_include_path`` -> ``astropy_helpers.utils`` All of these are "soft" deprecations in the sense that they are still importable from ``astropy_helpers.setup_helpers`` for now, and there is no (easy) way to produce deprecation warnings when importing these objects from ``setup_helpers`` rather than directly from the modules they are defined in. But please consider updating any imports to these objects. [#110] - Use of the ``astropy.sphinx.ext.astropyautosummary`` extension is deprecated for use with Sphinx < 1.2. Instead it should suffice to remove this extension for the ``extensions`` list in your ``conf.py`` and add the stock ``sphinx.ext.autosummary`` instead. [#131] 0.4.7 (2015-02-17) ------------------ - Fixed incorrect/missing git hash being added to the generated ``version.py`` when creating a release. [#141] 0.4.6 (2015-02-16) ------------------ - Fixed problems related to the automatically generated _compiler module not being created properly. [#139] 0.4.5 (2015-02-11) ------------------ - Fixed an issue where ah_bootstrap.py could blow up when astropy_helper's version number is 1.0. - Added a workaround for documentation of properties in the rare case where the class's metaclass has a property of the same name. [#130] - Fixed an issue on Python 3 where importing a package using astropy-helper's generated version.py module would crash when the current working directory is an empty git repository. [#114, #137] - Fixed an issue where the "revision count" appended to .dev versions by the generated version.py did not accurately reflect the revision count for the package it belongs to, and could be invalid if the current working directory is an unrelated git repository. [#107, #137] - Likewise, fixed a confusing warning message that could occur in the same circumstances as the above issue. [#121, #137] 0.4.4 (2014-12-31) ------------------ - More improvements for building the documentation using Python 3.x. [#100] - Additional minor fixes to Python 3 support. [#115] - Updates to support new test features in Astropy [#92, #106] 0.4.3 (2014-10-22) ------------------ - The generated ``version.py`` file now preserves the git hash of installed copies of the package as well as when building a source distribution. That is, the git hash of the changeset that was installed/released is preserved. [#87] - In smart resolver add resolution for class links when they exist in the intersphinx inventory, but not the mapping of the current package (e.g. 
when an affiliated package uses an astropy core class of which "actual" and "documented" location differs) [#88] - Fixed a bug that could occur when running ``setup.py`` for the first time in a repository that uses astropy-helpers as a submodule: ``AttributeError: 'NoneType' object has no attribute 'mkdtemp'`` [#89] - Fixed a bug where optional arguments to the ``doctest-skip`` Sphinx directive were sometimes being left in the generated documentation output. [#90] - Improved support for building the documentation using Python 3.x. [#96] - Avoid error message if .git directory is not present. [#91] 0.4.2 (2014-08-09) ------------------ - Fixed some CSS issues in generated API docs. [#69] - Fixed the warning message that could be displayed when generating a version number with some older versions of git. [#77] - Fixed automodsumm to work with new versions of Sphinx (>= 1.2.2). [#80] 0.4.1 (2014-08-08) ------------------ - Fixed git revision count on systems with git versions older than v1.7.2. [#70] - Fixed display of warning text when running a git command fails (previously the output of stderr was not being decoded properly). [#70] - The ``--offline`` flag to ``setup.py`` understood by ``ah_bootstrap.py`` now also prevents git from going online to fetch submodule updates. [#67] - The Sphinx extension for converting issue numbers to links in the changelog now supports working on arbitrary pages via a new ``conf.py`` setting: ``changelog_links_docpattern``. By default it affects the ``changelog`` and ``whatsnew`` pages in one's Sphinx docs. [#61] - Fixed crash that could result from users with missing/misconfigured locale settings. [#58] - The font used for code examples in the docs is now the system-defined ``monospace`` font, rather than ``Minaco``, which is not available on all platforms. [#50] 0.4 (2014-07-15) ---------------- - Initial release of astropy-helpers. See `APE4 `_ for details of the motivation and design of this package. - The ``astropy_helpers`` package replaces the following modules in the ``astropy`` package: - ``astropy.setup_helpers`` -> ``astropy_helpers.setup_helpers`` - ``astropy.version_helpers`` -> ``astropy_helpers.version_helpers`` - ``astropy.sphinx`` - > ``astropy_helpers.sphinx`` These modules should be considered deprecated in ``astropy``, and any new, non-critical changes to those modules will be made in ``astropy_helpers`` instead. Affiliated packages wishing to make use those modules (as in the Astropy package-template) should use the versions from ``astropy_helpers`` instead, and include the ``ah_bootstrap.py`` script in their project, for bootstrapping the ``astropy_helpers`` package in their setup.py script. pydl-0.7.0/README.rst0000644000076500000240000000634213434104050014570 0ustar weaverstaff00000000000000==== PyDL ==== .. image:: http://img.shields.io/badge/powered%20by-AstroPy-orange.svg?style=flat :target: http://www.astropy.org :alt: Powered by Astropy Badge .. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.1095151.svg :target: https://doi.org/10.5281/zenodo.1095151 :alt: DOI: 10.5281/zenodo.1095151 .. image:: https://img.shields.io/pypi/v/pydl.svg :target: https://pypi.python.org/pypi/pydl :alt: PyPI Badge Description ----------- This package consists of Python_ replacements for functions that are part of the `IDL®`_ built-in library or part of astronomical `IDL®`_ libraries. The emphasis is on reproducing results of the astronomical library functions. Only the bare minimum of `IDL®`_ built-in functions are implemented to support this. 
There are four astronomical libraries targeted: * idlutils_ : a general suite of tools heavily used by SDSS_. * `Goddard utilities`_ : The `IDL®`_ Astronomy User's Libary, maintained by Wayne Landsman and distributed with idlutils_. * idlspec2d_ : tools for working with SDSS_, BOSS_ and eBOSS_ spectroscopic data. * photoop_ : tools for working with SDSS_ imaging data. This package affiliated with the astropy_ project and is registered with PyPI_. Full Documentation ------------------ Please visit `PyDL on Read the Docs`_ .. image:: https://readthedocs.org/projects/pydl/badge/?version=latest :target: http://pydl.readthedocs.org/en/latest/ :alt: Documentation Status History ------- This package was initially developed on the SDSS-III_ `svn repository`_. It was moved to the new GitHub_ repository on 2013-03-06. The present location of the repository is http://github.com/weaverba137/pydl . Travis Build Status ------------------- .. image:: https://img.shields.io/travis/weaverba137/pydl.svg :target: https://travis-ci.org/weaverba137/pydl :alt: Travis Build Status Test Coverage Status -------------------- .. image:: https://coveralls.io/repos/weaverba137/pydl/badge.svg?branch=master&service=github :target: https://coveralls.io/github/weaverba137/pydl?branch=master :alt: Test Coverage Status License ------- .. image:: https://img.shields.io/pypi/l/pydl.svg :target: https://pypi.python.org/pypi/pydl :alt: License PyDL is free software licensed under a 3-clause BSD-style license. For details see the ``licenses/LICENSE.rst`` file. Legal ----- * IDL is a registered trademark of `Harris Geospatial Solutions`_. .. _Python: http://python.org .. _`IDL®`: http://www.harrisgeospatial.com/SoftwareTechnology/IDL.aspx .. _idlutils: https://www.sdss.org/dr14/software/idlutils/ .. _SDSS: https://www.sdss.org .. _`Goddard utilities`: http://idlastro.gsfc.nasa.gov/ .. _idlspec2d: https://svn.sdss.org/public/repo/eboss/idlspec2d/trunk/ .. _BOSS: https://www.sdss.org/surveys/boss/ .. _eBOSS: https://www.sdss.org/surveys/eboss/ .. _photoop: https://svn.sdss.org/public/repo/sdss/photoop/trunk/ .. _astropy: http://www.astropy.org .. _PyPI: https://pypi.python.org/pypi/pydl/ .. _`PyDL on Read the Docs`: https://pydl.readthedocs.io/en/latest/ .. _SDSS-III: http://www.sdss3.org .. _`svn repository`: https://www.sdss.org/dr14/software/products/ .. _GitHub: https://github.com .. _`Harris Geospatial Solutions`: http://www.harrisgeospatial.com/
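Quick Example
-------------

A minimal sketch of the intended usage pattern: call a PyDL function in
place of its IDL counterpart.  The top-level ``uniq`` function (a
replacement for the IDL ``UNIQ()`` function) is assumed here purely for
illustration; consult the full documentation for the definitive API::

    import numpy as np
    from pydl import uniq   # assumed import path for this sketch

    data = np.array([1, 1, 2, 2, 3])
    # As with IDL's UNIQ(), this should return the indices of the last
    # occurrence of each distinct value in an already-sorted array.
    indices = uniq(data)
    unique_values = data[indices]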