astropy-healpix-0.5/CHANGES.rst

.. _changes:

*******
Changes
*******

0.5 (2019-11-25)
================

- Update package infrastructure to use ``setup.cfg``. [#134]
- Make sure that Numpy is declared as a build-time dependency. [#134]
- Update astropy-helpers to v3.2.2. [#134]
- Update minimum required Python version to 3.6. [#125]
- Add ``HEALPix.from_header``. [#127]
- Clean up C code to avoid compilation warnings. [#118, #119, #120, #121, #122, #123]
- Fix unit tests on 32-bit architectures. [#117]
- Fix compatibility with Numpy 1.16 and later. [#116]

0.4 (2018-12-18)
================

- Healpix rangesearch cleanup [#113]
- Update astropy-helpers to v2.0.8 [#112]
- Rewrite core module in C to make ``healpix_to_lonlat`` and
  ``lonlat_to_healpix`` broadcastable over both pixel index and nside. [#110]

0.3.1 (2018-10-24)
==================

- Ensure .c files are included in tar file.

0.3 (2018-10-24)
================

- Remove OpenMP from astropy-healpix [#108]
- Fix bilinear interpolation of invalid values [#106]
- Add uniq to (level, ipix) and inverse function [#105]
- Compute z more stably; improve on z2dec [#101]
- Use more stable cos(Dec) term [#94]
- Fix get_interp_weights for phi=None case [#89]
- Add pix2vec, vec2pix, ang2vec [#73]
- Add ``pixel_resolution_to_nside`` function. [#31]

0.2 (2017-10-15)
================

- Expand benchmarks to include ang2pix, nest2ring and ring2nest. [#62]
- Use OpenMP to parallelize the Cython wrappers. [#59]
- Renamed the ``healpix_neighbours`` function to ``neighbours`` and added
  a wrapper to the high-level class. [#61]
- Fix bilinear interpolation which was being done incorrectly, and added
  a new ``bilinear_interpolation_weights`` function to get the
  interpolation weights. [#63]

0.1 (2017-10-01)
================

- Initial release

astropy-healpix-0.5/LICENSE.md

Copyright (c) 2016-2018, Astropy Developers

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

* Neither the name of the Astropy Team nor the names of its contributors may
  be used to endorse or promote products derived from this software without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
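The changelog entries refer throughout to HEALPix quantities such as ``nside``, pixel index, and pixel resolution (e.g. the ``pixel_resolution_to_nside`` entry). As a self-contained sketch of the basic relations between these quantities — the helper names here are illustrative and are not part of the astropy-healpix API:

```python
import math


def nside_to_npix(nside):
    """Total number of HEALPix pixels on the sphere for a given nside."""
    return 12 * nside ** 2


def pixel_resolution_arcmin(nside):
    """Approximate pixel resolution under one common convention: the
    square root of the per-pixel solid angle (4*pi/npix steradians),
    converted to arcminutes."""
    area_sr = 4.0 * math.pi / nside_to_npix(nside)
    return math.sqrt(area_sr) * math.degrees(1.0) * 60.0
```

For example, ``nside=64`` gives 49152 pixels and a resolution of roughly 55 arcminutes under this convention; doubling ``nside`` halves the resolution.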
astropy-healpix-0.5/PKG-INFO

Metadata-Version: 2.1
Name: astropy-healpix
Version: 0.5
Summary: BSD-licensed HEALPix for Astropy
Home-page: https://github.com/astropy/astropy-healpix
Author: Christoph Deil, Thomas Robitaille, and Dustin Lang
Author-email: astropy.team@gmail.com
License: BSD 3-Clause
Description: |RTD| |Travis Status| |AppVeyor status| |CircleCI| |Coverage Status|

        About
        -----

        This is a BSD-licensed HEALPix package developed by the Astropy
        project and based on C code written by Dustin Lang in
        `astrometry.net `__.

        See the `Documentation `__ for information about installing and
        using this package.

        .. |Travis Status| image:: https://travis-ci.org/astropy/astropy-healpix.svg
            :target: https://travis-ci.org/astropy/astropy-healpix?branch=master
        .. |AppVeyor status| image:: https://ci.appveyor.com/api/projects/status/5kxwb47o2umy370m/branch/master?svg=true
            :target: https://ci.appveyor.com/project/Astropy/astropy-healpix/branch/master
        .. |CircleCI| image:: https://circleci.com/gh/astropy/astropy-healpix.svg?style=svg
            :target: https://circleci.com/gh/astropy/astropy-healpix
        .. |Coverage Status| image:: https://coveralls.io/repos/astropy/astropy-healpix/badge.svg
            :target: https://coveralls.io/r/astropy/astropy-healpix
        .. |RTD| image:: https://readthedocs.org/projects/astropy-healpix/badge/?version=latest
            :target: http://astropy-healpix.readthedocs.io/en/latest/?badge=latest
Platform: UNKNOWN
Requires-Python: >=3.6
Provides-Extra: docs
Provides-Extra: test

astropy-healpix-0.5/README.rst

|RTD| |Travis Status| |AppVeyor status| |CircleCI| |Coverage Status|

About
-----

This is a BSD-licensed HEALPix package developed by the Astropy project and
based on C code written by Dustin Lang in `astrometry.net `__.

See the `Documentation `__ for information about installing and using this
package.

.. |Travis Status| image:: https://travis-ci.org/astropy/astropy-healpix.svg
    :target: https://travis-ci.org/astropy/astropy-healpix?branch=master
.. |AppVeyor status| image:: https://ci.appveyor.com/api/projects/status/5kxwb47o2umy370m/branch/master?svg=true
    :target: https://ci.appveyor.com/project/Astropy/astropy-healpix/branch/master
.. |CircleCI| image:: https://circleci.com/gh/astropy/astropy-healpix.svg?style=svg
    :target: https://circleci.com/gh/astropy/astropy-healpix
.. |Coverage Status| image:: https://coveralls.io/repos/astropy/astropy-healpix/badge.svg
    :target: https://coveralls.io/r/astropy/astropy-healpix
.. |RTD| image:: https://readthedocs.org/projects/astropy-healpix/badge/?version=latest
    :target: http://astropy-healpix.readthedocs.io/en/latest/?badge=latest

astropy-healpix-0.5/ah_bootstrap.py

"""
This bootstrap module contains code for ensuring that the astropy_helpers
package will be importable by the time the setup.py script runs.
It also includes some workarounds to ensure that a recent-enough version of
setuptools is being used for the installation.

This module should be the first thing imported in the setup.py of
distributions that make use of the utilities in astropy_helpers.  If the
distribution ships with its own copy of astropy_helpers, this module will
first attempt to import from the shipped copy.  However, it will also check
PyPI to see if there are any bug-fix releases on top of the current version
that may be useful to get past platform-specific bugs that have been fixed.
When running setup.py, use the ``--offline`` command-line option to disable
the auto-upgrade checks.

When this module is imported or otherwise executed it automatically calls a
main function that attempts to read the project's setup.cfg file, which it
checks for a configuration section called ``[ah_bootstrap]``.  The presence
of that section, and the options therein, determines the next step taken:
If it contains an option called ``auto_use`` with a value of ``True``, it
will automatically call the main function of this module called
`use_astropy_helpers` (see that function's docstring for full details).
Otherwise no further action is taken, and by default the system-installed
version of astropy-helpers will be used (however,
``ah_bootstrap.use_astropy_helpers`` may be called manually from within the
setup.py script).

This behavior can also be controlled using the ``--auto-use`` and
``--no-auto-use`` command-line flags.  For clarity, an alias for
``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using
the latter if needed.

Additional options in the ``[ah_bootstrap]`` section of setup.cfg have the
same names as the arguments to `use_astropy_helpers`, and can be used to
configure the bootstrap script when ``auto_use = True``.

See https://github.com/astropy/astropy-helpers for more details, and for the
latest version of this module.
""" import contextlib import errno import io import locale import os import re import subprocess as sp import sys from distutils import log from distutils.debug import DEBUG from configparser import ConfigParser, RawConfigParser import pkg_resources from setuptools import Distribution from setuptools.package_index import PackageIndex # This is the minimum Python version required for astropy-helpers __minimum_python_version__ = (3, 5) # TODO: Maybe enable checking for a specific version of astropy_helpers? DIST_NAME = 'astropy-helpers' PACKAGE_NAME = 'astropy_helpers' UPPER_VERSION_EXCLUSIVE = None # Defaults for other options DOWNLOAD_IF_NEEDED = True INDEX_URL = 'https://pypi.python.org/simple' USE_GIT = True OFFLINE = False AUTO_UPGRADE = True # A list of all the configuration options and their required types CFG_OPTIONS = [ ('auto_use', bool), ('path', str), ('download_if_needed', bool), ('index_url', str), ('use_git', bool), ('offline', bool), ('auto_upgrade', bool) ] # Start off by parsing the setup.cfg file _err_help_msg = """ If the problem persists consider installing astropy_helpers manually using pip (`pip install astropy_helpers`) or by manually downloading the source archive, extracting it, and installing by running `python setup.py install` from the root of the extracted source code. """ SETUP_CFG = ConfigParser() if os.path.exists('setup.cfg'): try: SETUP_CFG.read('setup.cfg') except Exception as e: if DEBUG: raise log.error( "Error reading setup.cfg: {0!r}\n{1} will not be " "automatically bootstrapped and package installation may fail." 
"\n{2}".format(e, PACKAGE_NAME, _err_help_msg)) # We used package_name in the package template for a while instead of name if SETUP_CFG.has_option('metadata', 'name'): parent_package = SETUP_CFG.get('metadata', 'name') elif SETUP_CFG.has_option('metadata', 'package_name'): parent_package = SETUP_CFG.get('metadata', 'package_name') else: parent_package = None if SETUP_CFG.has_option('options', 'python_requires'): python_requires = SETUP_CFG.get('options', 'python_requires') # The python_requires key has a syntax that can be parsed by SpecifierSet # in the packaging package. However, we don't want to have to depend on that # package, so instead we can use setuptools (which bundles packaging). We # have to add 'python' to parse it with Requirement. from pkg_resources import Requirement req = Requirement.parse('python' + python_requires) # We want the Python version as a string, which we can get from the platform module import platform # strip off trailing '+' incase this is a dev install of python python_version = platform.python_version().strip('+') # allow pre-releases to count as 'new enough' if not req.specifier.contains(python_version, True): if parent_package is None: message = "ERROR: Python {} is required by this package\n".format(req.specifier) else: message = "ERROR: Python {} is required by {}\n".format(req.specifier, parent_package) sys.stderr.write(message) sys.exit(1) if sys.version_info < __minimum_python_version__: if parent_package is None: message = "ERROR: Python {} or later is required by astropy-helpers\n".format( __minimum_python_version__) else: message = "ERROR: Python {} or later is required by astropy-helpers for {}\n".format( __minimum_python_version__, parent_package) sys.stderr.write(message) sys.exit(1) _str_types = (str, bytes) # What follows are several import statements meant to deal with install-time # issues with either missing or misbehaving pacakges (including making sure # setuptools itself is installed): # Check that setuptools 
30.3 or later is present from distutils.version import LooseVersion try: import setuptools assert LooseVersion(setuptools.__version__) >= LooseVersion('30.3') except (ImportError, AssertionError): sys.stderr.write("ERROR: setuptools 30.3 or later is required by astropy-helpers\n") sys.exit(1) # typing as a dependency for 1.6.1+ Sphinx causes issues when imported after # initializing submodule with ah_boostrap.py # See discussion and references in # https://github.com/astropy/astropy-helpers/issues/302 try: import typing # noqa except ImportError: pass # Note: The following import is required as a workaround to # https://github.com/astropy/astropy-helpers/issues/89; if we don't import this # module now, it will get cleaned up after `run_setup` is called, but that will # later cause the TemporaryDirectory class defined in it to stop working when # used later on by setuptools try: import setuptools.py31compat # noqa except ImportError: pass # matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation. See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except: # Ignore if this fails for *any* reason* pass # End compatibility imports... class _Bootstrapper(object): """ Bootstrapper implementation. See ``use_astropy_helpers`` for parameter documentation. 
""" def __init__(self, path=None, index_url=None, use_git=None, offline=None, download_if_needed=None, auto_upgrade=None): if path is None: path = PACKAGE_NAME if not (isinstance(path, _str_types) or path is False): raise TypeError('path must be a string or False') if not isinstance(path, str): fs_encoding = sys.getfilesystemencoding() path = path.decode(fs_encoding) # path to unicode self.path = path # Set other option attributes, using defaults where necessary self.index_url = index_url if index_url is not None else INDEX_URL self.offline = offline if offline is not None else OFFLINE # If offline=True, override download and auto-upgrade if self.offline: download_if_needed = False auto_upgrade = False self.download = (download_if_needed if download_if_needed is not None else DOWNLOAD_IF_NEEDED) self.auto_upgrade = (auto_upgrade if auto_upgrade is not None else AUTO_UPGRADE) # If this is a release then the .git directory will not exist so we # should not use git. git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git')) if use_git is None and not git_dir_exists: use_git = False self.use_git = use_git if use_git is not None else USE_GIT # Declared as False by default--later we check if astropy-helpers can be # upgraded from PyPI, but only if not using a source distribution (as in # the case of import from a git submodule) self.is_submodule = False @classmethod def main(cls, argv=None): if argv is None: argv = sys.argv config = cls.parse_config() config.update(cls.parse_command_line(argv)) auto_use = config.pop('auto_use', False) bootstrapper = cls(**config) if auto_use: # Run the bootstrapper, otherwise the setup.py is using the old # use_astropy_helpers() interface, in which case it will run the # bootstrapper manually after reconfiguring it. 
bootstrapper.run() return bootstrapper @classmethod def parse_config(cls): if not SETUP_CFG.has_section('ah_bootstrap'): return {} config = {} for option, type_ in CFG_OPTIONS: if not SETUP_CFG.has_option('ah_bootstrap', option): continue if type_ is bool: value = SETUP_CFG.getboolean('ah_bootstrap', option) else: value = SETUP_CFG.get('ah_bootstrap', option) config[option] = value return config @classmethod def parse_command_line(cls, argv=None): if argv is None: argv = sys.argv config = {} # For now we just pop recognized ah_bootstrap options out of the # arg list. This is imperfect; in the unlikely case that a setup.py # custom command or even custom Distribution class defines an argument # of the same name then we will break that. However there's a catch22 # here that we can't just do full argument parsing right here, because # we don't yet know *how* to parse all possible command-line arguments. if '--no-git' in argv: config['use_git'] = False argv.remove('--no-git') if '--offline' in argv: config['offline'] = True argv.remove('--offline') if '--auto-use' in argv: config['auto_use'] = True argv.remove('--auto-use') if '--no-auto-use' in argv: config['auto_use'] = False argv.remove('--no-auto-use') if '--use-system-astropy-helpers' in argv: config['auto_use'] = False argv.remove('--use-system-astropy-helpers') return config def run(self): strategies = ['local_directory', 'local_file', 'index'] dist = None # First, remove any previously imported versions of astropy_helpers; # this is necessary for nested installs where one package's installer # is installing another package via setuptools.sandbox.run_setup, as in # the case of setup_requires for key in list(sys.modules): try: if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'): del sys.modules[key] except AttributeError: # Sometimes mysterious non-string things can turn up in # sys.modules continue # Check to see if the path is a submodule self.is_submodule = self._check_submodule() for strategy in 
strategies: method = getattr(self, 'get_{0}_dist'.format(strategy)) dist = method() if dist is not None: break else: raise _AHBootstrapSystemExit( "No source found for the {0!r} package; {0} must be " "available and importable as a prerequisite to building " "or installing this package.".format(PACKAGE_NAME)) # This is a bit hacky, but if astropy_helpers was loaded from a # directory/submodule its Distribution object gets a "precedence" of # "DEVELOP_DIST". However, in other cases it gets a precedence of # "EGG_DIST". However, when activing the distribution it will only be # placed early on sys.path if it is treated as an EGG_DIST, so always # do that dist = dist.clone(precedence=pkg_resources.EGG_DIST) # Otherwise we found a version of astropy-helpers, so we're done # Just active the found distribution on sys.path--if we did a # download this usually happens automatically but it doesn't hurt to # do it again # Note: Adding the dist to the global working set also activates it # (makes it importable on sys.path) by default. try: pkg_resources.working_set.add(dist, replace=True) except TypeError: # Some (much) older versions of setuptools do not have the # replace=True option here. These versions are old enough that all # bets may be off anyways, but it's easy enough to work around just # in case... if dist.key in pkg_resources.working_set.by_key: del pkg_resources.working_set.by_key[dist.key] pkg_resources.working_set.add(dist) @property def config(self): """ A `dict` containing the options this `_Bootstrapper` was configured with. """ return dict((optname, getattr(self, optname)) for optname, _ in CFG_OPTIONS if hasattr(self, optname)) def get_local_directory_dist(self): """ Handle importing a vendored package from a subdirectory of the source distribution. 
""" if not os.path.isdir(self.path): return log.info('Attempting to import astropy_helpers from {0} {1!r}'.format( 'submodule' if self.is_submodule else 'directory', self.path)) dist = self._directory_import() if dist is None: log.warn( 'The requested path {0!r} for importing {1} does not ' 'exist, or does not contain a copy of the {1} ' 'package.'.format(self.path, PACKAGE_NAME)) elif self.auto_upgrade and not self.is_submodule: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_local_file_dist(self): """ Handle importing from a source archive; this also uses setup_requires but points easy_install directly to the source archive. """ if not os.path.isfile(self.path): return log.info('Attempting to unpack and import astropy_helpers from ' '{0!r}'.format(self.path)) try: dist = self._do_download(find_links=[self.path]) except Exception as e: if DEBUG: raise log.warn( 'Failed to import {0} from the specified archive {1!r}: ' '{2}'.format(PACKAGE_NAME, self.path, str(e))) dist = None if dist is not None and self.auto_upgrade: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_index_dist(self): if not self.download: log.warn('Downloading {0!r} disabled.'.format(DIST_NAME)) return None log.warn( "Downloading {0!r}; run setup.py with the --offline option to " "force offline installation.".format(DIST_NAME)) try: dist = self._do_download() except Exception as e: if DEBUG: raise log.warn( 'Failed to download and/or install {0!r} from {1!r}:\n' '{2}'.format(DIST_NAME, self.index_url, str(e))) dist = None # No need to run auto-upgrade here since we've already presumably # gotten the most up-to-date version from the package index return dist def 
_directory_import(self): """ Import astropy_helpers from the given path, which will be added to sys.path. Must return True if the import succeeded, and False otherwise. """ # Return True on success, False on failure but download is allowed, and # otherwise raise SystemExit path = os.path.abspath(self.path) # Use an empty WorkingSet rather than the man # pkg_resources.working_set, since on older versions of setuptools this # will invoke a VersionConflict when trying to install an upgrade ws = pkg_resources.WorkingSet([]) ws.add_entry(path) dist = ws.by_key.get(DIST_NAME) if dist is None: # We didn't find an egg-info/dist-info in the given path, but if a # setup.py exists we can generate it setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): # We use subprocess instead of run_setup from setuptools to # avoid segmentation faults - see the following for more details: # https://github.com/cython/cython/issues/2104 sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path) for dist in pkg_resources.find_distributions(path, True): # There should be only one... 
                    return dist

        return dist

    def _do_download(self, version='', find_links=None):
        if find_links:
            allow_hosts = ''
            index_url = None
        else:
            allow_hosts = None
            index_url = self.index_url

        # Annoyingly, setuptools will not handle other arguments to
        # Distribution (such as options) before handling setup_requires, so
        # it is not straightforward to programmatically augment the arguments
        # which are passed to easy_install
        class _Distribution(Distribution):
            def get_option_dict(self, command_name):
                opts = Distribution.get_option_dict(self, command_name)
                if command_name == 'easy_install':
                    if find_links is not None:
                        opts['find_links'] = ('setup script', find_links)
                    if index_url is not None:
                        opts['index_url'] = ('setup script', index_url)
                    if allow_hosts is not None:
                        opts['allow_hosts'] = ('setup script', allow_hosts)
                return opts

        if version:
            req = '{0}=={1}'.format(DIST_NAME, version)
        else:
            if UPPER_VERSION_EXCLUSIVE is None:
                req = DIST_NAME
            else:
                req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE)

        attrs = {'setup_requires': [req]}

        # NOTE: we need to parse the config file (e.g. setup.cfg) to make
        # sure it honours the options set in the [easy_install] section, and
        # we need to explicitly fetch the requirement eggs as setup_requires
        # does not get honored in recent versions of setuptools:
        # https://github.com/pypa/setuptools/issues/1273

        try:
            context = _verbose if DEBUG else _silence
            with context():
                dist = _Distribution(attrs=attrs)
                try:
                    dist.parse_config_files(ignore_option_errors=True)
                    dist.fetch_build_eggs(req)
                except TypeError:
                    # On older versions of setuptools, ignore_option_errors
                    # doesn't exist, and the above two lines are not needed
                    # so we can just continue
                    pass

            # If the setup_requires succeeded it will have added the new dist
            # to the main working_set
            return pkg_resources.working_set.by_key.get(DIST_NAME)
        except Exception as e:
            if DEBUG:
                raise

            msg = 'Error retrieving {0} from {1}:\n{2}'
            if find_links:
                source = find_links[0]
            elif index_url != INDEX_URL:
                source = index_url
            else:
                source = 'PyPI'

            raise Exception(msg.format(DIST_NAME, source, repr(e)))

    def _do_upgrade(self, dist):
        # Build up a requirement for a higher bugfix release but a lower
        # minor release (so API compatibility is guaranteed)
        next_version = _next_version(dist.parsed_version)

        req = pkg_resources.Requirement.parse(
            '{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version))

        package_index = PackageIndex(index_url=self.index_url)

        upgrade = package_index.obtain(req)

        if upgrade is not None:
            return self._do_download(version=upgrade.version)

    def _check_submodule(self):
        """
        Check if the given path is a git submodule.

        See the docstrings for ``_check_submodule_using_git`` and
        ``_check_submodule_no_git`` for further details.
        """

        if (self.path is None or
                (os.path.exists(self.path) and not os.path.isdir(self.path))):
            return False

        if self.use_git:
            return self._check_submodule_using_git()
        else:
            return self._check_submodule_no_git()

    def _check_submodule_using_git(self):
        """
        Check if the given path is a git submodule.
        If so, attempt to initialize and/or update the submodule if needed.

        This function makes calls to the ``git`` command in subprocesses.
        The ``_check_submodule_no_git`` option uses pure Python to check if
        the given path looks like a git submodule, but it cannot perform
        updates.
        """

        cmd = ['git', 'submodule', 'status', '--', self.path]

        try:
            log.info('Running `{0}`; use the --no-git option to disable git '
                     'commands'.format(' '.join(cmd)))
            returncode, stdout, stderr = run_cmd(cmd)
        except _CommandNotFound:
            # The git command simply wasn't found; this is most likely the
            # case on user systems that don't have git and are simply
            # trying to install the package from PyPI or a source
            # distribution.  Silently ignore this case and simply don't try
            # to use submodules
            return False

        stderr = stderr.strip()

        if returncode != 0 and stderr:
            # Unfortunately the return code alone cannot be relied on, as
            # earlier versions of git returned 0 even if the requested
            # submodule does not exist

            # This is a warning that occurs in perl (from running git
            # submodule) which only occurs with a malformatted locale setting
            # which can happen sometimes on OSX.  See again
            # https://github.com/astropy/astropy/issues/2749
            perl_warning = ('perl: warning: Falling back to the standard '
                            'locale ("C").')
            if not stderr.strip().endswith(perl_warning):
                # Some other unknown error condition occurred
                log.warn('git submodule command failed '
                         'unexpectedly:\n{0}'.format(stderr))
                return False

        # Output of `git submodule status` is as follows:
        #
        # 1: Status indicator: '-' for submodule is uninitialized, '+' if
        # submodule is initialized but is not at the commit currently
        # indicated in .gitmodules (and thus needs to be updated), or 'U' if
        # the submodule is in an unstable state (i.e. has merge conflicts)
        #
        # 2. SHA-1 hash of the current commit of the submodule (we don't
        # really need this information but it's useful for checking that the
        # output is correct)
        #
        # 3. The output of `git describe` for the submodule's current commit
        # hash (this includes for example what branches the commit is on)
        # but only if the submodule is initialized.  We ignore this
        # information for now
        _git_submodule_status_re = re.compile(
            r'^(?P<status>[+-U ])(?P<sha1>[0-9a-f]{40}) '
            r'(?P<submodule>\S+)( .*)?$')

        # The stdout should only contain one line--the status of the
        # requested submodule
        m = _git_submodule_status_re.match(stdout)
        if m:
            # Yes, the path *is* a git submodule
            self._update_submodule(m.group('submodule'), m.group('status'))
            return True
        else:
            log.warn(
                'Unexpected output from `git submodule status`:\n{0}\n'
                'Will attempt import from {1!r} regardless.'.format(
                    stdout, self.path))
            return False

    def _check_submodule_no_git(self):
        """
        Like ``_check_submodule_using_git``, but simply parses the
        .gitmodules file to determine if the supplied path is a git
        submodule, and does not exec any subprocesses.

        This can only determine if a path is a submodule--it does not perform
        updates, etc.  This function may need to be updated if the format of
        the .gitmodules file is changed between git versions.
        """

        gitmodules_path = os.path.abspath('.gitmodules')

        if not os.path.isfile(gitmodules_path):
            return False

        # This is a minimal reader for gitconfig-style files.  It handles a
        # few of the quirks that make gitconfig files incompatible with
        # ConfigParser-style files, but does not support the full gitconfig
        # syntax (just enough needed to read a .gitmodules file).
        gitmodules_fileobj = io.StringIO()

        # Must use io.open for cross-Python-compatible behavior wrt unicode
        with io.open(gitmodules_path) as f:
            for line in f:
                # gitconfig files are more flexible with leading whitespace;
                # just go ahead and remove it
                line = line.lstrip()

                # comments can start with either # or ;
                if line and line[0] in ('#', ';'):
                    continue

                gitmodules_fileobj.write(line)

        gitmodules_fileobj.seek(0)

        cfg = RawConfigParser()

        try:
            cfg.readfp(gitmodules_fileobj)
        except Exception as exc:
            log.warn('Malformatted .gitmodules file: {0}\n'
                     '{1} cannot be assumed to be a git submodule.'.format(
                         exc, self.path))
            return False

        for section in cfg.sections():
            if not cfg.has_option(section, 'path'):
                continue

            submodule_path = cfg.get(section, 'path').rstrip(os.sep)

            if submodule_path == self.path.rstrip(os.sep):
                return True

        return False

    def _update_submodule(self, submodule, status):
        if status == ' ':
            # The submodule is up to date; no action necessary
            return
        elif status == '-':
            if self.offline:
                raise _AHBootstrapSystemExit(
                    "Cannot initialize the {0} submodule in --offline mode; "
                    "this requires being able to clone the submodule from "
                    "an online repository.".format(submodule))
            cmd = ['update', '--init']
            action = 'Initializing'
        elif status == '+':
            cmd = ['update']
            action = 'Updating'
            if self.offline:
                cmd.append('--no-fetch')
        elif status == 'U':
            raise _AHBootstrapSystemExit(
                'Error: Submodule {0} contains unresolved merge conflicts.  '
                'Please complete or abandon any changes in the submodule so '
                'that it is in a usable state, then try again.'.format(
                    submodule))
        else:
            log.warn('Unknown status {0!r} for git submodule {1!r}.  Will '
                     'attempt to use the submodule as-is, but try to ensure '
                     'that the submodule is in a clean state and contains '
                     'no conflicts or errors.\n{2}'.format(
                         status, submodule, _err_help_msg))
            return

        err_msg = None
        cmd = ['git', 'submodule'] + cmd + ['--', submodule]
        log.warn('{0} {1} submodule with: `{2}`'.format(
            action, submodule, ' '.join(cmd)))

        try:
            log.info('Running `{0}`; use the --no-git option to disable git '
                     'commands'.format(' '.join(cmd)))
            returncode, stdout, stderr = run_cmd(cmd)
        except OSError as e:
            err_msg = str(e)
        else:
            if returncode != 0:
                err_msg = stderr

        if err_msg is not None:
            log.warn('An unexpected error occurred updating the git '
                     'submodule {0!r}:\n{1}\n{2}'.format(
                         submodule, err_msg, _err_help_msg))


class _CommandNotFound(OSError):
    """
    An exception raised when a command run with run_cmd is not found on the
    system.
    """


def run_cmd(cmd):
    """
    Run a command in a subprocess, given as a list of command-line arguments.

    Returns a ``(returncode, stdout, stderr)`` tuple.
    """

    try:
        p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
        # XXX: May block if either stdout or stderr fill their buffers;
        # however for the commands this is currently used for that is
        # unlikely (they should have very brief output)
        stdout, stderr = p.communicate()
    except OSError as e:
        if DEBUG:
            raise

        if e.errno == errno.ENOENT:
            msg = 'Command not found: `{0}`'.format(' '.join(cmd))
            raise _CommandNotFound(msg, cmd)
        else:
            raise _AHBootstrapSystemExit(
                'An unexpected error occurred when running the '
                '`{0}` command:\n{1}'.format(' '.join(cmd), str(e)))

    # Can fail if the default locale is not configured properly.  See
    # https://github.com/astropy/astropy/issues/2749.  For the purposes under
    # consideration 'latin1' is an acceptable fallback.
    try:
        stdio_encoding = locale.getdefaultlocale()[1] or 'latin1'
    except ValueError:
        # Due to an OSX oddity locale.getdefaultlocale() can also crash
        # depending on the user's locale/language settings.  See:
        # http://bugs.python.org/issue18378
        stdio_encoding = 'latin1'

    # Unlikely to fail at this point but even then let's be flexible
    if not isinstance(stdout, str):
        stdout = stdout.decode(stdio_encoding, 'replace')
    if not isinstance(stderr, str):
        stderr = stderr.decode(stdio_encoding, 'replace')

    return (p.returncode, stdout, stderr)


def _next_version(version):
    """
    Given a parsed version from pkg_resources.parse_version, returns a new
    version string with the next minor version.

    Examples
    ========
    >>> _next_version(pkg_resources.parse_version('1.2.3'))
    '1.3.0'
    """

    if hasattr(version, 'base_version'):
        # New version parsing from setuptools >= 8.0
        if version.base_version:
            parts = version.base_version.split('.')
        else:
            parts = []
    else:
        parts = []
        for part in version:
            if part.startswith('*'):
                break

            parts.append(part)

    parts = [int(p) for p in parts]

    if len(parts) < 3:
        parts += [0] * (3 - len(parts))

    major, minor, micro = parts[:3]

    return '{0}.{1}.{2}'.format(major, minor + 1, 0)


class _DummyFile(object):
    """A noop writeable object."""

    errors = ''  # Required for Python 3.x
    encoding = 'utf-8'

    def write(self, s):
        pass

    def flush(self):
        pass


@contextlib.contextmanager
def _verbose():
    yield


@contextlib.contextmanager
def _silence():
    """A context manager that silences sys.stdout and sys.stderr."""

    old_stdout = sys.stdout
    old_stderr = sys.stderr
    sys.stdout = _DummyFile()
    sys.stderr = _DummyFile()
    exception_occurred = False

    try:
        yield
    except:
        exception_occurred = True
        # Go ahead and clean up so that exception handling can work normally
        sys.stdout = old_stdout
        sys.stderr = old_stderr
        raise

    if not exception_occurred:
        sys.stdout = old_stdout
        sys.stderr = old_stderr


class _AHBootstrapSystemExit(SystemExit):
    def __init__(self, *args):
        if not args:
            msg = 'An unknown problem occurred bootstrapping astropy_helpers.'
else: msg = args[0] msg += '\n' + _err_help_msg super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:]) BOOTSTRAPPER = _Bootstrapper.main() def use_astropy_helpers(**kwargs): """ Ensure that the `astropy_helpers` module is available and is importable. This supports automatic submodule initialization if astropy_helpers is included in a project as a git submodule, or will download it from PyPI if necessary. Parameters ---------- path : str or None, optional A filesystem path relative to the root of the project's source code that should be added to `sys.path` so that `astropy_helpers` can be imported from that path. If the path is a git submodule it will automatically be initialized and/or updated. The path may also be to a ``.tar.gz`` archive of the astropy_helpers source distribution. In this case the archive is automatically unpacked and made temporarily available on `sys.path` as a ``.egg`` archive. If `None` skip straight to downloading. download_if_needed : bool, optional If the provided filesystem path is not found an attempt will be made to download astropy_helpers from PyPI. It will then be made temporarily available on `sys.path` as a ``.egg`` archive (using the ``setup_requires`` feature of setuptools). If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. index_url : str, optional If provided, use a different URL for the Python package index than the main PyPI server. use_git : bool, optional If `False` no git commands will be used--this effectively disables support for git submodules. If the ``--no-git`` option is given at the command line the value of this argument is overridden to `False`. auto_upgrade : bool, optional By default, when installing a package from a non-development source distribution ah_bootstrap will try to automatically check for patch releases to astropy-helpers on PyPI and use the patched version over any bundled versions.
Setting this to `False` will disable that functionality. If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. offline : bool, optional If `True` disable all actions that require an internet connection, including downloading packages from the package index and fetching updates to any git submodule. Defaults to `False`. """ global BOOTSTRAPPER config = BOOTSTRAPPER.config config.update(**kwargs) # Create a new bootstrapper with the updated configuration and run it BOOTSTRAPPER = _Bootstrapper(**config) BOOTSTRAPPER.run() ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5238602 astropy-healpix-0.5/astropy_healpix/0000755000077000000240000000000000000000000017641 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2514584 astropy-healpix-0.5/astropy_healpix/__init__.py0000644000077000000240000000064000000000000021752 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ BSD-licensed HEALPix for Astropy. """ # Affiliated packages may add whatever they like to this file, but # should keep this content at the top. from ._astropy_init import * # noqa # For egg_info test builds to pass, put package imports here.
if not _ASTROPY_SETUP_: # noqa from .high_level import * # noqa from .core import * # noqa ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.4981136 astropy-healpix-0.5/astropy_healpix/_astropy_init.py0000644000077000000240000000356400000000000023106 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst __all__ = ['__version__', '__githash__'] # this indicates whether or not we are in the package's setup.py try: _ASTROPY_SETUP_ except NameError: import builtins builtins._ASTROPY_SETUP_ = False try: from .version import version as __version__ except ImportError: __version__ = '' try: from .version import githash as __githash__ except ImportError: __githash__ = '' if not _ASTROPY_SETUP_: # noqa import os from warnings import warn from astropy.config.configuration import ( update_default_config, ConfigurationDefaultMissingError, ConfigurationDefaultMissingWarning) # Create the test function for self test from astropy.tests.runner import TestRunner test = TestRunner.make_test_runner_in(os.path.dirname(__file__)) test.__test__ = False __all__ += ['test'] # add these here so we only need to cleanup the namespace at the end config_dir = None if not os.environ.get('ASTROPY_SKIP_CONFIG_UPDATE', False): config_dir = os.path.dirname(__file__) config_template = os.path.join(config_dir, __package__ + ".cfg") if os.path.isfile(config_template): try: update_default_config( __package__, config_dir, version=__version__) except TypeError as orig_error: try: update_default_config(__package__, config_dir) except ConfigurationDefaultMissingError as e: wmsg = (e.args[0] + " Cannot install default profile. 
If you are " "importing from source, this is expected.") warn(ConfigurationDefaultMissingWarning(wmsg)) del e except Exception: raise orig_error ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.3273942 astropy-healpix-0.5/astropy_healpix/_compiler.c0000644000077000000240000000524000000000000021757 0ustar00tomstaff00000000000000#include <Python.h> /*************************************************************************** * Macros for determining the compiler version. * * These are borrowed from boost, and majorly abridged to include only * the compilers we care about. ***************************************************************************/ #define STRINGIZE(X) DO_STRINGIZE(X) #define DO_STRINGIZE(X) #X #if defined __clang__ /* Clang C++ emulates GCC, so it has to appear early. */ # define COMPILER "Clang version " __clang_version__ #elif defined(__INTEL_COMPILER) || defined(__ICL) || defined(__ICC) || defined(__ECC) /* Intel */ # if defined(__INTEL_COMPILER) # define INTEL_VERSION __INTEL_COMPILER # elif defined(__ICL) # define INTEL_VERSION __ICL # elif defined(__ICC) # define INTEL_VERSION __ICC # elif defined(__ECC) # define INTEL_VERSION __ECC # endif # define COMPILER "Intel C compiler version " STRINGIZE(INTEL_VERSION) #elif defined(__GNUC__) /* gcc */ # define COMPILER "GCC version " __VERSION__ #elif defined(__SUNPRO_CC) /* Sun Workshop Compiler */ # define COMPILER "Sun compiler version " STRINGIZE(__SUNPRO_CC) #elif defined(_MSC_VER) /* Microsoft Visual C/C++ Must be last since other compilers define _MSC_VER for compatibility as well */ # if _MSC_VER < 1200 # define COMPILER_VERSION 5.0 # elif _MSC_VER < 1300 # define COMPILER_VERSION 6.0 # elif _MSC_VER == 1300 # define COMPILER_VERSION 7.0 # elif _MSC_VER == 1310 # define COMPILER_VERSION 7.1 # elif _MSC_VER == 1400 # define COMPILER_VERSION 8.0 # elif _MSC_VER == 1500 # define COMPILER_VERSION 9.0 # elif _MSC_VER == 1600 # define COMPILER_VERSION 10.0
# else # define COMPILER_VERSION _MSC_VER # endif # define COMPILER "Microsoft Visual C++ version " STRINGIZE(COMPILER_VERSION) #else /* Fallback */ # define COMPILER "Unknown compiler" #endif /*************************************************************************** * Module-level ***************************************************************************/ struct module_state { /* The Sun compiler can't handle empty structs */ #if defined(__SUNPRO_C) || defined(_MSC_VER) int _dummy; #endif }; static struct PyModuleDef moduledef = { PyModuleDef_HEAD_INIT, "compiler_version", NULL, sizeof(struct module_state), NULL, NULL, NULL, NULL, NULL }; #define INITERROR return NULL PyMODINIT_FUNC PyInit_compiler_version(void) { PyObject* m; m = PyModule_Create(&moduledef); if (m == NULL) INITERROR; PyModule_AddStringConstant(m, "compiler", COMPILER); return m; } ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574433289.498598 astropy-healpix-0.5/astropy_healpix/_core.c0000644000077000000240000002665600000000000021077 0ustar00tomstaff00000000000000#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION #include <Python.h> #include <numpy/arrayobject.h> #include <numpy/ufuncobject.h> #include "healpix.h" #include "healpix-utils.h" #include "interpolation.h" /* FIXME: We need npy_set_floatstatus_invalid(), but unlike most of the Numpy * C API it is only available on some platforms if you explicitly link against * Numpy, which is not typically done for building C extensions. This bundled * header file contains a static definition of _npy_set_floatstatus_invalid(). */ #include "ieee754.h" #define INVALID_INDEX (-1) /* Data structure for storing function pointers for routines that are specific * to the HEALPix ordering scheme. When we create the ufuncs using * PyUFunc_FromFuncAndData, we will set them up to pass a pointer to this * data structure through as the void *data parameter for the loop functions * defined below.
*/ typedef struct { int64_t (*order_to_xy)(int64_t, int); int64_t (*xy_to_order)(int64_t, int); } order_funcs; static order_funcs nested_order_funcs = {healpixl_nested_to_xy, healpixl_xy_to_nested}, ring_order_funcs = {healpixl_ring_to_xy, healpixl_xy_to_ring}; static void *no_ufunc_data[] = {NULL}, *nested_ufunc_data[] = {&nested_order_funcs}, *ring_ufunc_data[] = {&ring_order_funcs}; static int64_t nside2npix(int nside) { return 12 * ((int64_t) nside) * ((int64_t) nside); } static int pixel_nside_valid(int64_t pixel, int nside) { return pixel >= 0 && pixel < nside2npix(nside); } static void healpix_to_lonlat_loop( char **args, npy_intp *dimensions, npy_intp *steps, void *data) { order_funcs *funcs = data; npy_intp i, n = dimensions[0]; for (i = 0; i < n; i ++) { int64_t pixel = *(int64_t *) &args[0][i * steps[0]]; int nside = *(int *) &args[1][i * steps[1]]; double dx = *(double *) &args[2][i * steps[2]]; double dy = *(double *) &args[3][i * steps[3]]; double *lon = (double *) &args[4][i * steps[4]]; double *lat = (double *) &args[5][i * steps[5]]; int64_t xy = INVALID_INDEX; if (pixel_nside_valid(pixel, nside)) xy = funcs->order_to_xy(pixel, nside); if (xy >= 0) healpixl_to_radec(xy, nside, dx, dy, lon, lat); else { *lon = *lat = NPY_NAN; _npy_set_floatstatus_invalid(); } } } static void lonlat_to_healpix_loop( char **args, npy_intp *dimensions, npy_intp *steps, void *data) { order_funcs *funcs = data; npy_intp i, n = dimensions[0]; for (i = 0; i < n; i ++) { double lon = *(double *) &args[0][i * steps[0]]; double lat = *(double *) &args[1][i * steps[1]]; int nside = *(int *) &args[2][i * steps[2]]; int64_t *pixel = (int64_t *) &args[3][i * steps[3]]; double *dx = (double *) &args[4][i * steps[4]]; double *dy = (double *) &args[5][i * steps[5]]; int64_t xy = INVALID_INDEX; xy = radec_to_healpixlf(lon, lat, nside, dx, dy); if (xy >= 0) *pixel = funcs->xy_to_order(xy, nside); else { *pixel = INVALID_INDEX; *dx = *dy = NPY_NAN; _npy_set_floatstatus_invalid(); } } } 
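Each conversion loop above guards the incoming pixel index with `pixel_nside_valid` before converting to the intermediate XY ordering. That bounds check is simple enough to restate as a self-contained Python sketch; the function names below merely mirror the C helpers `nside2npix` and `pixel_nside_valid` for illustration, and this snippet is not part of the extension module itself:

```python
def nside2npix(nside):
    # A HEALPix map has 12 top-level tiles of nside * nside pixels each,
    # so the total pixel count is 12 * nside**2 (as in the C helper above).
    return 12 * nside * nside

def pixel_nside_valid(pixel, nside):
    # A pixel index is usable for a given nside only if it lies in [0, npix).
    return 0 <= pixel < nside2npix(nside)

# Example: nside=2 gives 48 pixels, so 47 is the largest valid index.
```

Out-of-range indices are exactly the cases in which the loops above emit `INVALID_INDEX` (or NaN for angular outputs) and raise Numpy's invalid floating-point status.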
static void nested_to_ring_loop( char **args, npy_intp *dimensions, npy_intp *steps, void *data) { npy_intp i, n = dimensions[0]; for (i = 0; i < n; i ++) { int64_t nested = *(int64_t *) &args[0][i * steps[0]]; int nside = *(int *) &args[1][i * steps[1]]; int64_t *ring = (int64_t *) &args[2][i * steps[2]]; int64_t xy = INVALID_INDEX; if (pixel_nside_valid(nested, nside)) xy = healpixl_nested_to_xy(nested, nside); if (xy >= 0) *ring = healpixl_xy_to_ring(xy, nside); else { *ring = INVALID_INDEX; _npy_set_floatstatus_invalid(); } } } static void ring_to_nested_loop( char **args, npy_intp *dimensions, npy_intp *steps, void *data) { npy_intp i, n = dimensions[0]; for (i = 0; i < n; i ++) { int64_t ring = *(int64_t *) &args[0][i * steps[0]]; int nside = *(int *) &args[1][i * steps[1]]; int64_t *nested = (int64_t *) &args[2][i * steps[2]]; int64_t xy = INVALID_INDEX; if (pixel_nside_valid(ring, nside)) xy = healpixl_ring_to_xy(ring, nside); if (xy >= 0) *nested = healpixl_xy_to_nested(xy, nside); else { *nested = INVALID_INDEX; _npy_set_floatstatus_invalid(); } } } static void bilinear_interpolation_weights_loop( char **args, npy_intp *dimensions, npy_intp *steps, void *data) { npy_intp i, n = dimensions[0]; for (i = 0; i < n; i ++) { double lon = *(double *) &args[0][i * steps[0]]; double lat = *(double *) &args[1][i * steps[1]]; int nside = *(int *) &args[2][i * steps[2]]; int64_t indices[4]; double weights[4]; int j; interpolate_weights(lon, lat, indices, weights, nside); for (j = 0; j < 4; j ++) { *(int64_t *) &args[3 + j][i * steps[3 + j]] = indices[j]; *(double *) &args[7 + j][i * steps[7 + j]] = weights[j]; } } } static void neighbours_loop( char **args, npy_intp *dimensions, npy_intp *steps, void *data) { order_funcs *funcs = data; npy_intp i, n = dimensions[0]; for (i = 0; i < n; i ++) { int64_t pixel = *(int64_t *) &args[0][i * steps[0]]; int nside = *(int *) &args[1][i * steps[1]]; int64_t neighbours[] = { INVALID_INDEX, INVALID_INDEX, INVALID_INDEX, 
INVALID_INDEX, INVALID_INDEX, INVALID_INDEX, INVALID_INDEX, INVALID_INDEX}; int j; int64_t xy = INVALID_INDEX; if (pixel_nside_valid(pixel, nside)) xy = funcs->order_to_xy(pixel, nside); if (xy >= 0) healpixl_get_neighbours(xy, neighbours, nside); for (j = 0; j < 8; j ++) { int k = 4 - j; if (k < 0) k += 8; xy = neighbours[k]; if (xy >= 0) pixel = funcs->xy_to_order(xy, nside); else { pixel = INVALID_INDEX; _npy_set_floatstatus_invalid(); } *(int64_t *) &args[2 + j][i * steps[2 + j]] = pixel; } } } static PyObject *healpix_cone_search( PyObject *self, PyObject *args, PyObject *kwargs) { PyObject *result; static char *kws[] = {"lon", "lat", "radius", "nside", "order", NULL}; double lon, lat, radius; int nside; char *order; int64_t *indices, n_indices; int64_t *result_data; npy_intp dims[1]; if (!PyArg_ParseTupleAndKeywords( args, kwargs, "dddis", kws, &lon, &lat, &radius, &nside, &order)) return NULL; n_indices = healpix_rangesearch_radec_simple( lon, lat, radius, nside, 0, &indices); if (!indices) { PyErr_SetString( PyExc_RuntimeError, "healpix_rangesearch_radec_simple failed"); return NULL; } dims[0] = n_indices; result = PyArray_SimpleNew(1, dims, NPY_INT64); if (result) { result_data = PyArray_DATA((PyArrayObject *) result); if (strcmp(order, "nested") == 0) { int i; for (i = 0; i < n_indices; i ++) result_data[i] = healpixl_xy_to_nested(indices[i], nside); } else { int i; for (i = 0; i < n_indices; i ++) result_data[i] = healpixl_xy_to_ring(indices[i], nside); } } free(indices); return result; } static PyMethodDef methods[] = { {"healpix_cone_search", (PyCFunction) healpix_cone_search, METH_VARARGS | METH_KEYWORDS, NULL}, {NULL, NULL, 0, NULL} }; static PyModuleDef moduledef = { PyModuleDef_HEAD_INIT, "_core", NULL, -1, methods }; static PyUFuncGenericFunction healpix_to_lonlat_loops [] = {healpix_to_lonlat_loop}, lonlat_to_healpix_loops [] = {lonlat_to_healpix_loop}, nested_to_ring_loops [] = {nested_to_ring_loop}, ring_to_nested_loops [] = 
{ring_to_nested_loop}, bilinear_interpolation_weights_loops[] = {bilinear_interpolation_weights_loop}, neighbours_loops [] = {neighbours_loop}; static char healpix_to_lonlat_types[] = { NPY_INT64, NPY_INT, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE}, lonlat_to_healpix_types[] = { NPY_DOUBLE, NPY_DOUBLE, NPY_INT, NPY_INT64, NPY_DOUBLE, NPY_DOUBLE}, healpix_to_healpix_types[] = { NPY_INT64, NPY_INT, NPY_INT64}, bilinear_interpolation_weights_types[] = { NPY_DOUBLE, NPY_DOUBLE, NPY_INT, NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE}, neighbours_types[] = { NPY_INT64, NPY_INT, NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64}; PyMODINIT_FUNC PyInit__core(void) { PyObject *module; import_array(); import_umath(); module = PyModule_Create(&moduledef); PyModule_AddObject( module, "healpix_nested_to_lonlat", PyUFunc_FromFuncAndData( healpix_to_lonlat_loops, nested_ufunc_data, healpix_to_lonlat_types, 1, 4, 2, PyUFunc_None, "healpix_nested_to_lonlat", NULL, 0)); PyModule_AddObject( module, "healpix_ring_to_lonlat", PyUFunc_FromFuncAndData( healpix_to_lonlat_loops, ring_ufunc_data, healpix_to_lonlat_types, 1, 4, 2, PyUFunc_None, "healpix_ring_to_lonlat", NULL, 0)); PyModule_AddObject( module, "lonlat_to_healpix_nested", PyUFunc_FromFuncAndData( lonlat_to_healpix_loops, nested_ufunc_data, lonlat_to_healpix_types, 1, 3, 3, PyUFunc_None, "lonlat_to_healpix_nested", NULL, 0)); PyModule_AddObject( module, "lonlat_to_healpix_ring", PyUFunc_FromFuncAndData( lonlat_to_healpix_loops, ring_ufunc_data, lonlat_to_healpix_types, 1, 3, 3, PyUFunc_None, "lonlat_to_healpix_ring", NULL, 0)); PyModule_AddObject( module, "nested_to_ring", PyUFunc_FromFuncAndData( nested_to_ring_loops, no_ufunc_data, healpix_to_healpix_types, 1, 2, 1, PyUFunc_None, "nested_to_ring", NULL, 0)); PyModule_AddObject( module, "ring_to_nested", PyUFunc_FromFuncAndData( ring_to_nested_loops, no_ufunc_data, healpix_to_healpix_types, 
1, 2, 1, PyUFunc_None, "ring_to_nested", NULL, 0)); PyModule_AddObject( module, "bilinear_interpolation_weights", PyUFunc_FromFuncAndData( bilinear_interpolation_weights_loops, no_ufunc_data, bilinear_interpolation_weights_types, 1, 3, 8, PyUFunc_None, "bilinear_interpolation_weights", NULL, 0)); PyModule_AddObject( module, "neighbours_nested", PyUFunc_FromFuncAndData( neighbours_loops, nested_ufunc_data, neighbours_types, 1, 2, 8, PyUFunc_None, "neighbours_nested", NULL, 0)); PyModule_AddObject( module, "neighbours_ring", PyUFunc_FromFuncAndData( neighbours_loops, ring_ufunc_data, neighbours_types, 1, 2, 8, PyUFunc_None, "neighbours_ring", NULL, 0)); return module; } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.4992325 astropy-healpix-0.5/astropy_healpix/bench.py0000644000077000000240000001506200000000000021276 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """Benchmarks for this package. To run all benchmarks and print a report to the console:: python -m astropy_healpix.bench You can also run the benchmarks first, save the results dict to disk as a JSON file (or share it with others) and then print the results later, or compare them with other results. We should note that this is not very comprehensive / flexible. If your application depends on performance of HEALPix computations, you should write benchmarks with cases relevant for that application and check if HEALPix computations are really the bottleneck and if this package is fast enough for you or not. """ import timeit from astropy.table import Table # NOTE: If healpy is installed, we use it in the benchmarks, but healpy is not # a formal dependency of astropy-healpix.
try: import healpy as hp # noqa except ImportError: HEALPY_INSTALLED = False else: HEALPY_INSTALLED = True # Copied from https://github.com/kwgoodman/bottleneck/blob/master/bottleneck/benchmark/autotimeit.py def autotimeit(stmt, setup='pass', repeat=3, mintime=0.2): timer = timeit.Timer(stmt, setup) number, time1 = autoscaler(timer, mintime) time2 = timer.repeat(repeat=repeat - 1, number=number) return min(time2 + [time1]) / number # Copied from https://github.com/kwgoodman/bottleneck/blob/master/bottleneck/benchmark/autotimeit.py def autoscaler(timer, mintime): number = 1 for i in range(12): time = timer.timeit(number) if time > mintime: return number, time number *= 10 raise RuntimeError('function is too fast to test') def get_import(package, function): if package == 'astropy_healpix': return f'from astropy_healpix.healpy import {function}' else: return f'from healpy import {function}' def bench_pix2ang(size=None, nside=None, nest=None, package=None, fast=False): setup = '\n'.join([ get_import(package, 'pix2ang'), 'import numpy as np', f'nside={nside}', f'ipix=(np.random.random({size}) * 12 * nside ** 2).astype(np.int64)', f'nest={nest}']) stmt = 'pix2ang(nside, ipix, nest)' return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1) def bench_ang2pix(size=None, nside=None, nest=None, package=None, fast=False): setup = '\n'.join([ get_import(package, 'ang2pix'), 'import numpy as np', f'nside={nside}', f'lon=360 * np.random.random({size})', f'lat=180 * np.random.random({size}) - 90', f'nest={nest}']) stmt = 'ang2pix(nside, lon, lat, nest, lonlat=True)' return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1) def bench_nest2ring(size=None, nside=None, package=None, fast=False): setup = '\n'.join([ get_import(package, 'nest2ring'), 'import numpy as np', f'nside={nside}', f'ipix=(np.random.random({size}) * 12 * nside ** 2).astype(np.int64)']) stmt = 'nest2ring(nside, ipix)' return autotimeit(stmt=stmt, setup=setup, repeat=1, 
mintime=0 if fast else 0.1) def bench_ring2nest(size=None, nside=None, package=None, fast=False): setup = '\n'.join([ get_import(package, 'ring2nest'), 'import numpy as np', f'nside={nside}', f'ipix=(np.random.random({size}) * 12 * nside ** 2).astype(np.int64)']) stmt = 'ring2nest(nside, ipix)' return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1) def bench_get_interp_weights(size=None, nside=None, nest=None, package=None, fast=False): setup = '\n'.join([ get_import(package, 'get_interp_weights'), 'import numpy as np', f'nside={nside}', f'lon=360 * np.random.random({size})', f'lat=180 * np.random.random({size}) - 90', f'nest={nest}']) stmt = 'get_interp_weights(nside, lon, lat, nest=nest, lonlat=True)' return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1) def run_single(name, benchmark, fast=False, **kwargs): time_self = benchmark(package='astropy_healpix', fast=fast, **kwargs) results_single = dict(function=name, time_self=time_self, **kwargs) if HEALPY_INSTALLED: time_healpy = benchmark(package='healpy', fast=fast, **kwargs) results_single['time_healpy'] = time_healpy return results_single def bench_run(fast=False): """Run all benchmarks.
Return results as a dict.""" results = [] if fast: SIZES = [10, 1_000, 100_000] else: SIZES = [10, 1_000, 1_000_000] for nest in [True, False]: for size in SIZES: for nside in [1, 128]: results.append(run_single('pix2ang', bench_pix2ang, fast=fast, size=size, nside=nside, nest=nest)) for nest in [True, False]: for size in SIZES: for nside in [1, 128]: results.append(run_single('ang2pix', bench_ang2pix, fast=fast, size=size, nside=nside, nest=nest)) for size in SIZES: for nside in [1, 128]: results.append(run_single('nest2ring', bench_nest2ring, fast=fast, size=size, nside=nside)) for size in SIZES: for nside in [1, 128]: results.append(run_single('ring2nest', bench_ring2nest, fast=fast, size=size, nside=nside)) for nest in [True, False]: for size in SIZES: for nside in [1, 128]: results.append(run_single('get_interp_weights', bench_get_interp_weights, fast=fast, size=size, nside=nside, nest=nest)) return results def bench_report(results): """Print a report for given benchmark results to the console.""" table = Table(names=['function', 'nest', 'nside', 'size', 'time_healpy', 'time_self', 'ratio'], dtype=['S20', bool, int, int, float, float, float], masked=True) for row in results: table.add_row(row) table['time_self'].format = '10.7f' if HEALPY_INSTALLED: table['ratio'] = table['time_self'] / table['time_healpy'] table['time_healpy'].format = '10.7f' table['ratio'].format = '7.2f' table.pprint(max_lines=-1) def main(fast=False): """Run all benchmarks and print report to the console.""" print('Running benchmarks...\n') results = bench_run(fast=fast) bench_report(results) if __name__ == '__main__': main() ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.5029109 astropy-healpix-0.5/astropy_healpix/conftest.py0000644000077000000240000000231400000000000022040 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst import os import numpy as np from astropy.version import version 
as astropy_version if astropy_version < '3.0': from astropy.tests.pytest_plugins import * del pytest_report_header else: from pytest_astropy_header.display import PYTEST_HEADER_MODULES, TESTED_VERSIONS from astropy.tests.helper import enable_deprecations_as_exceptions enable_deprecations_as_exceptions() def pytest_configure(config): config.option.astropy_header = True PYTEST_HEADER_MODULES.pop('h5py', None) PYTEST_HEADER_MODULES.pop('Pandas', None) PYTEST_HEADER_MODULES['Astropy'] = 'astropy' PYTEST_HEADER_MODULES['healpy'] = 'healpy' from .version import version, astropy_helpers_version packagename = os.path.basename(os.path.dirname(__file__)) TESTED_VERSIONS[packagename] = version TESTED_VERSIONS['astropy_helpers'] = astropy_helpers_version # Set the Numpy print style to a fixed version to make doctest outputs # reproducible. try: np.set_printoptions(legacy='1.13') except TypeError: # On older versions of Numpy, the unrecognized 'legacy' option will # raise a TypeError. pass ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.5036943 astropy-healpix-0.5/astropy_healpix/core.py0000644000077000000240000004610500000000000021151 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst import math import numpy as np from astropy import units as u from astropy.coordinates import Longitude, Latitude from . 
import _core __all__ = [ 'nside_to_pixel_area', 'nside_to_pixel_resolution', 'pixel_resolution_to_nside', 'nside_to_npix', 'npix_to_nside', 'level_to_nside', 'nside_to_level', 'level_ipix_to_uniq', 'uniq_to_level_ipix', 'lonlat_to_healpix', 'healpix_to_lonlat', 'bilinear_interpolation_weights', 'interpolate_bilinear_lonlat', 'neighbours', ] def _restore_shape(*args, **kwargs): shape = kwargs['shape'] if shape: if len(args) > 1: return [arg.reshape(shape) for arg in args] else: return args[0].reshape(shape) else: if len(args) > 1: return [arg.item() for arg in args] else: return args[0].item() def _validate_order(order): # We also support upper-case, to support directly the values # ORDERING = {'RING', 'NESTED'} in FITS headers # This is currently undocumented in the docstrings. if order == 'nested' or order == 'NESTED': return 'nested' elif order == 'ring' or order == 'RING': return 'ring' else: raise ValueError("order must be 'nested' or 'ring'") def _validate_healpix_index(label, healpix_index, nside): npix = nside_to_npix(nside) if np.any((healpix_index < 0) | (healpix_index > npix - 1)): raise ValueError(f'{label} must be in the range [0:{npix}]') def _validate_offset(label, offset): offset = np.asarray(offset) if np.any((offset < 0) | (offset > 1)): raise ValueError(f'd{label} must be in the range [0:1]') def _validate_level(level): if np.any(level < 0): raise ValueError('level must be non-negative') def _validate_nside(nside): log_2_nside = np.round(np.log2(nside)) if not np.all(2 ** log_2_nside == nside): raise ValueError('nside must be a power of two') def _validate_npix(level, ipix): if not np.all(ipix < (3 << 2*(level + 1))): raise ValueError('ipix for a given level must be less than npix') def level_to_nside(level): """ Find the pixel dimensions of the top-level HEALPix tiles. This is given by ``nside = 2**level``.
Parameters ---------- level : int The resolution level Returns ------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. """ level = np.asarray(level, dtype=np.int64) _validate_level(level) return 2 ** level def nside_to_level(nside): """ Find the HEALPix level for a given nside. This is given by ``level = log2(nside)``. This function is the inverse of `level_to_nside`. Parameters ---------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. Must be a power of two. Returns ------- level : int The level of the HEALPix cells """ nside = np.asarray(nside, dtype=np.int64) _validate_nside(nside) return np.log2(nside).astype(np.int64) def uniq_to_level_ipix(uniq): """ Convert a HEALPix cell uniq number to its (level, ipix) equivalent. A uniq number is a 64-bit integer equal to ipix + 4*(4**level). Please read this `paper `_ for more details about uniq numbers. Parameters ---------- uniq : int The uniq number of a HEALPix cell. Returns ------- level, ipix: int, int The level and index of the HEALPix cell computed from ``uniq``. """ uniq = np.asarray(uniq, dtype=np.int64) level = (np.log2(uniq//4)) // 2 level = level.astype(np.int64) _validate_level(level) ipix = uniq - (1 << 2*(level + 1)) _validate_npix(level, ipix) return level, ipix def level_ipix_to_uniq(level, ipix): """ Convert a level and HEALPix index into a uniq number representing the cell. This function is the inverse of `uniq_to_level_ipix`. Parameters ---------- level : int The level of the HEALPix cell ipix : int The index of the HEALPix cell Returns ------- uniq : int The uniq number representing the HEALPix cell. """ level = np.asarray(level, dtype=np.int64) ipix = np.asarray(ipix, dtype=np.int64) _validate_level(level) _validate_npix(level, ipix) return ipix + (1 << 2*(level + 1)) def nside_to_pixel_area(nside): """ Find the area of HEALPix pixels given the pixel dimensions of one of the 12 'top-level' HEALPix tiles.
Parameters ---------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. Returns ------- pixel_area : :class:`~astropy.units.Quantity` The area of the HEALPix pixels """ nside = np.asanyarray(nside, dtype=np.int64) _validate_nside(nside) npix = 12 * nside * nside pixel_area = 4 * math.pi / npix * u.sr return pixel_area def nside_to_pixel_resolution(nside): """ Find the resolution of HEALPix pixels given the pixel dimensions of one of the 12 'top-level' HEALPix tiles. Parameters ---------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. Returns ------- resolution : :class:`~astropy.units.Quantity` The resolution of the HEALPix pixels See also -------- pixel_resolution_to_nside """ nside = np.asanyarray(nside, dtype=np.int64) _validate_nside(nside) return (nside_to_pixel_area(nside) ** 0.5).to(u.arcmin) def pixel_resolution_to_nside(resolution, round='nearest'): """Find closest HEALPix nside for a given angular resolution. This function is the inverse of `nside_to_pixel_resolution`, for the default rounding scheme of ``round='nearest'``. If you choose ``round='up'``, you'll get HEALPix pixels that have at least the requested resolution (usually a bit better due to rounding). Pixel resolution is defined as square root of pixel area. Parameters ---------- resolution : `~astropy.units.Quantity` Angular resolution round : {'up', 'nearest', 'down'} Which way to round Returns ------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. Always a power of 2. 
Examples -------- >>> from astropy import units as u >>> from astropy_healpix import pixel_resolution_to_nside >>> pixel_resolution_to_nside(13 * u.arcmin) 256 >>> pixel_resolution_to_nside(13 * u.arcmin, round='up') 512 """ resolution = resolution.to(u.rad).value pixel_area = resolution * resolution npix = 4 * math.pi / pixel_area nside = np.sqrt(npix / 12) # Now we have to round to the closest ``nside`` # Since ``nside`` must be a power of two, # we first compute the corresponding ``level = log2(nside)` # round the level and then go back to nside level = np.log2(nside) if round == 'up': level = np.ceil(level) elif round == 'nearest': level = np.round(level) elif round == 'down': level = np.floor(level) else: raise ValueError(f'Invalid value for round: {round!r}') # For very low requested resolution (i.e. large angle values), we # return ``level=0``, i.e. ``nside=1``, i.e. the lowest resolution # that exists with HEALPix level = np.clip(level.astype(int), 0, None) return level_to_nside(level) def nside_to_npix(nside): """ Find the number of pixels corresponding to a HEALPix resolution. Parameters ---------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. Returns ------- npix : int The number of pixels in the HEALPix map. """ nside = np.asanyarray(nside, dtype=np.int64) _validate_nside(nside) return 12 * nside ** 2 def npix_to_nside(npix): """ Find the number of pixels on the side of one of the 12 'top-level' HEALPix tiles given a total number of pixels. Parameters ---------- npix : int The number of pixels in the HEALPix map. Returns ------- nside : int The number of pixels on the side of one of the 12 'top-level' HEALPix tiles. 
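The rounding scheme above can be re-implemented for a single scalar value, which makes the docstring examples easy to check by hand. A sketch assuming the resolution is already in radians (the real function takes a `~astropy.units.Quantity` and is vectorized):

```python
import math

def pixel_resolution_to_nside_scalar(resolution_rad, round_mode='nearest'):
    # npix = 4*pi / pixel_area, nside = sqrt(npix / 12), level = log2(nside)
    npix = 4 * math.pi / (resolution_rad * resolution_rad)
    level = math.log2(math.sqrt(npix / 12))
    if round_mode == 'up':
        level = math.ceil(level)
    elif round_mode == 'nearest':
        level = round(level)
    elif round_mode == 'down':
        level = math.floor(level)
    else:
        raise ValueError(f'Invalid value for round: {round_mode!r}')
    return 2 ** max(int(level), 0)  # clip to level >= 0, i.e. nside >= 1

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians
assert pixel_resolution_to_nside_scalar(13 * ARCMIN) == 256
assert pixel_resolution_to_nside_scalar(13 * ARCMIN, 'up') == 512
```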
""" npix = np.asanyarray(npix, dtype=np.int64) if not np.all(npix % 12 == 0): raise ValueError('Number of pixels must be divisible by 12') square_root = np.sqrt(npix / 12) if not np.all(square_root ** 2 == npix / 12): raise ValueError('Number of pixels is not of the form 12 * nside ** 2') return np.round(square_root).astype(int) def healpix_to_lonlat(healpix_index, nside, dx=None, dy=None, order='ring'): """ Convert HEALPix indices (optionally with offsets) to longitudes/latitudes. If no offsets (``dx`` and ``dy``) are provided, the coordinates will default to those at the center of the HEALPix pixels. Parameters ---------- healpix_index : int or `~numpy.ndarray` HEALPix indices (as a scalar or array) nside : int or `~numpy.ndarray` Number of pixels along the side of each of the 12 top-level HEALPix tiles dx, dy : float or `~numpy.ndarray`, optional Offsets inside the HEALPix pixel, which must be in the range [0:1], where 0.5 is the center of the HEALPix pixels (as scalars or arrays) order : { 'nested' | 'ring' }, optional Order of HEALPix pixels Returns ------- lon : :class:`~astropy.coordinates.Longitude` The longitude values lat : :class:`~astropy.coordinates.Latitude` The latitude values """ _validate_nside(nside) if _validate_order(order) == 'ring': func = _core.healpix_ring_to_lonlat else: # _validate_order(order) == 'nested' func = _core.healpix_nested_to_lonlat if dx is None: dx = 0.5 else: _validate_offset('x', dx) if dy is None: dy = 0.5 else: _validate_offset('y', dy) nside = np.asarray(nside, dtype=np.intc) lon, lat = func(healpix_index, nside, dx, dy) lon = Longitude(lon, unit=u.rad, copy=False) lat = Latitude(lat, unit=u.rad, copy=False) return lon, lat def lonlat_to_healpix(lon, lat, nside, return_offsets=False, order='ring'): """ Convert longitudes/latitudes to HEALPix indices Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude values as :class:`~astropy.units.Quantity` instances with angle units. 
nside : int or `~numpy.ndarray` Number of pixels along the side of each of the 12 top-level HEALPix tiles order : { 'nested' | 'ring' } Order of HEALPix pixels return_offsets : bool, optional If `True`, the returned values are the HEALPix pixel indices as well as ``dx`` and ``dy``, the fractional positions inside the pixels. If `False` (the default), only the HEALPix pixel indices are returned. Returns ------- healpix_index : int or `~numpy.ndarray` The HEALPix indices dx, dy : `~numpy.ndarray` Offsets inside the HEALPix pixel in the range [0:1], where 0.5 is the center of the HEALPix pixels """ if _validate_order(order) == 'ring': func = _core.lonlat_to_healpix_ring else: # _validate_order(order) == 'nested' func = _core.lonlat_to_healpix_nested nside = np.asarray(nside, dtype=np.intc) lon = lon.to_value(u.rad) lat = lat.to_value(u.rad) healpix_index, dx, dy = func(lon, lat, nside) if return_offsets: return healpix_index, dx, dy else: return healpix_index def nested_to_ring(nested_index, nside): """ Convert a HEALPix 'nested' index to a HEALPix 'ring' index Parameters ---------- nested_index : int or `~numpy.ndarray` Healpix index using the 'nested' ordering nside : int or `~numpy.ndarray` Number of pixels along the side of each of the 12 top-level HEALPix tiles Returns ------- ring_index : int or `~numpy.ndarray` Healpix index using the 'ring' ordering """ nside = np.asarray(nside, dtype=np.intc) return _core.nested_to_ring(nested_index, nside) def ring_to_nested(ring_index, nside): """ Convert a HEALPix 'ring' index to a HEALPix 'nested' index Parameters ---------- ring_index : int or `~numpy.ndarray` Healpix index using the 'ring' ordering nside : int or `~numpy.ndarray` Number of pixels along the side of each of the 12 top-level HEALPix tiles Returns ------- nested_index : int or `~numpy.ndarray` Healpix index using the 'nested' ordering """ nside = np.asarray(nside, dtype=np.intc) return _core.ring_to_nested(ring_index, nside) def 
bilinear_interpolation_weights(lon, lat, nside, order='ring'): """ Get the four neighbours for each (lon, lat) position and the weight associated with each one for bilinear interpolation. Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude values as :class:`~astropy.units.Quantity` instances with angle units. nside : int Number of pixels along the side of each of the 12 top-level HEALPix tiles order : { 'nested' | 'ring' } Order of HEALPix pixels Returns ------- indices : `~numpy.ndarray` 2-D array with shape (4, N) giving the four indices to use for the interpolation weights : `~numpy.ndarray` 2-D array with shape (4, N) giving the four weights to use for the interpolation """ lon = lon.to_value(u.rad) lat = lat.to_value(u.rad) _validate_nside(nside) nside = np.asarray(nside, dtype=np.intc) result = _core.bilinear_interpolation_weights(lon, lat, nside) indices = np.stack(result[:4]) weights = np.stack(result[4:]) if _validate_order(order) == 'nested': indices = ring_to_nested(indices, nside) return indices, weights def interpolate_bilinear_lonlat(lon, lat, values, order='ring'): """ Interpolate values at specific longitudes/latitudes using bilinear interpolation Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude values as :class:`~astropy.units.Quantity` instances with angle units. values : `~numpy.ndarray` Array with the values in each HEALPix pixel. The first dimension should have length 12 * nside ** 2 (and nside is determined automatically from this). 
order : { 'nested' | 'ring' } Order of HEALPix pixels Returns ------- result : float `~numpy.ndarray` The interpolated values """ nside = npix_to_nside(values.shape[0]) indices, weights = bilinear_interpolation_weights(lon, lat, nside, order=order) values = values[indices] # At this point values has shape (N, M) where both N and M might be several # dimensions, and weights has shape (N,), so we need to transpose in order # to benefit from broadcasting, then transpose back so that the dimension # with length 4 is at the start again, ready for summing. result = (values.T * weights.T).T return result.sum(axis=0) def neighbours(healpix_index, nside, order='ring'): """ Find all the HEALPix pixels that are the neighbours of a HEALPix pixel Parameters ---------- healpix_index : `~numpy.ndarray` Array of HEALPix pixels nside : int Number of pixels along the side of each of the 12 top-level HEALPix tiles order : { 'nested' | 'ring' } Order of HEALPix pixels Returns ------- neigh : `~numpy.ndarray` Array giving the neighbours starting SW and rotating clockwise. This has one extra dimension compared to ``healpix_index`` - the first dimension - which is set to 8. For example if healpix_index has shape (2, 3), ``neigh`` has shape (8, 2, 3). """ _validate_nside(nside) nside = np.asarray(nside, dtype=np.intc) if _validate_order(order) == 'ring': func = _core.neighbours_ring else: # _validate_order(order) == 'nested' func = _core.neighbours_nested return np.stack(func(healpix_index, nside)) def healpix_cone_search(lon, lat, radius, nside, order='ring'): """ Find all the HEALPix pixels within a given radius of a longitude/latitude. Note that this returns all pixels that overlap, including partially, with the search cone. This function can only be used for a single lon/lat pair at a time, since different calls to the function may result in a different number of matches. 
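The weighted sum performed by ``interpolate_bilinear_lonlat`` above can be sketched in pure Python for a 1-D ``values`` array, where ``indices`` and ``weights`` have shape ``(4, N)`` as returned by ``bilinear_interpolation_weights``:

```python
def weighted_sum(values, indices, weights):
    # each output point j combines the values of its four neighbouring
    # pixels, weighted by the bilinear interpolation weights
    npts = len(indices[0])
    return [sum(weights[k][j] * values[indices[k][j]] for k in range(4))
            for j in range(npts)]

values = [0.0, 1.0, 2.0, 3.0, 4.0]
indices = [[0], [1], [2], [3]]              # shape (4, N) with N = 1
weights = [[0.25], [0.25], [0.25], [0.25]]  # weights sum to 1 per point
assert weighted_sum(values, indices, weights) == [1.5]
```

The transpose trick in the real implementation achieves the same result while broadcasting over arbitrary trailing dimensions of ``values``.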
Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude to search around radius : :class:`~astropy.units.Quantity` The search radius nside : int Number of pixels along the side of each of the 12 top-level HEALPix tiles order : { 'nested' | 'ring' } Order of HEALPix pixels Returns ------- healpix_index : `~numpy.ndarray` 1-D array with all the matching HEALPix pixel indices. """ lon = lon.to_value(u.deg) lat = lat.to_value(u.deg) radius = radius.to_value(u.deg) _validate_nside(nside) order = _validate_order(order) return _core.healpix_cone_search(lon, lat, radius, nside, order) def boundaries_lonlat(healpix_index, step, nside, order='ring'): """ Return the longitude and latitude of the edges of HEALPix pixels This returns the longitude and latitude of points along the edge of each HEALPIX pixel. The number of points returned for each pixel is ``4 * step``, so setting ``step`` to 1 returns just the corners. Parameters ---------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix pixels step : int The number of steps to take along each edge. nside : int Number of pixels along the side of each of the 12 top-level HEALPix tiles order : { 'nested' | 'ring' } Order of HEALPix pixels Returns ------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude, as 2-D arrays where the first dimension is the same as the ``healpix_index`` input, and the second dimension has size ``4 * step``. 
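The ``4 * step`` boundary points per pixel come from walking the four pixel edges in (dx, dy) offset space. A sketch of that offset construction, mirroring the ``hstack`` arrays used in the implementation:

```python
def edge_offsets(step):
    # frac is linspace(0, 1, step + 1)[:-1]; the four concatenated pieces
    # trace the four edges of the unit square of pixel offsets
    frac = [i / step for i in range(step)]
    dx = [1 - f for f in frac] + [0] * step + frac + [1] * step
    dy = [1] * step + [1 - f for f in frac] + [0] * step + frac
    return dx, dy

dx, dy = edge_offsets(1)
assert list(zip(dx, dy)) == [(1, 1), (0, 1), (0, 0), (1, 0)]  # just the corners
assert len(edge_offsets(3)[0]) == 4 * 3                       # 4 * step points
```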
""" healpix_index = np.asarray(healpix_index, dtype=np.int64) step = int(step) if step < 1: raise ValueError('step must be at least 1') # PERF: this could be optimized by writing a Cython routine to do this to # avoid allocating temporary arrays frac = np.linspace(0., 1., step + 1)[:-1] dx = np.hstack([1 - frac, np.repeat(0, step), frac, np.repeat(1, step)]) dy = np.hstack([np.repeat(1, step), 1 - frac, np.repeat(0, step), frac]) healpix_index, dx, dy = np.broadcast_arrays(healpix_index.reshape(-1, 1), dx, dy) lon, lat = healpix_to_lonlat(healpix_index.ravel(), nside, dx.ravel(), dy.ravel(), order=order) lon = lon.reshape(-1, 4 * step) lat = lat.reshape(-1, 4 * step) return lon, lat ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.5046349 astropy-healpix-0.5/astropy_healpix/healpy.py0000644000077000000240000001516600000000000021506 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This submodule provides a healpy-compatible interface. """ import numpy as np from astropy import units as u from astropy.coordinates.representation import CartesianRepresentation, UnitSphericalRepresentation from .core import (nside_to_pixel_resolution, nside_to_pixel_area, nside_to_npix, npix_to_nside, nested_to_ring, ring_to_nested, level_to_nside, lonlat_to_healpix, healpix_to_lonlat, boundaries_lonlat, bilinear_interpolation_weights, interpolate_bilinear_lonlat) RAD2DEG = 180 / np.pi PI_2 = np.pi / 2 __all__ = ['nside2resol', 'nside2pixarea', 'nside2npix', 'npix2nside', 'pix2ang', 'ang2pix', 'pix2vec', 'vec2pix', 'order2nside', 'nest2ring', 'ring2nest', 'boundaries', 'vec2ang', 'ang2vec', 'get_interp_weights', 'get_interp_val'] def _lonlat_to_healpy(lon, lat, lonlat=False): # We use in-place operations below to avoid making temporary arrays - this # is safe because the lon/lat arrays returned from healpix_to_lonlat are # new and not used elsewhere. 
if lonlat: return lon.to(u.deg).value, lat.to(u.deg).value else: lat, lon = lat.to(u.rad).value, lon.to(u.rad).value if np.isscalar(lon): return PI_2 - lat, lon else: lat = np.subtract(PI_2, lat, out=lat) return lat, lon def _healpy_to_lonlat(theta, phi, lonlat=False): # Unlike in _lonlat_to_healpy, we don't use in-place operations since we # don't want to modify theta and phi since the user may be using them # elsewhere. if lonlat: lon = np.asarray(theta) / RAD2DEG lat = np.asarray(phi) / RAD2DEG else: lat = PI_2 - np.asarray(theta) lon = np.asarray(phi) return u.Quantity(lon, u.rad, copy=False), u.Quantity(lat, u.rad, copy=False) def nside2resol(nside, arcmin=False): """Drop-in replacement for healpy `~healpy.pixelfunc.nside2resol`.""" resolution = nside_to_pixel_resolution(nside) if arcmin: return resolution.to(u.arcmin).value else: return resolution.to(u.rad).value def nside2pixarea(nside, degrees=False): """Drop-in replacement for healpy `~healpy.pixelfunc.nside2pixarea`.""" area = nside_to_pixel_area(nside) if degrees: return area.to(u.deg ** 2).value else: return area.to(u.sr).value def nside2npix(nside): """Drop-in replacement for healpy `~healpy.pixelfunc.nside2npix`.""" return nside_to_npix(nside) def npix2nside(npix): """Drop-in replacement for healpy `~healpy.pixelfunc.npix2nside`.""" return npix_to_nside(npix) def order2nside(order): """Drop-in replacement for healpy `~healpy.pixelfunc.order2nside`.""" return level_to_nside(order) def pix2ang(nside, ipix, nest=False, lonlat=False): """Drop-in replacement for healpy `~healpy.pixelfunc.pix2ang`.""" lon, lat = healpix_to_lonlat(ipix, nside, order='nested' if nest else 'ring') return _lonlat_to_healpy(lon, lat, lonlat=lonlat) def ang2pix(nside, theta, phi, nest=False, lonlat=False): """Drop-in replacement for healpy `~healpy.pixelfunc.ang2pix`.""" lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat) return lonlat_to_healpix(lon, lat, nside, order='nested' if nest else 'ring') def pix2vec(nside, ipix, 
nest=False): """Drop-in replacement for healpy `~healpy.pixelfunc.pix2vec`.""" lon, lat = healpix_to_lonlat(ipix, nside, order='nested' if nest else 'ring') return ang2vec(*_lonlat_to_healpy(lon, lat)) def vec2pix(nside, x, y, z, nest=False): """Drop-in replacement for healpy `~healpy.pixelfunc.vec2pix`.""" theta, phi = vec2ang(np.transpose([x, y, z])) # hp.vec2ang() returns raveled arrays, which are 1D. if np.isscalar(x): theta = theta.item() phi = phi.item() else: shape = np.shape(x) theta = theta.reshape(shape) phi = phi.reshape(shape) lon, lat = _healpy_to_lonlat(theta, phi) return lonlat_to_healpix(lon, lat, nside, order='nested' if nest else 'ring') def nest2ring(nside, ipix): """Drop-in replacement for healpy `~healpy.pixelfunc.nest2ring`.""" ipix = np.atleast_1d(ipix).astype(np.int64, copy=False) return nested_to_ring(ipix, nside) def ring2nest(nside, ipix): """Drop-in replacement for healpy `~healpy.pixelfunc.ring2nest`.""" ipix = np.atleast_1d(ipix).astype(np.int64, copy=False) return ring_to_nested(ipix, nside) def boundaries(nside, pix, step=1, nest=False): """Drop-in replacement for healpy `~healpy.boundaries`.""" pix = np.asarray(pix) if pix.ndim > 1: # For consistency with healpy we only support scalars or 1D arrays raise ValueError("Array has to be one dimensional") lon, lat = boundaries_lonlat(pix, step, nside, order='nested' if nest else 'ring') rep_sph = UnitSphericalRepresentation(lon, lat) rep_car = rep_sph.to_cartesian().xyz.value.swapaxes(0, 1) if rep_car.shape[0] == 1: return rep_car[0] else: return rep_car def vec2ang(vectors, lonlat=False): """Drop-in replacement for healpy `~healpy.pixelfunc.vec2ang`.""" x, y, z = vectors.transpose() rep_car = CartesianRepresentation(x, y, z) rep_sph = rep_car.represent_as(UnitSphericalRepresentation) return _lonlat_to_healpy(rep_sph.lon.ravel(), rep_sph.lat.ravel(), lonlat=lonlat) def ang2vec(theta, phi, lonlat=False): """Drop-in replacement for healpy `~healpy.pixelfunc.ang2vec`.""" lon, lat = 
_healpy_to_lonlat(theta, phi, lonlat=lonlat) rep_sph = UnitSphericalRepresentation(lon, lat) rep_car = rep_sph.represent_as(CartesianRepresentation) return rep_car.xyz.value def get_interp_weights(nside, theta, phi=None, nest=False, lonlat=False): """ Drop-in replacement for healpy `~healpy.pixelfunc.get_interp_weights`. Although note that the order of the weights and pixels may differ. """ # if phi is not given, theta is interpreted as pixel number if phi is None: theta, phi = pix2ang(nside, ipix=theta, nest=nest) lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat) return bilinear_interpolation_weights(lon, lat, nside, order='nested' if nest else 'ring') def get_interp_val(m, theta, phi, nest=False, lonlat=False): """ Drop-in replacement for healpy `~healpy.pixelfunc.get_interp_val`. """ lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat) return interpolate_bilinear_lonlat(lon, lat, m, order='nested' if nest else 'ring') astropy-healpix-0.5/astropy_healpix/high_level.py # Licensed under a 3-clause BSD style license - see LICENSE.rst import os from astropy.coordinates import SkyCoord from astropy.coordinates.representation import UnitSphericalRepresentation from .core import (nside_to_pixel_area, nside_to_pixel_resolution, nside_to_npix, npix_to_nside, healpix_to_lonlat, lonlat_to_healpix, bilinear_interpolation_weights, interpolate_bilinear_lonlat, ring_to_nested, nested_to_ring, healpix_cone_search, boundaries_lonlat, neighbours) from .utils import parse_input_healpix_data __all__ = ['HEALPix'] NO_FRAME_MESSAGE = """ No frame was specified when initializing HEALPix, so SkyCoord objects cannot be returned. Either specify a frame when initializing HEALPix or use the {0} method. 
""".replace(os.linesep, ' ').strip() class NoFrameError(Exception): def __init__(self, alternative_method): super().__init__(NO_FRAME_MESSAGE.format(alternative_method)) class HEALPix: """ A HEALPix pixellization. Parameters ---------- nside : int Number of pixels along the side of each of the 12 top-level HEALPix tiles order : { 'nested' | 'ring' } Order of HEALPix pixels frame : :class:`~astropy.coordinates.BaseCoordinateFrame`, optional The celestial coordinate frame of the pixellization. This can be ommitted, in which case the pixellization will not be attached to any particular celestial frame, and the methods ending in _skycoord will not work (but the _lonlat methods will still work and continue to return generic longitudes/latitudes). """ def __init__(self, nside=None, order='ring', frame=None): if nside is None: raise ValueError('nside has not been set') self.nside = nside self.order = order self.frame = frame @classmethod def from_header(cls, input_data, field=0, hdu_in=None, nested=None): """ Parameters ---------- input_data : str or `~astropy.io.fits.TableHDU` or `~astropy.io.fits.BinTableHDU` or tuple The input data to reproject. This can be: * The name of a HEALPIX FITS file * A `~astropy.io.fits.TableHDU` or `~astropy.io.fits.BinTableHDU` instance * A tuple where the first element is a `~numpy.ndarray` and the second element is a `~astropy.coordinates.BaseCoordinateFrame` instance or a string alias for a coordinate frame. hdu_in : int or str, optional If ``input_data`` is a FITS file, specifies the HDU to use. (the default HDU for HEALPIX data is 1, unlike with image files where it is generally 0) nested : bool, optional The order of the healpix_data, either nested (True) or ring (False). If a FITS file is passed in, this is determined from the header. Returns ------- healpix : `~astropy_healpix.HEALPix` A HEALPix pixellization corresponding to the input data. 
""" array_in, frame, nested = parse_input_healpix_data( input_data, field=field, hdu_in=hdu_in, nested=nested) nside = npix_to_nside(len(array_in)) order = 'nested' if nested else 'ring' return cls(nside=nside, order=order, frame=frame) @property def pixel_area(self): """ The area of a single HEALPix pixel. """ return nside_to_pixel_area(self.nside) @property def pixel_resolution(self): """ The resolution of a single HEALPix pixel. """ return nside_to_pixel_resolution(self.nside) @property def npix(self): """ The number of pixels in the pixellization of the sphere. """ return nside_to_npix(self.nside) def healpix_to_lonlat(self, healpix_index, dx=None, dy=None): """ Convert HEALPix indices (optionally with offsets) to longitudes/latitudes Parameters ---------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix indices dx, dy : `~numpy.ndarray`, optional 1-D arrays of offsets inside the HEALPix pixel, which must be in the range [0:1] (0.5 is the center of the HEALPix pixels). If not specified, the position at the center of the pixel is used. Returns ------- lon : :class:`~astropy.coordinates.Longitude` The longitude values lat : :class:`~astropy.coordinates.Latitude` The latitude values """ return healpix_to_lonlat(healpix_index, self.nside, dx=dx, dy=dy, order=self.order) def lonlat_to_healpix(self, lon, lat, return_offsets=False): """ Convert longitudes/latitudes to HEALPix indices (optionally with offsets) Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude values as :class:`~astropy.units.Quantity` instances with angle units. return_offsets : bool If `True`, the returned values are the HEALPix pixel as well as ``dx`` and ``dy``, the fractional positions inside the pixel. If `False` (the default), only the HEALPix pixel is returned. 
Returns ------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix indices dx, dy : `~numpy.ndarray` 1-D arrays of offsets inside the HEALPix pixel in the range [0:1] (0.5 is the center of the HEALPix pixels). This is returned if ``return_offsets`` is `True`. """ return lonlat_to_healpix(lon, lat, self.nside, return_offsets=return_offsets, order=self.order) def nested_to_ring(self, nested_index): """ Convert a healpix 'nested' index to a healpix 'ring' index Parameters ---------- nested_index : `~numpy.ndarray` Healpix index using the 'nested' ordering Returns ------- ring_index : `~numpy.ndarray` Healpix index using the 'ring' ordering """ return nested_to_ring(nested_index, self.nside) def ring_to_nested(self, ring_index): """ Convert a healpix 'ring' index to a healpix 'nested' index Parameters ---------- ring_index : `~numpy.ndarray` Healpix index using the 'ring' ordering Returns ------- nested_index : `~numpy.ndarray` Healpix index using the 'nested' ordering """ return ring_to_nested(ring_index, self.nside) def bilinear_interpolation_weights(self, lon, lat): """ Get the four neighbours for each (lon, lat) position and the weight associated with each one for bilinear interpolation. Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude values as :class:`~astropy.units.Quantity` instances with angle units. Returns ------- indices : `~numpy.ndarray` 2-D array with shape (4, N) giving the four indices to use for the interpolation weights : `~numpy.ndarray` 2-D array with shape (4, N) giving the four weights to use for the interpolation """ return bilinear_interpolation_weights(lon, lat, self.nside, order=self.order) def interpolate_bilinear_lonlat(self, lon, lat, values): """ Interpolate values at specific longitudes/latitudes using bilinear interpolation If a position does not have four neighbours, this currently returns NaN. 
Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude values as :class:`~astropy.units.Quantity` instances with angle units. values : `~numpy.ndarray` 1-D array with the values in each HEALPix pixel. This must have a length of the form 12 * nside ** 2 (and nside is determined automatically from this). Returns ------- result : `~numpy.ndarray` 1-D array of interpolated values """ if len(values) != self.npix: raise ValueError('values must be an array of length {} (got {})'.format(self.npix, len(values))) return interpolate_bilinear_lonlat(lon, lat, values, order=self.order) def cone_search_lonlat(self, lon, lat, radius): """ Find all the HEALPix pixels within a given radius of a longitude/latitude. Note that this returns all pixels that overlap, including partially, with the search cone. This function can only be used for a single lon/lat pair at a time, since different calls to the function may result in a different number of matches. Parameters ---------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude to search around radius : :class:`~astropy.units.Quantity` The search radius Returns ------- healpix_index : `~numpy.ndarray` 1-D array with all the matching HEALPix pixel indices. """ if not lon.isscalar or not lat.isscalar or not radius.isscalar: raise ValueError('The longitude, latitude and radius must be ' 'scalar Quantity objects') return healpix_cone_search(lon, lat, radius, self.nside, order=self.order) def boundaries_lonlat(self, healpix_index, step): """ Return the longitude and latitude of the edges of HEALPix pixels This returns the longitude and latitude of points along the edge of each HEALPIX pixel. The number of points returned for each pixel is ``4 * step``, so setting ``step`` to 1 returns just the corners. Parameters ---------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix pixels step : int The number of steps to take along each edge. 
Returns ------- lon, lat : :class:`~astropy.units.Quantity` The longitude and latitude, as 2-D arrays where the first dimension is the same as the ``healpix_index`` input, and the second dimension has size ``4 * step``. """ return boundaries_lonlat(healpix_index, step, self.nside, order=self.order) def neighbours(self, healpix_index): """ Find all the HEALPix pixels that are the neighbours of a HEALPix pixel Parameters ---------- healpix_index : `~numpy.ndarray` Array of HEALPix pixels Returns ------- neigh : `~numpy.ndarray` Array giving the neighbours starting SW and rotating clockwise. This has one extra dimension compared to ``healpix_index`` - the first dimension - which is set to 8. For example if healpix_index has shape (2, 3), ``neigh`` has shape (8, 2, 3). """ return neighbours(healpix_index, self.nside, order=self.order) def healpix_to_skycoord(self, healpix_index, dx=None, dy=None): """ Convert HEALPix indices (optionally with offsets) to celestial coordinates. Note that this method requires that a celestial frame was specified when initializing HEALPix. If you don't know or need the celestial frame, you can instead use :meth:`~astropy_healpix.HEALPix.healpix_to_lonlat`. Parameters ---------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix indices dx, dy : `~numpy.ndarray`, optional 1-D arrays of offsets inside the HEALPix pixel, which must be in the range [0:1] (0.5 is the center of the HEALPix pixels). If not specified, the position at the center of the pixel is used. 
Returns ------- coord : :class:`~astropy.coordinates.SkyCoord` The resulting celestial coordinates """ if self.frame is None: raise NoFrameError("healpix_to_skycoord") lon, lat = self.healpix_to_lonlat(healpix_index, dx=dx, dy=dy) representation = UnitSphericalRepresentation(lon, lat, copy=False) return SkyCoord(self.frame.realize_frame(representation)) def skycoord_to_healpix(self, skycoord, return_offsets=False): """ Convert celestial coordinates to HEALPix indices (optionally with offsets). Note that this method requires that a celestial frame was specified when initializing HEALPix. If you don't know or need the celestial frame, you can instead use :meth:`~astropy_healpix.HEALPix.lonlat_to_healpix`. Parameters ---------- skycoord : :class:`~astropy.coordinates.SkyCoord` The celestial coordinates to convert return_offsets : bool If `True`, the returned values are the HEALPix pixel as well as ``dx`` and ``dy``, the fractional positions inside the pixel. If `False` (the default), only the HEALPix pixel is returned. Returns ------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix indices dx, dy : `~numpy.ndarray` 1-D arrays of offsets inside the HEALPix pixel in the range [0:1] (0.5 is the center of the HEALPix pixels). This is returned if ``return_offsets`` is `True`. """ if self.frame is None: raise NoFrameError("skycoord_to_healpix") skycoord = skycoord.transform_to(self.frame) representation = skycoord.represent_as(UnitSphericalRepresentation) lon, lat = representation.lon, representation.lat return self.lonlat_to_healpix(lon, lat, return_offsets=return_offsets) def interpolate_bilinear_skycoord(self, skycoord, values): """ Interpolate values at specific celestial coordinates using bilinear interpolation. If a position does not have four neighbours, this currently returns NaN. Note that this method requires that a celestial frame was specified when initializing HEALPix. 
If you don't know or need the celestial frame, you can instead use :meth:`~astropy_healpix.HEALPix.interpolate_bilinear_lonlat`. Parameters ---------- skycoord : :class:`~astropy.coordinates.SkyCoord` The celestial coordinates at which to interpolate values : `~numpy.ndarray` 1-D array with the values in each HEALPix pixel. This must have a length of the form 12 * nside ** 2 (and nside is determined automatically from this). Returns ------- result : `~numpy.ndarray` 1-D array of interpolated values """ if self.frame is None: raise NoFrameError("interpolate_bilinear_skycoord") skycoord = skycoord.transform_to(self.frame) representation = skycoord.represent_as(UnitSphericalRepresentation) lon, lat = representation.lon, representation.lat return self.interpolate_bilinear_lonlat(lon, lat, values) def cone_search_skycoord(self, skycoord, radius): """ Find all the HEALPix pixels within a given radius of a celestial position. Note that this returns all pixels that overlap, including partially, with the search cone. This function can only be used for a single celestial position at a time, since different calls to the function may result in a different number of matches. This method requires that a celestial frame was specified when initializing HEALPix. If you don't know or need the celestial frame, you can instead use :meth:`~astropy_healpix.HEALPix.cone_search_lonlat`. Parameters ---------- skycoord : :class:`~astropy.coordinates.SkyCoord` The celestial coordinates to use for the cone search radius : :class:`~astropy.units.Quantity` The search radius Returns ------- healpix_index : `~numpy.ndarray` 1-D array with all the matching HEALPix pixel indices. 
""" if self.frame is None: raise NoFrameError("cone_search_skycoord") skycoord = skycoord.transform_to(self.frame) representation = skycoord.represent_as(UnitSphericalRepresentation) lon, lat = representation.lon, representation.lat return self.cone_search_lonlat(lon, lat, radius) def boundaries_skycoord(self, healpix_index, step): """ Return the celestial coordinates of the edges of HEALPix pixels This returns the celestial coordinates of points along the edge of each HEALPIX pixel. The number of points returned for each pixel is ``4 * step``, so setting ``step`` to 1 returns just the corners. This method requires that a celestial frame was specified when initializing HEALPix. If you don't know or need the celestial frame, you can instead use :meth:`~astropy_healpix.HEALPix.boundaries_lonlat`. Parameters ---------- healpix_index : `~numpy.ndarray` 1-D array of HEALPix pixels step : int The number of steps to take along each edge. Returns ------- skycoord : :class:`~astropy.coordinates.SkyCoord` The celestial coordinates of the HEALPix pixel boundaries """ if self.frame is None: raise NoFrameError("boundaries_skycoord") lon, lat = self.boundaries_lonlat(healpix_index, step) representation = UnitSphericalRepresentation(lon, lat, copy=False) return SkyCoord(self.frame.realize_frame(representation)) ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1556095356.926286 astropy-healpix-0.5/astropy_healpix/interpolation.c0000644000077000000240000001554000000000000022701 0ustar00tomstaff00000000000000#include #include "interpolation.h" #include "healpix.h" // Old versions of MSVC do not support C99 and therefore // do not define NAN in math.h. 
#ifndef NAN static const union { unsigned long integer; float value; } type_punned_nan = {0xFFFFFFFFFFFFFFFFul}; #define NAN (type_punned_nan.value) #endif #ifndef M_PI #define M_PI 3.14159265358979323846 #endif void interpolate_weights(double lon, double lat, int64_t *ring_indices, double *weights, int Nside) { // Given a longitude and a latitude, Nside, and pre-allocated arrays of 4 // elements ring_indices and weights, find the ring index of the four nearest // neighbours and the weights to use for each neighbour to interpolate. int64_t xy_index, npix; int64_t ring1, ring2, ring3, ring4; double lon1, lat1, lon2, lat2; double lon3, lat3, lon4, lat4; double xfrac1, xfrac2, yfrac, lon_frac; int ring_number, longitude_index, n_in_ring; // Find the xy index of the pixel in which the coordinates fall xy_index = radec_to_healpixl(lon, lat, Nside); // Find the lon/lat of the center of that pixel healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon1, &lat1); // Take into account possible wrapping so that the pixel longitude/latitude // are close to the requested longitude/latitude if (lon - lon1 > M_PI) lon1 += 2 * M_PI; if (lon1 - lon > M_PI) lon1 -= 2 * M_PI; // Convert to a ring index and decompose into ring number and longitude index ring1 = healpixl_xy_to_ring(xy_index, Nside); if (ring1 < 0) { int i; for (i = 0; i < 4; i ++) { ring_indices[i] = -1; weights[i] = NAN; } return; } healpixl_decompose_ring(ring1, Nside, &ring_number, &longitude_index); // Figure out how many pixels are in the ring if (ring_number < Nside) { n_in_ring = 4 * ring_number; } else if (ring_number < 3 * Nside) { n_in_ring = 4 * Nside; } else { n_in_ring = (int)(4 * (4 * (int64_t)Nside - (int64_t)ring_number)); } // We now want to find the next index in the ring so that the point to // interpolate is between the two. First we check what direction to look in by // finding the longitude/latitude of the center of the HEALPix pixel. 
if (lon < lon1) { // Go to the left if (longitude_index == 0) { ring2 = ring1 + n_in_ring - 1; } else { ring2 = ring1 - 1; } } else { // Go to the right if (longitude_index == n_in_ring - 1) { ring2 = ring1 - n_in_ring + 1; } else { ring2 = ring1 + 1; } } // Find the lon/lat of the new pixel xy_index = healpixl_ring_to_xy(ring2, Nside); healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon2, &lat2); // Take into account possible wrapping so that the pixel longitude/latitude // are close to the requested longitude/latitude if (lon - lon2 > M_PI) lon2 += 2 * M_PI; if (lon2 - lon > M_PI) lon2 -= 2 * M_PI; // Now check whether we are moving up or down in terms of ring index if (lat > lat1) { // Move up (0 index is at the top) ring_number -= 1; } else { // Move down ring_number += 1; } if (ring_number > 0 && ring_number < 4 * Nside) { // Now figure out again how many pixels are in the ring if (ring_number < Nside) { n_in_ring = 4 * ring_number; } else if (ring_number < 3 * Nside) { n_in_ring = 4 * Nside; } else { n_in_ring = (int)(4 * (4 * (int64_t)Nside - (int64_t)ring_number)); } // Now determine the longitude index in which the requested longitude falls. // In all regions, the longitude elements are spaced by 360 / n_in_ring. For // convenience we convert the longitude index so that the spacing is 1. lon_frac = lon * n_in_ring / (2 * M_PI); // In the equatorial region, the first ring starts at 0.5 and the second at // 0 (in lon_frac space). The ring number is 1-based and the first ring in // the equatorial region is even. 
In this ring we can simply take // int(lon_frac) to get the longitude index but in the odd rings we need to // adjust lon_frac if (n_in_ring == 4 * Nside && ring_number % 2 == 1) { // Equatorial region lon_frac += 0.5; } // Find the longitude index of the closest pixel longitude_index = (int)lon_frac; if (longitude_index == n_in_ring) { longitude_index -= 1; } // Find the longitude/latitude and ring index of this pixel ring3 = healpixl_compose_ring(ring_number, longitude_index, Nside); xy_index = healpixl_ring_to_xy(ring3, Nside); healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon3, &lat3); // Take into account possible wrapping so that the pixel longitude/latitude // are close to the requested longitude/latitude if (lon - lon3 > M_PI) lon3 += 2 * M_PI; if (lon3 - lon > M_PI) lon3 -= 2 * M_PI; // Finally we can find the fourth pixel as before if (lon < lon3) { // Go to the left if (longitude_index == 0) { ring4 = ring3 + n_in_ring - 1; } else { ring4 = ring3 - 1; } } else { // Go to the right if (longitude_index == n_in_ring - 1) { ring4 = ring3 - n_in_ring + 1; } else { ring4 = ring3 + 1; } } xy_index = healpixl_ring_to_xy(ring4, Nside); healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon4, &lat4); // Take into account possible wrapping so that the pixel longitude/latitude // are close to the requested longitude/latitude if (lon - lon4 > M_PI) lon4 += 2 * M_PI; if (lon4 - lon > M_PI) lon4 -= 2 * M_PI; // Determine the interpolation weights xfrac1 = (lon - lon1) / (lon2 - lon1); xfrac2 = (lon - lon3) / (lon4 - lon3); yfrac = (lat - lat1) / (lat3 - lat1); weights[0] = (1 - xfrac1) * (1 - yfrac); weights[1] = xfrac1 * (1 - yfrac); weights[2] = (1 - xfrac2) * yfrac; weights[3] = xfrac2 * yfrac; } else { // In the case where we are inside the four top/bottom-most // values, we effectively place a value at the pole that // is the average of the four values, and the interpolation // is the weighted average of this polar value and the // value interpolated along the ring. 
xfrac1 = (lon - lon1) / (lon2 - lon1); yfrac = (lat - lat1) / (0.5 * M_PI - lat1); if (ring_number == 0) { ring3 = (ring1 + 2) % 4; ring4 = (ring2 + 2) % 4; yfrac = (lat - lat1) / (0.5 * M_PI - lat1); } else { npix = 12 * (int64_t)Nside * (int64_t)Nside; ring3 = ((ring1 - (npix - 4)) + 2) % 4 + npix - 4; ring4 = ((ring2 - (npix - 4)) + 2) % 4 + npix - 4; yfrac = (lat - lat1) / (-0.5 * M_PI - lat1); } weights[0] = (1 - xfrac1) * (1 - yfrac) + 0.25 * yfrac; weights[1] = xfrac1 * (1 - yfrac) + 0.25 * yfrac; weights[2] = 0.25 * yfrac; weights[3] = 0.25 * yfrac; } ring_indices[0] = ring1; ring_indices[1] = ring2; ring_indices[2] = ring3; ring_indices[3] = ring4; }
astropy-healpix-0.5/astropy_healpix/interpolation.h
#ifndef ASTROPY_HEALPIX_INTERPOLATION_INCL #define ASTROPY_HEALPIX_INTERPOLATION_INCL #ifdef _MSC_VER #if _MSC_VER >= 1600 #include <stdint.h> #else #include #endif #else #include <stdint.h> #endif void interpolate_weights(double lon, double lat, int64_t *ring_indices, double *weights, int Nside); #endif
astropy-healpix-0.5/astropy_healpix/setup_package.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst import os from distutils.core import Extension HEALPIX_ROOT = os.path.relpath(os.path.dirname(__file__)) C_FILES = ['bl.c', 'healpix-utils.c', 'healpix.c', 'mathutil.c', 'permutedsort.c', 'qsort_reentrant.c', 'starutil.c'] C_DIR = os.path.join('cextern', 'astrometry.net') C_DIRS = ['numpy', C_DIR, HEALPIX_ROOT, os.path.join('cextern', 'numpy')] def get_extensions(): libraries = [] sources = [os.path.join(C_DIR, filename) for filename in C_FILES] sources.append(os.path.join(HEALPIX_ROOT,
'interpolation.c')) sources.append(os.path.join(HEALPIX_ROOT, '_core.c')) extension = Extension( name="astropy_healpix._core", sources=sources, include_dirs=C_DIRS, libraries=libraries, language="c", extra_compile_args=['-O2']) return [extension]
astropy-healpix-0.5/astropy_healpix/tests/
astropy-healpix-0.5/astropy_healpix/tests/__init__.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
astropy-healpix-0.5/astropy_healpix/tests/coveragerc
[run] source = {packagename} omit = {packagename}/_astropy_init* {packagename}/conftest* {packagename}/cython_version* {packagename}/setup_package* {packagename}/*/setup_package* {packagename}/*/*/setup_package* {packagename}/tests/* {packagename}/*/tests/* {packagename}/*/*/tests/* {packagename}/version* [report] exclude_lines = # Have to re-enable the standard pragma pragma: no cover # Don't complain about packages we have installed except ImportError # Don't complain if tests don't hit assertions raise AssertionError raise NotImplementedError # Don't complain about script hooks def main\(.*\): # Ignore branches that don't pertain to this version of Python pragma: py{ignore_python_version}
astropy-healpix-0.5/astropy_healpix/tests/setup_package.py
# 
Licensed under a 3-clause BSD style license - see LICENSE.rst def get_package_data(): return { _ASTROPY_PACKAGE_NAME_ + '.tests': ['coveragerc']}
astropy-healpix-0.5/astropy_healpix/tests/test_bench.py
from ..bench import main def test_bench(): main(fast=True)
astropy-healpix-0.5/astropy_healpix/tests/test_core.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst from itertools import product import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal from astropy import units as u from astropy.coordinates import Longitude, Latitude from ..core import (nside_to_pixel_area, nside_to_pixel_resolution, pixel_resolution_to_nside, nside_to_npix, npix_to_nside, healpix_to_lonlat, lonlat_to_healpix, interpolate_bilinear_lonlat, neighbours, healpix_cone_search, boundaries_lonlat, level_to_nside, nside_to_level, nested_to_ring, ring_to_nested, level_ipix_to_uniq, uniq_to_level_ipix, bilinear_interpolation_weights) def test_level_to_nside(): assert level_to_nside(5) == 2 ** 5 with pytest.raises(ValueError) as exc: level_to_nside(-1) assert exc.value.args[0] == 'level must be positive' def test_nside_to_level(): assert nside_to_level(1024) == 10 with pytest.raises(ValueError) as exc: nside_to_level(511) assert exc.value.args[0] == 'nside must be a power of two' def test_level_ipix_to_uniq(): assert 11 + 4*4**0 == level_ipix_to_uniq(0, 11) assert 62540 + 4*4**15 == level_ipix_to_uniq(15, 62540) with pytest.raises(ValueError) as exc: level_ipix_to_uniq(1, 49) assert exc.value.args[0] == 'ipix for a specific level must be inferior to npix'
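The ``uniq`` convention exercised by the test above packs a ``(level, ipix)`` pair into a single integer via ``uniq = ipix + 4 * 4**level``. A minimal pure-Python sketch of the round trip (the package's real ``level_ipix_to_uniq`` / ``uniq_to_level_ipix`` are vectorized over NumPy arrays; this scalar version is only an illustration):

```python
def level_ipix_to_uniq(level, ipix):
    # npix at a given level is 12 * 4**level; ipix must lie below it.
    if not 0 <= ipix < 12 * 4 ** level:
        raise ValueError('ipix for a specific level must be inferior to npix')
    return ipix + 4 * 4 ** level

def uniq_to_level_ipix(uniq):
    # uniq lies in [4 * 4**level, 16 * 4**level), so the level can be
    # read off directly from the bit length of uniq.
    level = (uniq.bit_length() - 1) // 2 - 1
    return level, uniq - 4 * 4 ** level
```

For example, ``level_ipix_to_uniq(0, 11)`` gives ``15``, matching the ``11 + 4*4**0`` expectation in the test.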
@pytest.mark.parametrize("level", [ 0, 5, 10, 15, 20, 22, 25, 26, 27, 28, 29 ]) def test_uniq_to_level_ipix(level): npix = 3 << 2*(level + 1) # Take 10 pixel indices between 0 and npix - 1 size = 10 ipix = np.arange(size, dtype=np.int64) * (npix // size) level = np.ones(size) * level level_res, ipix_res = uniq_to_level_ipix(level_ipix_to_uniq(level, ipix)) assert np.all(level_res == level) & np.all(ipix_res == ipix) def test_nside_to_pixel_area(): resolution = nside_to_pixel_area(256) assert_allclose(resolution.value, 1.5978966540475428e-05) assert resolution.unit == u.sr def test_nside_to_pixel_resolution(): resolution = nside_to_pixel_resolution(256) assert_allclose(resolution.value, 13.741945647269624) assert resolution.unit == u.arcmin def test_pixel_resolution_to_nside(): # Check the different rounding options nside = pixel_resolution_to_nside(13 * u.arcmin, round='nearest') assert nside == 256 nside = pixel_resolution_to_nside(13 * u.arcmin, round='up') assert nside == 512 nside = pixel_resolution_to_nside(13 * u.arcmin, round='down') assert nside == 256 # Check that it works with arrays nside = pixel_resolution_to_nside([1e3, 10, 1e-3] * u.deg, round='nearest') assert_equal(nside, [1, 8, 65536]) with pytest.raises(ValueError) as exc: pixel_resolution_to_nside(13 * u.arcmin, round='peaches') assert exc.value.args[0] == "Invalid value for round: 'peaches'" with pytest.raises(AttributeError) as exc: pixel_resolution_to_nside(13) assert exc.value.args[0] == "'int' object has no attribute 'to'" def test_nside_to_npix(): npix = nside_to_npix(4) assert npix == 192 npix = nside_to_npix([4, 4]) assert_equal(npix, 192) with pytest.raises(ValueError) as exc: nside_to_npix(15) assert exc.value.args[0] == 'nside must be a power of two' def test_npix_to_nside(): nside = npix_to_nside(192) assert nside == 4 nside = npix_to_nside([192, 192]) assert_equal(nside, 4) with pytest.raises(ValueError) as exc: npix_to_nside(7) assert exc.value.args[0] == 'Number of pixels must be 
divisible by 12' with pytest.raises(ValueError) as exc: npix_to_nside(12 * 8 * 9) assert exc.value.args[0] == 'Number of pixels is not of the form 12 * nside ** 2' # For the following tests, the numerical accuracy of this function is already # tested in test_cython_api.py, so we focus here on functionality specific to # the Python functions. @pytest.mark.parametrize('order', ['nested', 'ring']) def test_healpix_to_lonlat(order): lon, lat = healpix_to_lonlat([1, 2, 3], 4, order=order) assert isinstance(lon, Longitude) assert isinstance(lat, Latitude) index = lonlat_to_healpix(lon, lat, 4, order=order) assert_equal(index, [1, 2, 3]) lon, lat = healpix_to_lonlat([1, 2, 3], 4, dx=[0.1, 0.2, 0.3], dy=[0.5, 0.4, 0.7], order=order) assert isinstance(lon, Longitude) assert isinstance(lat, Latitude) index, dx, dy = lonlat_to_healpix(lon, lat, 4, order=order, return_offsets=True) assert_equal(index, [1, 2, 3]) assert_allclose(dx, [0.1, 0.2, 0.3]) assert_allclose(dy, [0.5, 0.4, 0.7]) def test_healpix_to_lonlat_invalid(): dx = [0.1, 0.4, 0.9] dy = [0.4, 0.3, 0.2] with pytest.warns(RuntimeWarning, match='invalid value'): lon, lat = healpix_to_lonlat([-1, 2, 3], 4) with pytest.warns(RuntimeWarning, match='invalid value'): lon, lat = healpix_to_lonlat([192, 2, 3], 4) with pytest.raises(ValueError) as exc: lon, lat = healpix_to_lonlat([1, 2, 3], 5) assert exc.value.args[0] == 'nside must be a power of two' with pytest.raises(ValueError) as exc: lon, lat = healpix_to_lonlat([1, 2, 3], 4, order='banana') assert exc.value.args[0] == "order must be 'nested' or 'ring'" with pytest.raises(ValueError) as exc: lon, lat = healpix_to_lonlat([1, 2, 3], 4, dx=[-0.1, 0.4, 0.5], dy=dy) assert exc.value.args[0] == 'dx must be in the range [0:1]' with pytest.raises(ValueError) as exc: lon, lat = healpix_to_lonlat([1, 2, 3], 4, dx=dx, dy=[-0.1, 0.4, 0.5]) assert exc.value.args[0] == 'dy must be in the range [0:1]' def test_healpix_to_lonlat_shape(): lon, lat = healpix_to_lonlat(2, 8) assert 
lon.isscalar and lat.isscalar lon, lat = healpix_to_lonlat([[1, 2, 3], [3, 4, 4]], 8) assert lon.shape == (2, 3) and lat.shape == (2, 3) lon, lat = healpix_to_lonlat([[1], [2], [3]], nside=8, dx=0.2, dy=[[0.1, 0.3]]) assert lon.shape == (3, 2) and lat.shape == (3, 2) def test_lonlat_to_healpix_shape(): healpix_index = lonlat_to_healpix(2 * u.deg, 3 * u.deg, 8) assert np.can_cast(healpix_index, np.int64) lon, lat = np.ones((2, 4)) * u.deg, np.zeros((2, 4)) * u.deg healpix_index = lonlat_to_healpix(lon, lat, 8) assert healpix_index.shape == (2, 4) healpix_index, dx, dy = lonlat_to_healpix(2 * u.deg, 3 * u.deg, 8, return_offsets=True) assert np.can_cast(healpix_index, np.int64) assert isinstance(dx, float) assert isinstance(dy, float) lon, lat = np.ones((2, 4)) * u.deg, np.zeros((2, 4)) * u.deg healpix_index, dx, dy = lonlat_to_healpix(lon, lat, 8, return_offsets=True) assert healpix_index.shape == (2, 4) assert dx.shape == (2, 4) assert dy.shape == (2, 4) @pytest.mark.parametrize('function', [nested_to_ring, ring_to_nested]) def test_nested_ring_shape(function): index = function(1, 8) assert np.can_cast(index, np.int64) index = function([[1, 2, 3], [2, 3, 4]], 8) assert index.shape == (2, 3) @pytest.mark.parametrize('order', ['nested', 'ring']) def test_bilinear_interpolation_weights(order): indices, weights = bilinear_interpolation_weights(100 * u.deg, 10 * u.deg, nside=4, order=order) if order == 'nested': indices = nested_to_ring(indices, nside=4) assert_equal(indices, [76, 77, 60, 59]) assert_allclose(weights, [0.532723, 0.426179, 0.038815, 0.002283], atol=1e-6) def test_bilinear_interpolation_weights_invalid(): with pytest.raises(ValueError) as exc: bilinear_interpolation_weights(1 * u.deg, 2 * u.deg, nside=5) assert exc.value.args[0] == 'nside must be a power of two' with pytest.raises(ValueError) as exc: bilinear_interpolation_weights(3 * u.deg, 4 * u.deg, nside=4, order='banana') assert exc.value.args[0] == "order must be 'nested' or 'ring'" def 
test_bilinear_interpolation_weights_shape(): indices, weights = bilinear_interpolation_weights(3 * u.deg, 4 * u.deg, nside=8) assert indices.shape == (4,) assert weights.shape == (4,) indices, weights = bilinear_interpolation_weights([[1, 2, 3], [2, 3, 4]] * u.deg, [[1, 2, 3], [2, 3, 4]] * u.deg, nside=8) assert indices.shape == (4, 2, 3) assert weights.shape == (4, 2, 3) @pytest.mark.parametrize('order', ['nested', 'ring']) def test_interpolate_bilinear_lonlat(order): values = np.ones(192) * 3 result = interpolate_bilinear_lonlat([1, 3, 4] * u.deg, [3, 2, 6] * u.deg, values, order=order) assert_allclose(result, [3, 3, 3]) def test_interpolate_bilinear_invalid(): values = np.ones(133) with pytest.raises(ValueError) as exc: interpolate_bilinear_lonlat([1, 3, 4] * u.deg, [3, 2, 6] * u.deg, values) assert exc.value.args[0] == 'Number of pixels must be divisible by 12' values = np.ones(192) with pytest.raises(ValueError) as exc: interpolate_bilinear_lonlat([1, 3, 4] * u.deg, [3, 2, 6] * u.deg, values, order='banana') assert exc.value.args[0] == "order must be 'nested' or 'ring'" result = interpolate_bilinear_lonlat([0, np.nan] * u.deg, [0, np.nan] * u.deg, values, order='nested') assert result.shape == (2,) assert result[0] == 1 assert np.isnan(result[1]) def test_interpolate_bilinear_lonlat_shape(): values = np.ones(192) * 3 result = interpolate_bilinear_lonlat(3 * u.deg, 4 * u.deg, values) assert isinstance(result, float) result = interpolate_bilinear_lonlat([[1, 2, 3], [2, 3, 4]] * u.deg, [[1, 2, 3], [2, 3, 4]] * u.deg, values) assert result.shape == (2, 3) values = np.ones((192, 50)) * 3 lon = np.ones((3, 6, 5)) * u.deg lat = np.ones((3, 6, 5)) * u.deg result = interpolate_bilinear_lonlat(lon, lat, values) assert result.shape == (3, 6, 5, 50) @pytest.mark.parametrize('order', ['nested', 'ring']) def test_neighbours(order): neigh = neighbours([1, 2, 3], 4, order=order) if order == 'nested': expected = [[0, 71, 2], [2, 77, 8], [3, 8, 9], [6, 9, 12], [4, 3, 6], [94, 
1, 4], [91, 0, 1], [90, 69, 0]] else: expected = [[6, 8, 10], [5, 7, 9], [0, 1, 2], [3, 0, 1], [2, 3, 0], [8, 10, 4], [7, 9, 11], [16, 19, 22]] assert_equal(neigh, expected) def test_neighbours_invalid(): with pytest.warns(RuntimeWarning, match='invalid value'): neighbours([-1, 2, 3], 4) with pytest.warns(RuntimeWarning, match='invalid value'): neighbours([192, 2, 3], 4) with pytest.raises(ValueError) as exc: neighbours([1, 2, 3], 5) assert exc.value.args[0] == 'nside must be a power of two' with pytest.raises(ValueError) as exc: neighbours([1, 2, 3], 4, order='banana') assert exc.value.args[0] == "order must be 'nested' or 'ring'" def test_neighbours_shape(): neigh = neighbours([[1, 2, 3], [2, 3, 4]], 4) assert neigh.shape == (8, 2, 3) @pytest.mark.parametrize('order', ['nested', 'ring']) def test_healpix_cone_search(order): indices = healpix_cone_search(10 * u.deg, 20 * u.deg, 1 * u.deg, nside=256, order=order) assert len(indices) == 80 @pytest.mark.parametrize(('step', 'order'), product([1, 4, 10], ['nested', 'ring'])) def test_boundaries_lonlat(step, order): lon, lat = boundaries_lonlat([10, 20, 30], step, 256, order=order) assert lon.shape == (3, 4 * step) assert lat.shape == (3, 4 * step)
astropy-healpix-0.5/astropy_healpix/tests/test_healpy.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst from itertools import product import pytest import numpy as np from numpy.testing import assert_equal, assert_allclose from .. import healpy as hp_compat # NOTE: If healpy is installed, we use it in these tests, but healpy is not a # formal dependency of astropy-healpix.
hp = pytest.importorskip('healpy') from hypothesis import given, settings, example from hypothesis.strategies import integers, floats, booleans from hypothesis.extra.numpy import arrays NSIDE_VALUES = [2 ** n for n in range(1, 6)] @pytest.mark.parametrize(('nside', 'degrees'), product(NSIDE_VALUES, (False, True))) def test_nside2pixarea(nside, degrees): actual = hp_compat.nside2pixarea(nside=nside, degrees=degrees) expected = hp.nside2pixarea(nside=nside, degrees=degrees) assert_equal(actual, expected) @pytest.mark.parametrize(('nside', 'arcmin'), product(NSIDE_VALUES, (False, True))) def test_nside2resol(nside, arcmin): actual = hp_compat.nside2resol(nside=nside, arcmin=arcmin) expected = hp.nside2resol(nside=nside, arcmin=arcmin) assert_equal(actual, expected) @pytest.mark.parametrize('nside', NSIDE_VALUES) def test_nside2npix(nside): actual = hp_compat.nside2npix(nside) expected = hp.nside2npix(nside) assert_equal(actual, expected) @pytest.mark.parametrize('level', [0, 3, 7]) def test_order2nside(level): actual = hp_compat.order2nside(level) expected = hp.order2nside(level) assert_equal(actual, expected) @pytest.mark.parametrize('npix', [12 * 2 ** (2 * n) for n in range(1, 6)]) def test_npix2nside(npix): actual = hp_compat.npix2nside(npix) expected = hp.npix2nside(npix) assert_equal(actual, expected) # For the test below, we exclude latitudes that fall exactly on the pole or # the equator since points that fall at exact boundaries are ambiguous. @given(nside_pow=integers(0, 29), nest=booleans(), lonlat=booleans(), lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-10), lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter( lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-10)) @settings(max_examples=2000, derandomize=True) def test_ang2pix(nside_pow, lon, lat, nest, lonlat): nside = 2 ** nside_pow if lonlat: theta, phi = lon, lat else: theta, phi = np.pi / 2. 
- np.radians(lat), np.radians(lon) ipix1 = hp_compat.ang2pix(nside, theta, phi, nest=nest, lonlat=lonlat) ipix2 = hp.ang2pix(nside, theta, phi, nest=nest, lonlat=lonlat) assert ipix1 == ipix2 def test_ang2pix_shape(): ipix = hp_compat.ang2pix(8, 1., 2.) assert np.can_cast(ipix, np.int64) ipix = hp_compat.ang2pix(8, [[1., 2.], [3., 4.]], [[1., 2.], [3., 4.]]) assert ipix.shape == (2, 2) def test_pix2ang_shape(): lon, lat = hp_compat.pix2ang(8, 1) assert isinstance(lon, float) assert isinstance(lat, float) lon, lat = hp_compat.pix2ang(8, [[1, 2, 3], [4, 5, 6]]) assert lon.shape == (2, 3) assert lat.shape == (2, 3) @given(nside_pow=integers(0, 29), nest=booleans(), lonlat=booleans(), frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1)) @settings(max_examples=2000, derandomize=True) @example(nside_pow=29, frac=0.1666666694606345, nest=False, lonlat=False) @example(nside_pow=27, frac=2./3., nest=True, lonlat=False) def test_pix2ang(nside_pow, frac, nest, lonlat): nside = 2 ** nside_pow ipix = int(frac * 12 * nside ** 2) theta1, phi1 = hp_compat.pix2ang(nside, ipix, nest=nest, lonlat=lonlat) theta2, phi2 = hp.pix2ang(nside, ipix, nest=nest, lonlat=lonlat) if lonlat: assert_allclose(phi1, phi2, atol=1e-8) if abs(phi1) < 90: assert_allclose(theta1, theta2, atol=1e-10) else: assert_allclose(theta1, theta2, atol=1e-8) if theta1 > 0: assert_allclose(phi1, phi2, atol=1e-10) @given(nside_pow=integers(0, 29), nest=booleans(), x=floats(-1, 1, allow_nan=False, allow_infinity=False).filter(lambda x: abs(x) > 1e-10), y=floats(-1, 1, allow_nan=False, allow_infinity=False).filter(lambda y: abs(y) > 1e-10), z=floats(-1, 1, allow_nan=False, allow_infinity=False).filter(lambda z: abs(z) > 1e-10)) @settings(max_examples=2000, derandomize=True) def test_vec2pix(nside_pow, x, y, z, nest): nside = 2 ** nside_pow ipix1 = hp_compat.vec2pix(nside, x, y, z, nest=nest) ipix2 = hp.vec2pix(nside, x, y, z, nest=nest) assert ipix1 == ipix2 @given(nside_pow=integers(0, 
29), nest=booleans(), frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1)) @settings(max_examples=2000, derandomize=True) @example(nside_pow=29, frac=0.1666666694606345, nest=False) def test_pix2vec(nside_pow, frac, nest): nside = 2 ** nside_pow ipix = int(frac * 12 * nside ** 2) xyz1 = hp_compat.pix2vec(nside, ipix, nest=nest) xyz2 = hp.pix2vec(nside, ipix, nest=nest) assert_allclose(xyz1, xyz2, atol=1e-8) def test_vec2pix_shape(): ipix = hp_compat.vec2pix(8, 1., 2., 3.) assert np.can_cast(ipix, np.int64) ipix = hp_compat.vec2pix(8, [[1., 2.], [3., 4.]], [[5., 6.], [7., 8.]], [[9., 10.], [11., 12.]]) assert ipix.shape == (2, 2) def test_pix2vec_shape(): x, y, z = hp_compat.pix2vec(8, 1) assert isinstance(x, float) assert isinstance(y, float) assert isinstance(z, float) x, y, z = hp_compat.pix2vec(8, [[1, 2, 3], [4, 5, 6]]) assert x.shape == (2, 3) assert y.shape == (2, 3) assert z.shape == (2, 3) @given(nside_pow=integers(0, 29), frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1)) @settings(max_examples=2000, derandomize=True) def test_nest2ring(nside_pow, frac): nside = 2 ** nside_pow nest = int(frac * 12 * nside ** 2) ring1 = hp_compat.nest2ring(nside, nest) ring2 = hp.nest2ring(nside, nest) assert ring1 == ring2 @given(nside_pow=integers(0, 29), frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1)) @settings(max_examples=2000, derandomize=True) @example(nside_pow=29, frac=0.16666666697710755) def test_ring2nest(nside_pow, frac): nside = 2 ** nside_pow ring = int(frac * 12 * nside ** 2) nest1 = hp_compat.ring2nest(nside, ring) nest2 = hp.ring2nest(nside, ring) assert nest1 == nest2 @given(nside_pow=integers(0, 29), step=integers(1, 10), nest=booleans(), frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1)) @settings(max_examples=500, derandomize=True) def test_boundaries(nside_pow, frac, step, nest): nside = 2 ** nside_pow pix = int(frac * 12 * 
nside ** 2) b1 = hp_compat.boundaries(nside, pix, step=step, nest=nest) b2 = hp.boundaries(nside, pix, step=step, nest=nest) assert_allclose(b1, b2, atol=1e-8) def test_boundaries_shape(): pix = 1 b1 = hp_compat.boundaries(8, pix, step=4) b2 = hp.boundaries(8, pix, step=4) assert b1.shape == b2.shape pix = [1, 2, 3, 4, 5] b1 = hp_compat.boundaries(8, pix, step=4) b2 = hp.boundaries(8, pix, step=4) assert b1.shape == b2.shape def not_at_origin(vec): return np.linalg.norm(vec) > 0 @given(vectors=arrays(float, (3,), elements=floats(-1, 1)).filter(not_at_origin), lonlat=booleans(), ndim=integers(0, 4)) @settings(max_examples=500, derandomize=True) def test_vec2ang(vectors, lonlat, ndim): vectors = np.broadcast_to(vectors, (2,) * ndim + (3,)) theta1, phi1 = hp_compat.vec2ang(vectors, lonlat=lonlat) theta2, phi2 = hp.vec2ang(vectors, lonlat=lonlat) # Healpy sometimes returns NaNs for phi (somewhat incorrectly) phi2 = np.nan_to_num(phi2) assert_allclose(theta1, theta2, atol=1e-10) assert_allclose(phi1, phi2, atol=1e-10) @given(lonlat=booleans(), lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-10), lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter( lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-10)) @settings(max_examples=2000, derandomize=True) def test_ang2vec(lon, lat, lonlat): if lonlat: theta, phi = lon, lat else: theta, phi = np.pi / 2.
- np.radians(lat), np.radians(lon) xyz1 = hp_compat.ang2vec(theta, phi, lonlat=lonlat) xyz2 = hp.ang2vec(theta, phi, lonlat=lonlat) assert_allclose(xyz1, xyz2, atol=1e-10) # The following fails, need to investigate: # @example(nside_pow=29, lon=1.0000000028043134e-05, lat=1.000000000805912e-05, nest=False, lonlat=False) # @given(nside_pow=integers(0, 28), nest=booleans(), lonlat=booleans(), lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-5), lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter( lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-5)) @settings(max_examples=500, derandomize=True) @example(nside_pow=27, lon=1.0000000028043134e-05, lat=-41.81031451395941, nest=False, lonlat=False) @example(nside_pow=6, lon=1.6345238095238293, lat=69.42254649458224, nest=False, lonlat=False) @example(nside_pow=15, lon=1.0000000028043134e-05, lat=1.000000000805912e-05, nest=False, lonlat=False) @example(nside_pow=0, lon=315.0000117809725, lat=1.000000000805912e-05, nest=False, lonlat=False) @example(nside_pow=0, lon=1.0000000028043134e-05, lat=-41.81031489577861, nest=False, lonlat=False) @example(nside_pow=0, lon=35.559942143736414, lat=-41.8103252622604, nest=False, lonlat=False) @example(nside_pow=28, lon=359.9999922886491, lat=-41.81031470486902, nest=False, lonlat=False) @example(nside_pow=0, lon=1.0000000028043134e-05, lat=-41.81031489577861, nest=False, lonlat=False) @example(nside_pow=27, lon=1.0000000028043134e-05, lat=-41.81031451395941, nest=False, lonlat=False) @example(nside_pow=26, lon=359.9999986588955, lat=41.81031489577861, nest=False, lonlat=False) @example(nside_pow=27, lon=359.999997317791, lat=-41.81031451395943, nest=False, lonlat=False) @example(nside_pow=27, lon=1.0000000028043134e-05, lat=89.80224636153702, nest=False, lonlat=False) def test_interp_weights(nside_pow, lon, lat, nest, lonlat): nside = 2 ** nside_pow if lonlat: theta, phi = lon, lat else: theta, phi = np.pi / 2. 
- np.radians(lat), np.radians(lon) indices1, weights1 = hp_compat.get_interp_weights(nside, theta, phi, nest=nest, lonlat=lonlat) indices2, weights2 = hp.get_interp_weights(nside, theta, phi, nest=nest, lonlat=lonlat) # Ignore neighbours with weights < 1e-6 - we have to exclude these otherwise # in some corner cases there will be different low-probability neighbours. keep = weights1 > 1e-6 indices1, weights1 = indices1[keep], weights1[keep] keep = weights2 > 1e-6 indices2, weights2 = indices2[keep], weights2[keep] order1 = np.argsort(indices1) order2 = np.argsort(indices2) assert_equal(indices1[order1], indices2[order2]) assert_allclose(weights1[order1], weights2[order2], atol=1e-5) # Make an array that can be useful up to the highest nside tested below NSIDE_POW_MAX = 8 VALUES = np.random.random(12 * NSIDE_POW_MAX ** 2) @given(nside_pow=integers(0, NSIDE_POW_MAX), nest=booleans(), lonlat=booleans(), lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-5), lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter( lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-5)) @settings(max_examples=500, derandomize=True) def test_interp_val(nside_pow, lon, lat, nest, lonlat): nside = 2 ** nside_pow if lonlat: theta, phi = lon, lat else: theta, phi = np.pi / 2. 
- np.radians(lat), np.radians(lon) m = VALUES[:12 * nside ** 2] value1 = hp_compat.get_interp_val(m, theta, phi, nest=nest, lonlat=lonlat) value2 = hp.get_interp_val(m, theta, phi, nest=nest, lonlat=lonlat) assert_allclose(value1, value2, rtol=0.1, atol=1.e-10)
astropy-healpix-0.5/astropy_healpix/tests/test_high_level.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal from astropy import units as u from astropy.coordinates import Longitude, Latitude, Galactic, SkyCoord from ..high_level import HEALPix class TestHEALPix: def setup_class(self): self.pix = HEALPix(nside=256, order='nested') def test_pixel_area(self): pixel_area = self.pix.pixel_area assert_allclose(pixel_area.value, 1.5978966540475428e-05) assert pixel_area.unit == u.sr def test_pixel_resolution(self): pixel_resolution = self.pix.pixel_resolution assert_allclose(pixel_resolution.value, 13.741945647269624) assert pixel_resolution.unit == u.arcmin def test_npix(self): assert self.pix.npix == 12 * 256 ** 2 # For the following tests, the numerical accuracy of this function is # already tested in test_cython_api.py, so we focus here on functionality # specific to the high-level functions.
    def test_healpix_to_lonlat(self):

        lon, lat = self.pix.healpix_to_lonlat([1, 2, 3])

        assert isinstance(lon, Longitude)
        assert isinstance(lat, Latitude)

        index = self.pix.lonlat_to_healpix(lon, lat)

        assert_equal(index, [1, 2, 3])

        lon, lat = self.pix.healpix_to_lonlat([1, 2, 3],
                                              dx=[0.1, 0.2, 0.3],
                                              dy=[0.5, 0.4, 0.7])

        assert isinstance(lon, Longitude)
        assert isinstance(lat, Latitude)

        index, dx, dy = self.pix.lonlat_to_healpix(lon, lat, return_offsets=True)

        assert_equal(index, [1, 2, 3])
        assert_allclose(dx, [0.1, 0.2, 0.3])
        assert_allclose(dy, [0.5, 0.4, 0.7])

    def test_nested_to_ring(self):
        nested_index_1 = [1, 3, 22]
        ring_index = self.pix.nested_to_ring(nested_index_1)
        nested_index_2 = self.pix.ring_to_nested(ring_index)
        assert_equal(nested_index_1, nested_index_2)

    def test_bilinear_interpolation_weights(self):
        indices, weights = self.pix.bilinear_interpolation_weights([1, 3, 4] * u.deg,
                                                                   [3, 2, 6] * u.deg)
        assert indices.shape == (4, 3)
        assert weights.shape == (4, 3)

    def test_interpolate_bilinear_lonlat(self):
        values = np.ones(12 * 256 ** 2) * 3
        result = self.pix.interpolate_bilinear_lonlat([1, 3, 4] * u.deg,
                                                      [3, 2, 6] * u.deg, values)
        assert_allclose(result, [3, 3, 3])

    def test_interpolate_bilinear_lonlat_invalid(self):
        values = np.ones(222) * 3
        with pytest.raises(ValueError) as exc:
            self.pix.interpolate_bilinear_lonlat([1, 3, 4] * u.deg,
                                                 [3, 2, 6] * u.deg, values)
        assert exc.value.args[0] == 'values must be an array of length 786432 (got 222)'

    def test_cone_search_lonlat(self):
        lon, lat = 1 * u.deg, 4 * u.deg
        result = self.pix.cone_search_lonlat(lon, lat, 1 * u.deg)
        assert len(result) == 77

    def test_cone_search_lonlat_invalid(self):
        lon, lat = [1, 2] * u.deg, [3, 4] * u.deg
        with pytest.raises(ValueError) as exc:
            self.pix.cone_search_lonlat(lon, lat, 1 * u.deg)
        assert exc.value.args[0] == 'The longitude, latitude and radius must be scalar Quantity objects'

    def test_boundaries_lonlat(self):
        lon, lat = self.pix.boundaries_lonlat([10, 20, 30], 4)
        assert lon.shape == (3, 16)
        assert lat.shape == (3, 16)

    def test_neighbours(self):
        neigh = self.pix.neighbours([10, 20, 30])
        assert neigh.shape == (8, 3)


class TestCelestialHEALPix:

    def setup_class(self):
        self.pix = HEALPix(nside=256, order='nested', frame=Galactic())

    def test_healpix_from_header(self):
        """Test instantiation from a FITS header.

        Notes
        -----
        We don't need to test all possible options, because
        :meth:`~astropy_healpix.HEALPix.from_header` is just a wrapper around
        :meth:`~astropy_healpix.utils.parse_input_healpix_data`, which is
        tested exhaustively in :mod:`~astropy_healpix.tests.test_utils`.
        """
        pix = HEALPix.from_header(
            (np.empty(self.pix.npix), 'G'),
            nested=self.pix.order == 'nested')

        assert pix.nside == self.pix.nside
        assert type(pix.frame) == type(self.pix.frame)
        assert pix.order == self.pix.order

    def test_healpix_to_skycoord(self):

        coord = self.pix.healpix_to_skycoord([1, 2, 3])

        assert isinstance(coord, SkyCoord)
        assert isinstance(coord.frame, Galactic)

        # Make sure that the skycoord_to_healpix method converts coordinates
        # to the frame of the HEALPix
        coord = coord.transform_to('fk5')

        index = self.pix.skycoord_to_healpix(coord)

        assert_equal(index, [1, 2, 3])

        coord = self.pix.healpix_to_skycoord([1, 2, 3],
                                             dx=[0.1, 0.2, 0.3],
                                             dy=[0.5, 0.4, 0.7])

        assert isinstance(coord, SkyCoord)
        assert isinstance(coord.frame, Galactic)

        # Make sure that the skycoord_to_healpix method converts coordinates
        # to the frame of the HEALPix
        coord = coord.transform_to('fk5')

        index, dx, dy = self.pix.skycoord_to_healpix(coord, return_offsets=True)

        assert_equal(index, [1, 2, 3])
        assert_allclose(dx, [0.1, 0.2, 0.3])
        assert_allclose(dy, [0.5, 0.4, 0.7])

    def test_interpolate_bilinear_skycoord(self):

        values = np.ones(12 * 256 ** 2) * 3
        coord = SkyCoord([1, 2, 3] * u.deg, [4, 3, 1] * u.deg, frame='fk4')
        result = self.pix.interpolate_bilinear_skycoord(coord, values)
        assert_allclose(result, [3, 3, 3])

        # Make sure that coordinate system is correctly taken into account

        values = np.arange(12 * 256 ** 2) * 3
        coord = SkyCoord([1, 2, 3] * u.deg, [4, 3, 1] * u.deg, frame='fk4')

        result1 = self.pix.interpolate_bilinear_skycoord(coord, values)
        result2 = self.pix.interpolate_bilinear_skycoord(coord.icrs, values)

        assert_allclose(result1, result2)

    def test_cone_search_skycoord(self):
        coord = SkyCoord(1 * u.deg, 4 * u.deg, frame='galactic')
        result1 = self.pix.cone_search_skycoord(coord, 1 * u.deg)
        assert len(result1) == 77
        result2 = self.pix.cone_search_skycoord(coord.icrs, 1 * u.deg)
        assert_allclose(result1, result2)

    def test_boundaries_skycoord(self):
        coord = self.pix.boundaries_skycoord([10, 20, 30], 4)
        assert coord.shape == (3, 16)

astropy-healpix-0.5/astropy_healpix/tests/test_utils.py

import numpy as np
import pytest

from astropy.coordinates import FK5, Galactic
from astropy.io import fits

from ..utils import parse_coord_system, parse_input_healpix_data


def test_parse_coord_system():

    frame = parse_coord_system(Galactic())
    assert isinstance(frame, Galactic)

    frame = parse_coord_system('fk5')
    assert isinstance(frame, FK5)

    with pytest.raises(ValueError) as exc:
        frame = parse_coord_system('e')
    assert exc.value.args[0] == "Ecliptic coordinate frame not yet supported"

    frame = parse_coord_system('g')
    assert isinstance(frame, Galactic)

    with pytest.raises(ValueError) as exc:
        frame = parse_coord_system('spam')
    assert exc.value.args[0] == "Could not determine frame for system=spam"


def test_parse_input_healpix_data(tmpdir):

    data = np.arange(3072)

    col = fits.Column(array=data, name='flux', format="E")
    hdu = fits.BinTableHDU.from_columns([col])
    hdu.header['NSIDE'] = 512
    hdu.header['COORDSYS'] = "G"

    # As HDU
    array, coordinate_system, nested = parse_input_healpix_data(hdu)
    np.testing.assert_allclose(array, data)

    # As filename
    filename = tmpdir.join('test.fits').strpath
    hdu.writeto(filename)
    array, coordinate_system, nested = parse_input_healpix_data(filename)
    np.testing.assert_allclose(array, data)

    # As array
    array, coordinate_system, nested = parse_input_healpix_data((data, "galactic"))
    np.testing.assert_allclose(array, data)

    # Invalid
    with pytest.raises(TypeError) as exc:
        parse_input_healpix_data(data)
    assert exc.value.args[0] == "input_data should either be an HDU object or a tuple of (array, frame)"

astropy-healpix-0.5/astropy_healpix/utils.py

import numpy as np

from astropy.io import fits
from astropy.io.fits import TableHDU, BinTableHDU
from astropy.coordinates import BaseCoordinateFrame, frame_transform_graph, Galactic, ICRS

FRAMES = {
    'g': Galactic(),
    'c': ICRS()
}


def parse_coord_system(system):
    if isinstance(system, BaseCoordinateFrame):
        return system
    elif isinstance(system, str):
        system = system.lower()
        if system == 'e':
            raise ValueError("Ecliptic coordinate frame not yet supported")
        elif system in FRAMES:
            return FRAMES[system]
        else:
            system_new = frame_transform_graph.lookup_name(system)
            if system_new is None:
                raise ValueError(f"Could not determine frame for system={system}")
            else:
                return system_new()


def parse_input_healpix_data(input_data, field=0, hdu_in=None, nested=None):
    """
    Parse input HEALPIX data to return a Numpy array and coordinate frame
    object.
""" if isinstance(input_data, (TableHDU, BinTableHDU)): data = input_data.data header = input_data.header coordinate_system_in = parse_coord_system(header['COORDSYS']) array_in = data[data.columns[field].name].ravel() if 'ORDERING' in header: nested = header['ORDERING'].lower() == 'nested' elif isinstance(input_data, str): hdu = fits.open(input_data)[hdu_in or 1] return parse_input_healpix_data(hdu, field=field) elif isinstance(input_data, tuple) and isinstance(input_data[0], np.ndarray): array_in = input_data[0] coordinate_system_in = parse_coord_system(input_data[1]) else: raise TypeError("input_data should either be an HDU object or a tuple of (array, frame)") return array_in, coordinate_system_in, nested ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696079.8735893 astropy-healpix-0.5/astropy_healpix/version.py0000644000077000000240000000057300000000000021705 0ustar00tomstaff00000000000000# Autogenerated by Astropy-affiliated package astropy_healpix's setup.py on 2019-11-25 15:34:39 UTC import datetime version = "0.5" githash = "09192c016185fa9af35d876f8364dc0af19a825a" major = 0 minor = 5 bugfix = 0 version_info = (major, minor, bugfix) release = True timestamp = datetime.datetime(2019, 11, 25, 15, 34, 39) debug = False astropy_helpers_version = "3.2.2" ././@PaxHeader0000000000000000000000000000003200000000000011450 xustar000000000000000026 mtime=1574696080.53355 astropy-healpix-0.5/astropy_helpers/0000755000077000000240000000000000000000000017651 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6107712 astropy-healpix-0.5/astropy_helpers/CHANGES.rst0000644000077000000240000005317300000000000021464 0ustar00tomstaff00000000000000astropy-helpers Changelog ************************* 3.2.2 (2019-10-25) ------------------ - Correctly handle main package directory inside namespace package. 
  [#486]

3.2.1 (2019-06-13)
------------------

- Reverting issuing deprecation warning for the ``build_sphinx`` command.
  [#482]

- Make sure that all data files get included in tar file releases. [#485]

3.2 (2019-05-29)
----------------

- Make sure that ``[options.package_data]`` in setup.cfg is taken into account
  when collecting package data. [#453]

- Simplified the code for the custom build_ext command. [#446]

- Avoid importing the astropy package when trying to get the test command
  when testing astropy itself. [#450]

- Avoid importing whole package when trying to get version information. Note
  that this has also introduced a small API change - ``cython_version`` and
  ``compiler`` can no longer be imported from the ``package.version`` module
  generated by astropy-helpers. Instead, you can import these from
  ``package.cython_version`` and ``package.compiler_version`` respectively.
  [#442]

- Make it possible to call ``generate_version_py`` and ``register_commands``
  without any arguments, which causes information to be read in from the
  ``setup.cfg`` file. [#440]

- Simplified setup.py and moved most of the configuration to setup.cfg. [#445]

- Add a new ``astropy_helpers.setup_helpers.setup`` function that does all the
  default boilerplate in typical ``setup.py`` files that use astropy-helpers.
  [#443]

- Remove ``deprecated``, ``deprecated_attribute``, and ``minversion`` from
  ``astropy_helpers.utils``. [#447]

- Updated minimum required version of setuptools to 30.3.0. [#440]

- Remove functionality to adjust compilers if a broken compiler is detected.
  This is not useful anymore as only a single compiler was previously patched
  (now unlikely to be used) and this was only to fix a compilation issue in
  the core astropy package. [#421]

- ``sphinx-astropy`` is now a required dependency to build the docs; the
  machinery to install it as eggs has been removed. [#474]

3.1.1 (2019-02-22)
------------------

- Moved documentation from README to Sphinx. [#444]

- Fixed broken OpenMP detection when building with ``-coverage``. [#434]

3.1 (2018-12-04)
----------------

- Added extensive documentation about astropy-helpers to the README.rst file.
  [#416]

- Fixed the compatibility of the build_docs command with Sphinx 1.8 and above.
  [#413]

- Removing deprecated test_helpers.py file. [#369]

- Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384]

- Remove all sphinx components from ``astropy-helpers``. These are now
  replaced by the ``sphinx-astropy`` package in conjunction with the
  ``astropy-theme-sphinx``, ``sphinx-automodapi``, and ``numpydoc`` packages.
  [#368]

- openmp_helpers.py: Make add_openmp_flags_if_available() work for clang.
  The necessary include, library, and runtime paths now get added to the C
  test code used to determine if openmp works. Autogenerator utility added
  ``openmp_enabled.is_openmp_enabled()`` which can be called post build to
  determine state of OpenMP support. [#382]

- Add version_info tuple to autogenerated version.py. Allows for simple
  version checking, i.e. version_info > (2,0,1). [#385]

3.0.2 (2018-06-01)
------------------

- Nothing changed.

3.0.1 (2018-02-22)
------------------

- Nothing changed.

3.0 (2018-02-09)
----------------

- Removing Python 2 support, including 2to3. Packages wishing to keep Python 2
  support should NOT update to this version. [#340]

- Removing deprecated _test_compat making astropy a hard dependency for
  packages wishing to use the astropy tests machinery. [#314]

- Removing unused 'register' command since packages should be uploaded with
  twine and get registered automatically. [#332]

2.0.11 (2019-10-25)
-------------------

- Fixed deprecation warning in sphinx theme. [#493]

- Fixed an issue that caused pytest to crash if it tried to collect tests.
  [#488]

2.0.10 (2019-05-29)
-------------------

- Removed ``tocdepthfix`` sphinx extension that worked around a bug in Sphinx
  that has long been fixed. [#475]

- Allow Python dev versions to pass the python version check. [#476]

- Updated bundled version of sphinx-automodapi to v0.11. [#478]

2.0.9 (2019-02-22)
------------------

- Updated bundled version of sphinx-automodapi to v0.10. [#439]

- Updated bundled sphinx extensions version to sphinx-astropy v1.1.1. [#454]

- Include package name in error message for Python version in
  ``ah_bootstrap.py``. [#441]

2.0.8 (2018-12-04)
------------------

- Fixed compatibility with Sphinx 1.8+. [#428]

- Fixed error that occurs when installing a package in an environment where
  ``numpy`` is not already installed. [#404]

- Updated bundled version of sphinx-automodapi to v0.9. [#422]

- Updated bundled version of numpydoc to v0.8.0. [#423]

2.0.7 (2018-06-01)
------------------

- Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384]

2.0.6 (2018-02-24)
------------------

- Avoid deprecation warning due to ``exclude=`` keyword in ``setup.py``.
  [#379]

2.0.5 (2018-02-22)
------------------

- Fix segmentation faults that occurred when the astropy-helpers submodule
  was first initialized in packages that also contained Cython code. [#375]

2.0.4 (2018-02-09)
------------------

- Support dotted package names as namespace packages in generate_version_py.
  [#370]

- Fix compatibility with setuptools 36.x and above. [#372]

- Fix false negative in add_openmp_flags_if_available when measuring code
  coverage with gcc. [#374]

2.0.3 (2018-01-20)
------------------

- Make sure that astropy-helpers 3.x.x is not downloaded on Python 2.
  [#362, #363]

- The bundled version of sphinx-automodapi has been updated to v0.7. [#365]

- Add --auto-use and --no-auto-use command-line flags to match the
  ``auto_use`` configuration option, and add an alias
  ``--use-system-astropy-helpers`` for ``--no-auto-use``. [#366]

2.0.2 (2017-10-13)
------------------

- Added new helper function add_openmp_flags_if_available that can add
  OpenMP compilation flags to a C/Cython extension if needed. [#346]

- Update numpydoc to v0.7. [#343]

- The function ``get_git_devstr`` now returns ``'0'`` instead of ``None`` when
  no git repository is present. This allows generation of development version
  strings that are in a format that ``setuptools`` expects (e.g. "1.1.3.dev0"
  instead of "1.1.3.dev"). [#330]

- It is now possible to override generated timestamps to make builds
  reproducible by setting the ``SOURCE_DATE_EPOCH`` environment variable.
  [#341]

- Mark Sphinx extensions as parallel-safe. [#344]

- Switch to using mathjax instead of imgmath for local builds. [#342]

- Deprecate ``exclude`` parameter of various functions in setup_helpers since
  it could not work as intended. Add new function ``add_exclude_packages`` to
  provide intended behavior. [#331]

- Allow custom Sphinx doctest extension to recognize and process standard
  doctest directives ``testsetup`` and ``doctest``. [#335]

2.0.1 (2017-07-28)
------------------

- Fix compatibility with Sphinx <1.5. [#326]

2.0 (2017-07-06)
----------------

- Add support for package that lies in a subdirectory. [#249]

- Removing ``compat.subprocess``. [#298]

- Python 3.3 is no longer supported. [#300]

- The 'automodapi' Sphinx extension (and associated dependencies) has now been
  moved to a standalone package which can be found at
  https://github.com/astropy/sphinx-automodapi - this is now bundled in
  astropy-helpers under astropy_helpers.extern.automodapi for convenience.
  Version shipped with astropy-helpers is v0.6. [#278, #303, #309, #323]

- The ``numpydoc`` Sphinx extension has now been moved to
  ``astropy_helpers.extern``. [#278]

- Fix ``build_docs`` error catching, so it doesn't hide Sphinx errors. [#292]

- Fix compatibility with Sphinx 1.6. [#318]

- Updating ez_setup.py to the last version before its removal. [#321]

1.3.1 (2017-03-18)
------------------

- Fixed the missing button to hide output in documentation code blocks.
  [#287]

- Fixed bug in ``build_docs`` when running with the clean (-l) option. [#289]

- Add alternative location for various intersphinx inventories to fall back
  to. [#293]

1.3 (2016-12-16)
----------------

- ``build_sphinx`` has been deprecated in favor of the ``build_docs`` command.
  [#246]

- Force the use of Cython's old ``build_ext`` command. A new ``build_ext``
  command was added in Cython 0.25, but it does not work with astropy-helpers
  currently. [#261]

1.2 (2016-06-18)
----------------

- Added sphinx configuration value ``automodsumm_inherited_members``. If
  ``True`` this will include members that are inherited from a base class in
  the generated API docs. Defaults to ``False`` which matches the previous
  behavior. [#215]

- Fixed ``build_sphinx`` to recognize builds that succeeded but have output
  *after* the "build succeeded." statement. This only applies when
  ``--warnings-returncode`` is given (which is primarily relevant for Travis
  documentation builds). [#223]

- Fixed the sphinx extensions to not output a spurious warning for sphinx
  versions > 1.4. [#229]

- Add Python version dependent local sphinx inventories that contain
  otherwise missing references. [#216]

- ``astropy_helpers`` now require Sphinx 1.3 or later. [#226]

1.1.2 (2016-03-9)
-----------------

- The CSS for the sphinx documentation was altered to prevent some text
  overflow problems. [#217]

1.1.1 (2015-12-23)
------------------

- Fixed crash in build with ``AttributeError: cython_create_listing`` with
  older versions of setuptools. [#209, #210]

1.1 (2015-12-10)
----------------

- The original ``AstropyTest`` class in ``astropy_helpers``, which implements
  the ``setup.py test`` command, is deprecated in favor of moving the
  implementation of that command closer to the actual Astropy test runner in
  ``astropy.tests``. Now a dummy ``test`` command is provided solely for
  informing users that they need ``astropy`` installed to run the tests
  (however, the previous, now deprecated implementation is still provided and
  continues to work with older versions of Astropy). See the related issue
  for more details. [#184]

- Added a useful new utility function to ``astropy_helpers.utils`` called
  ``find_data_files``. This is similar to the ``find_packages`` function in
  setuptools in that it can be used to search a package for data files
  (matching a pattern) that can be passed to the ``package_data`` argument
  for ``setup()``. See the docstring to
  ``astropy_helpers.utils.find_data_files`` for more details. [#42]

- The ``astropy_helpers`` module now sets the global ``_ASTROPY_SETUP_`` flag
  upon import (from within a ``setup.py``) script, so it's not necessary to
  have this in the ``setup.py`` script explicitly. If in doubt though,
  there's no harm in setting it twice. Putting it in ``astropy_helpers`` just
  ensures that any other imports that occur during build will have this flag
  set. [#191]

- It is now possible to use Cython as a ``setup_requires`` build requirement,
  and still build Cython extensions even if Cython wasn't available at the
  beginning of the build processes (that is, is automatically downloaded via
  setuptools' processing of ``setup_requires``). [#185]

- Moves the ``adjust_compiler`` check into the ``build_ext`` command itself,
  so it's only used when actually building extension modules. This also
  deprecates the stand-alone ``adjust_compiler`` function. [#76]

- When running the ``build_sphinx`` / ``build_docs`` command with the ``-w``
  option, the output from Sphinx is streamed as it runs instead of silently
  buffering until the doc build is complete. [#197]

1.0.7 (unreleased)
------------------

- Fix missing import in ``astropy_helpers/utils.py``. [#196]

1.0.6 (2015-12-04)
------------------

- Fixed bug where running ``./setup.py build_sphinx`` could return
  successfully even when the build was not successful (and should have
  returned a non-zero error code). [#199]

1.0.5 (2015-10-02)
------------------

- Fixed a regression in the ``./setup.py test`` command that was introduced
  in v1.0.4.

1.0.4 (2015-10-02)
------------------

- Fixed issue with the sphinx documentation css where the line numbers for
  code blocks were not aligned with the code. [#179, #180]

- Fixed crash that could occur when trying to build Cython extension modules
  when Cython isn't installed. Normally this still results in a failed build,
  but was supposed to provide a useful error message rather than crash
  outright (this was a regression introduced in v1.0.3). [#181]

- Fixed a crash that could occur on Python 3 when a working C compiler isn't
  found. [#182]

- Quieted warnings about deprecated Numpy API in Cython extensions, when
  building Cython extensions against Numpy >= 1.7. [#183, #186]

- Improved support for py.test >= 2.7--running the ``./setup.py test``
  command now copies all doc pages into the temporary test directory as well,
  so that all test files have a "common root directory". [#189, #190]

1.0.3 (2015-07-22)
------------------

- Added workaround for sphinx-doc/sphinx#1843, a bug in Sphinx which
  prevented descriptor classes with a custom metaclass from being documented
  correctly. [#158]

- Added an alias for the ``./setup.py build_sphinx`` command as
  ``./setup.py build_docs`` which, to a new contributor, should hopefully be
  less cryptic. [#161]

- The fonts in graphviz diagrams now match the font of the HTML content.
  [#169]

- When the documentation is built on readthedocs.org, MathJax will be used
  for math rendering. When built elsewhere, the "pngmath" extension is still
  used for math rendering. [#170]

- Fix crash when importing astropy_helpers when running with ``python -OO``.
  [#171]

- The ``build`` and ``build_ext`` stages now correctly recognize the presence
  of C++ files in Cython extensions (previously only vanilla C worked).
  [#173]

1.0.2 (2015-04-02)
------------------

- Various fixes enabling the astropy-helpers Sphinx build command and Sphinx
  extensions to work with Sphinx 1.3. [#148]

- More improvement to the ability to handle multiple versions of
  astropy-helpers being imported in the same Python interpreter session in
  the (somewhat rare) case of nested installs. [#147]

- To better support high resolution displays, use SVG for the astropy logo
  and linkout image, falling back to PNGs for browsers that do not support
  it. [#150, #151]

- Improve ``setup_helpers.get_compiler_version`` to work with more compilers,
  and to return more info. This will help fix builds of Astropy on less
  common compilers, like Sun C. [#153]

1.0.1 (2015-03-04)
------------------

- Released in concert with v0.4.8 to address the same issues.

0.4.8 (2015-03-04)
------------------

- Improved the ``ah_bootstrap`` script's ability to override existing
  installations of astropy-helpers with new versions in the context of
  installing multiple packages simultaneously within the same Python
  interpreter (e.g. when one package has in its ``setup_requires`` another
  package that uses a different version of astropy-helpers). [#144]

- Added a workaround to an issue in matplotlib that can, in rare cases, lead
  to a crash when installing packages that import matplotlib at build time.
  [#144]

1.0 (2015-02-17)
----------------

- Added new pre-/post-command hook points for ``setup.py`` commands. Now any
  package can define code to run before and/or after any ``setup.py`` command
  without having to manually subclass that command by adding
  ``pre_<command_name>_hook`` and ``post_<command_name>_hook`` callables to
  the package's ``setup_package.py`` module. See the PR for more details.
  [#112]

- The following objects in the ``astropy_helpers.setup_helpers`` module have
  been relocated:

  - ``get_dummy_distribution``, ``get_distutils_*``, ``get_compiler_option``,
    ``add_command_option``, ``is_distutils_display_option`` ->
    ``astropy_helpers.distutils_helpers``

  - ``should_build_with_cython``, ``generate_build_ext_command`` ->
    ``astropy_helpers.commands.build_ext``

  - ``AstropyBuildPy`` -> ``astropy_helpers.commands.build_py``

  - ``AstropyBuildSphinx`` -> ``astropy_helpers.commands.build_sphinx``

  - ``AstropyInstall`` -> ``astropy_helpers.commands.install``

  - ``AstropyInstallLib`` -> ``astropy_helpers.commands.install_lib``

  - ``AstropyRegister`` -> ``astropy_helpers.commands.register``

  - ``get_pkg_version_module`` -> ``astropy_helpers.version_helpers``

  - ``write_if_different``, ``import_file``, ``get_numpy_include_path`` ->
    ``astropy_helpers.utils``

  All of these are "soft" deprecations in the sense that they are still
  importable from ``astropy_helpers.setup_helpers`` for now, and there is no
  (easy) way to produce deprecation warnings when importing these objects
  from ``setup_helpers`` rather than directly from the modules they are
  defined in. But please consider updating any imports to these objects.
  [#110]

- Use of the ``astropy.sphinx.ext.astropyautosummary`` extension is
  deprecated for use with Sphinx < 1.2. Instead it should suffice to remove
  this extension for the ``extensions`` list in your ``conf.py`` and add the
  stock ``sphinx.ext.autosummary`` instead. [#131]

0.4.7 (2015-02-17)
------------------

- Fixed incorrect/missing git hash being added to the generated
  ``version.py`` when creating a release. [#141]

0.4.6 (2015-02-16)
------------------

- Fixed problems related to the automatically generated _compiler module not
  being created properly. [#139]

0.4.5 (2015-02-11)
------------------

- Fixed an issue where ah_bootstrap.py could blow up when astropy_helper's
  version number is 1.0.

- Added a workaround for documentation of properties in the rare case where
  the class's metaclass has a property of the same name. [#130]

- Fixed an issue on Python 3 where importing a package using
  astropy-helper's generated version.py module would crash when the current
  working directory is an empty git repository. [#114, #137]

- Fixed an issue where the "revision count" appended to .dev versions by the
  generated version.py did not accurately reflect the revision count for the
  package it belongs to, and could be invalid if the current working
  directory is an unrelated git repository. [#107, #137]

- Likewise, fixed a confusing warning message that could occur in the same
  circumstances as the above issue. [#121, #137]

0.4.4 (2014-12-31)
------------------

- More improvements for building the documentation using Python 3.x. [#100]

- Additional minor fixes to Python 3 support. [#115]

- Updates to support new test features in Astropy. [#92, #106]

0.4.3 (2014-10-22)
------------------

- The generated ``version.py`` file now preserves the git hash of installed
  copies of the package as well as when building a source distribution. That
  is, the git hash of the changeset that was installed/released is preserved.
  [#87]

- In smart resolver add resolution for class links when they exist in the
  intersphinx inventory, but not the mapping of the current package (e.g.
  when an affiliated package uses an astropy core class of which "actual" and
  "documented" location differs). [#88]

- Fixed a bug that could occur when running ``setup.py`` for the first time
  in a repository that uses astropy-helpers as a submodule:
  ``AttributeError: 'NoneType' object has no attribute 'mkdtemp'`` [#89]

- Fixed a bug where optional arguments to the ``doctest-skip`` Sphinx
  directive were sometimes being left in the generated documentation output.
  [#90]

- Improved support for building the documentation using Python 3.x. [#96]

- Avoid error message if .git directory is not present. [#91]

0.4.2 (2014-08-09)
------------------

- Fixed some CSS issues in generated API docs. [#69]

- Fixed the warning message that could be displayed when generating a
  version number with some older versions of git. [#77]

- Fixed automodsumm to work with new versions of Sphinx (>= 1.2.2). [#80]

0.4.1 (2014-08-08)
------------------

- Fixed git revision count on systems with git versions older than v1.7.2.
  [#70]

- Fixed display of warning text when running a git command fails (previously
  the output of stderr was not being decoded properly). [#70]

- The ``--offline`` flag to ``setup.py`` understood by ``ah_bootstrap.py``
  now also prevents git from going online to fetch submodule updates. [#67]

- The Sphinx extension for converting issue numbers to links in the changelog
  now supports working on arbitrary pages via a new ``conf.py`` setting:
  ``changelog_links_docpattern``. By default it affects the ``changelog`` and
  ``whatsnew`` pages in one's Sphinx docs. [#61]

- Fixed crash that could result from users with missing/misconfigured locale
  settings. [#58]

- The font used for code examples in the docs is now the system-defined
  ``monospace`` font, rather than ``Monaco``, which is not available on all
  platforms. [#50]

0.4 (2014-07-15)
----------------

- Initial release of astropy-helpers. See APE4 for details of the motivation
  and design of this package.

- The ``astropy_helpers`` package replaces the following modules in the
  ``astropy`` package:

  - ``astropy.setup_helpers`` -> ``astropy_helpers.setup_helpers``

  - ``astropy.version_helpers`` -> ``astropy_helpers.version_helpers``

  - ``astropy.sphinx`` -> ``astropy_helpers.sphinx``

  These modules should be considered deprecated in ``astropy``, and any new,
  non-critical changes to those modules will be made in ``astropy_helpers``
  instead.
  Affiliated packages wishing to make use of those modules (as in the Astropy
  package-template) should use the versions from ``astropy_helpers`` instead,
  and include the ``ah_bootstrap.py`` script in their project, for
  bootstrapping the ``astropy_helpers`` package in their setup.py script.

astropy-healpix-0.5/astropy_helpers/LICENSE.rst

Copyright (c) 2014, Astropy Developers

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

* Neither the name of the Astropy Team nor the names of its contributors may
  be used to endorse or promote products derived from this software without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

astropy-healpix-0.5/astropy_helpers/README.rst

astropy-helpers
===============

.. image:: https://travis-ci.org/astropy/astropy-helpers.svg
    :target: https://travis-ci.org/astropy/astropy-helpers

.. image:: https://ci.appveyor.com/api/projects/status/rt9161t9mhx02xp7/branch/master?svg=true
    :target: https://ci.appveyor.com/project/Astropy/astropy-helpers

.. image:: https://codecov.io/gh/astropy/astropy-helpers/branch/master/graph/badge.svg
    :target: https://codecov.io/gh/astropy/astropy-helpers

The **astropy-helpers** package includes many build, installation, and
documentation-related tools used by the Astropy project, but packaged
separately for use by other projects that wish to leverage this work. The
motivation behind this package and details of its implementation are in the
accepted Astropy Proposal for Enhancement (APE) 4.

Astropy-helpers is not a traditional package in the sense that it is not
intended to be installed directly by users or developers. Instead, it is
meant to be accessed when the ``setup.py`` command is run - see the "Using
astropy-helpers in a package" section in the documentation for how to do this.
For a real-life example of how to implement astropy-helpers in a project, see
the ``setup.py`` and ``setup.cfg`` files of the `Affiliated package template `_.

For more information, see the documentation at http://astropy-helpers.readthedocs.io

astropy-healpix-0.5/astropy_helpers/ah_bootstrap.py

"""
This bootstrap module contains code for ensuring that the astropy_helpers
package will be importable by the time the setup.py script runs. It also
includes some workarounds to ensure that a recent-enough version of setuptools
is being used for the installation.

This module should be the first thing imported in the setup.py of
distributions that make use of the utilities in astropy_helpers. If the
distribution ships with its own copy of astropy_helpers, this module will
first attempt to import from the shipped copy. However, it will also check
PyPI to see if there are any bug-fix releases on top of the current version
that may be useful to get past platform-specific bugs that have been fixed.
When running setup.py, use the ``--offline`` command-line option to disable
the auto-upgrade checks.

When this module is imported or otherwise executed it automatically calls a
main function that attempts to read the project's setup.cfg file, which it
checks for a configuration section called ``[ah_bootstrap]``. The presence of
that section, and the options therein, determines the next step taken: if it
contains an option called ``auto_use`` with a value of ``True``, it will
automatically call the main function of this module called
`use_astropy_helpers` (see that function's docstring for full details).
Otherwise no further action is taken, and by default the system-installed
version of astropy-helpers will be used (however,
``ah_bootstrap.use_astropy_helpers`` may be called manually from within the
setup.py script).

This behavior can also be controlled using the ``--auto-use`` and
``--no-auto-use`` command-line flags. For clarity, an alias for
``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using
the latter if needed.

Additional options in the ``[ah_bootstrap]`` section of setup.cfg have the
same names as the arguments to `use_astropy_helpers`, and can be used to
configure the bootstrap script when ``auto_use = True``.

See https://github.com/astropy/astropy-helpers for more details, and for the
latest version of this module.
"""

import contextlib
import errno
import io
import locale
import os
import re
import subprocess as sp
import sys

from distutils import log
from distutils.debug import DEBUG

from configparser import ConfigParser, RawConfigParser

import pkg_resources
from setuptools import Distribution
from setuptools.package_index import PackageIndex

# This is the minimum Python version required for astropy-helpers
__minimum_python_version__ = (3, 5)

# TODO: Maybe enable checking for a specific version of astropy_helpers?
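The ``[ah_bootstrap]`` options described in the module docstring above are read with the standard-library ``ConfigParser``. The following self-contained sketch shows the typed-lookup pattern the bootstrapper uses (the cfg text is a hypothetical example, and the ``(option, type)`` pairs mirror a subset of ``CFG_OPTIONS``):

```python
from configparser import ConfigParser

# Hypothetical setup.cfg contents enabling automatic bootstrapping.
EXAMPLE_CFG = """\
[ah_bootstrap]
auto_use = True
offline = False
index_url = https://pypi.python.org/simple
"""

cfg = ConfigParser()
cfg.read_string(EXAMPLE_CFG)

# Booleans go through getboolean(); everything else is read as a string,
# mirroring the (option, type) pairs in CFG_OPTIONS below.
options = {}
for option, type_ in [('auto_use', bool), ('offline', bool),
                      ('index_url', str)]:
    if not cfg.has_option('ah_bootstrap', option):
        continue
    if type_ is bool:
        options[option] = cfg.getboolean('ah_bootstrap', option)
    else:
        options[option] = cfg.get('ah_bootstrap', option)

print(options)
```

Any option missing from the section is simply skipped, so the resulting dict only overrides the module-level defaults that the user actually set.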
DIST_NAME = 'astropy-helpers'
PACKAGE_NAME = 'astropy_helpers'
UPPER_VERSION_EXCLUSIVE = None

# Defaults for other options
DOWNLOAD_IF_NEEDED = True
INDEX_URL = 'https://pypi.python.org/simple'
USE_GIT = True
OFFLINE = False
AUTO_UPGRADE = True

# A list of all the configuration options and their required types
CFG_OPTIONS = [
    ('auto_use', bool), ('path', str), ('download_if_needed', bool),
    ('index_url', str), ('use_git', bool), ('offline', bool),
    ('auto_upgrade', bool)
]

# Start off by parsing the setup.cfg file

_err_help_msg = """
If the problem persists consider installing astropy_helpers manually using pip
(`pip install astropy_helpers`) or by manually downloading the source archive,
extracting it, and installing by running `python setup.py install` from the
root of the extracted source code.
"""

SETUP_CFG = ConfigParser()

if os.path.exists('setup.cfg'):

    try:
        SETUP_CFG.read('setup.cfg')
    except Exception as e:
        if DEBUG:
            raise

        log.error(
            "Error reading setup.cfg: {0!r}\n{1} will not be "
            "automatically bootstrapped and package installation may fail."
            "\n{2}".format(e, PACKAGE_NAME, _err_help_msg))

# We used package_name in the package template for a while instead of name
if SETUP_CFG.has_option('metadata', 'name'):
    parent_package = SETUP_CFG.get('metadata', 'name')
elif SETUP_CFG.has_option('metadata', 'package_name'):
    parent_package = SETUP_CFG.get('metadata', 'package_name')
else:
    parent_package = None

if SETUP_CFG.has_option('options', 'python_requires'):

    python_requires = SETUP_CFG.get('options', 'python_requires')

    # The python_requires key has a syntax that can be parsed by SpecifierSet
    # in the packaging package. However, we don't want to have to depend on
    # that package, so instead we can use setuptools (which bundles
    # packaging). We have to add 'python' to parse it with Requirement.
    from pkg_resources import Requirement
    req = Requirement.parse('python' + python_requires)

    # We want the Python version as a string, which we can get from the
    # platform module
    import platform
    # strip off trailing '+' in case this is a dev install of python
    python_version = platform.python_version().strip('+')
    # allow pre-releases to count as 'new enough'
    if not req.specifier.contains(python_version, True):
        if parent_package is None:
            message = "ERROR: Python {} is required by this package\n".format(req.specifier)
        else:
            message = "ERROR: Python {} is required by {}\n".format(req.specifier, parent_package)
        sys.stderr.write(message)
        sys.exit(1)

if sys.version_info < __minimum_python_version__:

    if parent_package is None:
        message = "ERROR: Python {} or later is required by astropy-helpers\n".format(
            __minimum_python_version__)
    else:
        message = "ERROR: Python {} or later is required by astropy-helpers for {}\n".format(
            __minimum_python_version__, parent_package)

    sys.stderr.write(message)
    sys.exit(1)

_str_types = (str, bytes)


# What follows are several import statements meant to deal with install-time
# issues with either missing or misbehaving packages (including making sure
# setuptools itself is installed):

# Check that setuptools 30.3 or later is present
from distutils.version import LooseVersion

try:
    import setuptools
    assert LooseVersion(setuptools.__version__) >= LooseVersion('30.3')
except (ImportError, AssertionError):
    sys.stderr.write("ERROR: setuptools 30.3 or later is required by astropy-helpers\n")
    sys.exit(1)

# typing as a dependency for 1.6.1+ Sphinx causes issues when imported after
# initializing submodule with ah_bootstrap.py
# See discussion and references in
# https://github.com/astropy/astropy-helpers/issues/302

try:
    import typing   # noqa
except ImportError:
    pass

# Note: The following import is required as a workaround to
# https://github.com/astropy/astropy-helpers/issues/89; if we don't import
# this module now, it will get cleaned up after `run_setup` is
# called, but that will later cause the TemporaryDirectory class defined in
# it to stop working when used later on by setuptools
try:
    import setuptools.py31compat   # noqa
except ImportError:
    pass

# matplotlib can cause problems if it is imported from within a call of
# run_setup(), because in some circumstances it will try to write to the
# user's home directory, resulting in a SandboxViolation. See
# https://github.com/matplotlib/matplotlib/pull/4165
# Making sure matplotlib, if it is available, is imported early in the setup
# process can mitigate this (note importing matplotlib.pyplot has the same
# issue)
try:
    import matplotlib
    matplotlib.use('Agg')
    import matplotlib.pyplot
except:
    # Ignore if this fails for *any* reason
    pass

# End compatibility imports...


class _Bootstrapper(object):
    """
    Bootstrapper implementation. See ``use_astropy_helpers`` for parameter
    documentation.
    """

    def __init__(self, path=None, index_url=None, use_git=None, offline=None,
                 download_if_needed=None, auto_upgrade=None):

        if path is None:
            path = PACKAGE_NAME

        if not (isinstance(path, _str_types) or path is False):
            raise TypeError('path must be a string or False')

        if not isinstance(path, str):
            fs_encoding = sys.getfilesystemencoding()
            path = path.decode(fs_encoding)  # path to unicode

        self.path = path

        # Set other option attributes, using defaults where necessary
        self.index_url = index_url if index_url is not None else INDEX_URL
        self.offline = offline if offline is not None else OFFLINE

        # If offline=True, override download and auto-upgrade
        if self.offline:
            download_if_needed = False
            auto_upgrade = False

        self.download = (download_if_needed
                         if download_if_needed is not None
                         else DOWNLOAD_IF_NEEDED)
        self.auto_upgrade = (auto_upgrade
                             if auto_upgrade is not None else AUTO_UPGRADE)

        # If this is a release then the .git directory will not exist so we
        # should not use git.
        git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git'))
        if use_git is None and not git_dir_exists:
            use_git = False

        self.use_git = use_git if use_git is not None else USE_GIT
        # Declared as False by default--later we check if astropy-helpers can
        # be upgraded from PyPI, but only if not using a source distribution
        # (as in the case of import from a git submodule)
        self.is_submodule = False

    @classmethod
    def main(cls, argv=None):
        if argv is None:
            argv = sys.argv

        config = cls.parse_config()
        config.update(cls.parse_command_line(argv))

        auto_use = config.pop('auto_use', False)
        bootstrapper = cls(**config)

        if auto_use:
            # Run the bootstrapper, otherwise the setup.py is using the old
            # use_astropy_helpers() interface, in which case it will run the
            # bootstrapper manually after reconfiguring it.
            bootstrapper.run()

        return bootstrapper

    @classmethod
    def parse_config(cls):
        if not SETUP_CFG.has_section('ah_bootstrap'):
            return {}

        config = {}

        for option, type_ in CFG_OPTIONS:
            if not SETUP_CFG.has_option('ah_bootstrap', option):
                continue

            if type_ is bool:
                value = SETUP_CFG.getboolean('ah_bootstrap', option)
            else:
                value = SETUP_CFG.get('ah_bootstrap', option)

            config[option] = value

        return config

    @classmethod
    def parse_command_line(cls, argv=None):
        if argv is None:
            argv = sys.argv

        config = {}

        # For now we just pop recognized ah_bootstrap options out of the
        # arg list. This is imperfect; in the unlikely case that a setup.py
        # custom command or even custom Distribution class defines an argument
        # of the same name then we will break that. However there's a catch-22
        # here that we can't just do full argument parsing right here, because
        # we don't yet know *how* to parse all possible command-line arguments.
        if '--no-git' in argv:
            config['use_git'] = False
            argv.remove('--no-git')

        if '--offline' in argv:
            config['offline'] = True
            argv.remove('--offline')

        if '--auto-use' in argv:
            config['auto_use'] = True
            argv.remove('--auto-use')

        if '--no-auto-use' in argv:
            config['auto_use'] = False
            argv.remove('--no-auto-use')

        if '--use-system-astropy-helpers' in argv:
            config['auto_use'] = False
            argv.remove('--use-system-astropy-helpers')

        return config

    def run(self):
        strategies = ['local_directory', 'local_file', 'index']
        dist = None

        # First, remove any previously imported versions of astropy_helpers;
        # this is necessary for nested installs where one package's installer
        # is installing another package via setuptools.sandbox.run_setup, as in
        # the case of setup_requires
        for key in list(sys.modules):
            try:
                if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'):
                    del sys.modules[key]
            except AttributeError:
                # Sometimes mysterious non-string things can turn up in
                # sys.modules
                continue

        # Check to see if the path is a submodule
        self.is_submodule = self._check_submodule()

        for strategy in strategies:
            method = getattr(self, 'get_{0}_dist'.format(strategy))
            dist = method()
            if dist is not None:
                break
        else:
            raise _AHBootstrapSystemExit(
                "No source found for the {0!r} package; {0} must be "
                "available and importable as a prerequisite to building "
                "or installing this package.".format(PACKAGE_NAME))

        # This is a bit hacky, but if astropy_helpers was loaded from a
        # directory/submodule its Distribution object gets a "precedence" of
        # "DEVELOP_DIST". However, in other cases it gets a precedence of
        # "EGG_DIST".
        # However, when activating the distribution it will only be placed
        # early on sys.path if it is treated as an EGG_DIST, so always do that
        dist = dist.clone(precedence=pkg_resources.EGG_DIST)

        # Otherwise we found a version of astropy-helpers, so we're done
        # Just activate the found distribution on sys.path--if we did a
        # download this usually happens automatically but it doesn't hurt to
        # do it again
        # Note: Adding the dist to the global working set also activates it
        # (makes it importable on sys.path) by default.
        try:
            pkg_resources.working_set.add(dist, replace=True)
        except TypeError:
            # Some (much) older versions of setuptools do not have the
            # replace=True option here. These versions are old enough that all
            # bets may be off anyways, but it's easy enough to work around just
            # in case...
            if dist.key in pkg_resources.working_set.by_key:
                del pkg_resources.working_set.by_key[dist.key]
            pkg_resources.working_set.add(dist)

    @property
    def config(self):
        """
        A `dict` containing the options this `_Bootstrapper` was configured
        with.
        """

        return dict((optname, getattr(self, optname))
                    for optname, _ in CFG_OPTIONS if hasattr(self, optname))

    def get_local_directory_dist(self):
        """
        Handle importing a vendored package from a subdirectory of the source
        distribution.
""" if not os.path.isdir(self.path): return log.info('Attempting to import astropy_helpers from {0} {1!r}'.format( 'submodule' if self.is_submodule else 'directory', self.path)) dist = self._directory_import() if dist is None: log.warn( 'The requested path {0!r} for importing {1} does not ' 'exist, or does not contain a copy of the {1} ' 'package.'.format(self.path, PACKAGE_NAME)) elif self.auto_upgrade and not self.is_submodule: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_local_file_dist(self): """ Handle importing from a source archive; this also uses setup_requires but points easy_install directly to the source archive. """ if not os.path.isfile(self.path): return log.info('Attempting to unpack and import astropy_helpers from ' '{0!r}'.format(self.path)) try: dist = self._do_download(find_links=[self.path]) except Exception as e: if DEBUG: raise log.warn( 'Failed to import {0} from the specified archive {1!r}: ' '{2}'.format(PACKAGE_NAME, self.path, str(e))) dist = None if dist is not None and self.auto_upgrade: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_index_dist(self): if not self.download: log.warn('Downloading {0!r} disabled.'.format(DIST_NAME)) return None log.warn( "Downloading {0!r}; run setup.py with the --offline option to " "force offline installation.".format(DIST_NAME)) try: dist = self._do_download() except Exception as e: if DEBUG: raise log.warn( 'Failed to download and/or install {0!r} from {1!r}:\n' '{2}'.format(DIST_NAME, self.index_url, str(e))) dist = None # No need to run auto-upgrade here since we've already presumably # gotten the most up-to-date version from the package index return dist def 
_directory_import(self): """ Import astropy_helpers from the given path, which will be added to sys.path. Must return True if the import succeeded, and False otherwise. """ # Return True on success, False on failure but download is allowed, and # otherwise raise SystemExit path = os.path.abspath(self.path) # Use an empty WorkingSet rather than the man # pkg_resources.working_set, since on older versions of setuptools this # will invoke a VersionConflict when trying to install an upgrade ws = pkg_resources.WorkingSet([]) ws.add_entry(path) dist = ws.by_key.get(DIST_NAME) if dist is None: # We didn't find an egg-info/dist-info in the given path, but if a # setup.py exists we can generate it setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): # We use subprocess instead of run_setup from setuptools to # avoid segmentation faults - see the following for more details: # https://github.com/cython/cython/issues/2104 sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path) for dist in pkg_resources.find_distributions(path, True): # There should be only one... 
                    return dist

        return dist

    def _do_download(self, version='', find_links=None):
        if find_links:
            allow_hosts = ''
            index_url = None
        else:
            allow_hosts = None
            index_url = self.index_url

        # Annoyingly, setuptools will not handle other arguments to
        # Distribution (such as options) before handling setup_requires, so it
        # is not straightforward to programmatically augment the arguments
        # which are passed to easy_install
        class _Distribution(Distribution):
            def get_option_dict(self, command_name):
                opts = Distribution.get_option_dict(self, command_name)
                if command_name == 'easy_install':
                    if find_links is not None:
                        opts['find_links'] = ('setup script', find_links)
                    if index_url is not None:
                        opts['index_url'] = ('setup script', index_url)
                    if allow_hosts is not None:
                        opts['allow_hosts'] = ('setup script', allow_hosts)
                return opts

        if version:
            req = '{0}=={1}'.format(DIST_NAME, version)
        else:
            if UPPER_VERSION_EXCLUSIVE is None:
                req = DIST_NAME
            else:
                req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE)

        attrs = {'setup_requires': [req]}

        # NOTE: we need to parse the config file (e.g. setup.cfg) to make sure
        # it honours the options set in the [easy_install] section, and we need
        # to explicitly fetch the requirement eggs as setup_requires does not
        # get honored in recent versions of setuptools:
        # https://github.com/pypa/setuptools/issues/1273

        try:
            context = _verbose if DEBUG else _silence
            with context():
                dist = _Distribution(attrs=attrs)
                try:
                    dist.parse_config_files(ignore_option_errors=True)
                    dist.fetch_build_eggs(req)
                except TypeError:
                    # On older versions of setuptools, ignore_option_errors
                    # doesn't exist, and the above two lines are not needed
                    # so we can just continue
                    pass

            # If the setup_requires succeeded it will have added the new dist
            # to the main working_set
            return pkg_resources.working_set.by_key.get(DIST_NAME)
        except Exception as e:
            if DEBUG:
                raise

            msg = 'Error retrieving {0} from {1}:\n{2}'
            if find_links:
                source = find_links[0]
            elif index_url != INDEX_URL:
                source = index_url
            else:
                source = 'PyPI'

            raise Exception(msg.format(DIST_NAME, source, repr(e)))

    def _do_upgrade(self, dist):
        # Build up a requirement for a higher bugfix release but a lower minor
        # release (so API compatibility is guaranteed)
        next_version = _next_version(dist.parsed_version)

        req = pkg_resources.Requirement.parse(
            '{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version))

        package_index = PackageIndex(index_url=self.index_url)

        upgrade = package_index.obtain(req)

        if upgrade is not None:
            return self._do_download(version=upgrade.version)

    def _check_submodule(self):
        """
        Check if the given path is a git submodule.

        See the docstrings for ``_check_submodule_using_git`` and
        ``_check_submodule_no_git`` for further details.
        """

        if (self.path is None or
                (os.path.exists(self.path) and not os.path.isdir(self.path))):
            return False

        if self.use_git:
            return self._check_submodule_using_git()
        else:
            return self._check_submodule_no_git()

    def _check_submodule_using_git(self):
        """
        Check if the given path is a git submodule.
        If so, attempt to initialize and/or update the submodule if needed.

        This function makes calls to the ``git`` command in subprocesses. The
        ``_check_submodule_no_git`` option uses pure Python to check if the
        given path looks like a git submodule, but it cannot perform updates.
        """

        cmd = ['git', 'submodule', 'status', '--', self.path]

        try:
            log.info('Running `{0}`; use the --no-git option to disable git '
                     'commands'.format(' '.join(cmd)))
            returncode, stdout, stderr = run_cmd(cmd)
        except _CommandNotFound:
            # The git command simply wasn't found; this is most likely the
            # case on user systems that don't have git and are simply
            # trying to install the package from PyPI or a source
            # distribution. Silently ignore this case and simply don't try
            # to use submodules
            return False

        stderr = stderr.strip()

        if returncode != 0 and stderr:
            # Unfortunately the return code alone cannot be relied on, as
            # earlier versions of git returned 0 even if the requested
            # submodule does not exist

            # This is a warning that occurs in perl (from running git
            # submodule) which only occurs with a malformatted locale setting
            # which can happen sometimes on OSX. See again
            # https://github.com/astropy/astropy/issues/2749
            perl_warning = ('perl: warning: Falling back to the standard locale '
                            '("C").')
            if not stderr.strip().endswith(perl_warning):
                # Some other unknown error condition occurred
                log.warn('git submodule command failed '
                         'unexpectedly:\n{0}'.format(stderr))
                return False

        # Output of `git submodule status` is as follows:
        #
        # 1: Status indicator: '-' for submodule is uninitialized, '+' if
        # submodule is initialized but is not at the commit currently indicated
        # in .gitmodules (and thus needs to be updated), or 'U' if the
        # submodule is in an unstable state (i.e. has merge conflicts)
        #
        # 2. SHA-1 hash of the current commit of the submodule (we don't really
        # need this information but it's useful for checking that the output is
        # correct)
        #
        # 3. The output of `git describe` for the submodule's current commit
        # hash (this includes for example what branches the commit is on) but
        # only if the submodule is initialized. We ignore this information for
        # now
        _git_submodule_status_re = re.compile(
            r'^(?P<status>[+-U ])(?P<commit>[0-9a-f]{40}) '
            r'(?P<submodule>\S+)( .*)?$')

        # The stdout should only contain one line--the status of the
        # requested submodule
        m = _git_submodule_status_re.match(stdout)
        if m:
            # Yes, the path *is* a git submodule
            self._update_submodule(m.group('submodule'), m.group('status'))
            return True
        else:
            log.warn(
                'Unexpected output from `git submodule status`:\n{0}\n'
                'Will attempt import from {1!r} regardless.'.format(
                    stdout, self.path))
            return False

    def _check_submodule_no_git(self):
        """
        Like ``_check_submodule_using_git``, but simply parses the .gitmodules
        file to determine if the supplied path is a git submodule, and does not
        exec any subprocesses.

        This can only determine if a path is a submodule--it does not perform
        updates, etc. This function may need to be updated if the format of the
        .gitmodules file is changed between git versions.
        """

        gitmodules_path = os.path.abspath('.gitmodules')

        if not os.path.isfile(gitmodules_path):
            return False

        # This is a minimal reader for gitconfig-style files. It handles a few
        # of the quirks that make gitconfig files incompatible with
        # ConfigParser-style files, but does not support the full gitconfig
        # syntax (just enough needed to read a .gitmodules file).
        gitmodules_fileobj = io.StringIO()

        # Must use io.open for cross-Python-compatible behavior wrt unicode
        with io.open(gitmodules_path) as f:
            for line in f:
                # gitconfig files are more flexible with leading whitespace;
                # just go ahead and remove it
                line = line.lstrip()

                # comments can start with either # or ;
                if line and line[0] in ('#', ';'):
                    continue

                gitmodules_fileobj.write(line)

        gitmodules_fileobj.seek(0)

        cfg = RawConfigParser()

        try:
            cfg.readfp(gitmodules_fileobj)
        except Exception as exc:
            log.warn('Malformatted .gitmodules file: {0}\n'
                     '{1} cannot be assumed to be a git submodule.'.format(
                         exc, self.path))
            return False

        for section in cfg.sections():
            if not cfg.has_option(section, 'path'):
                continue

            submodule_path = cfg.get(section, 'path').rstrip(os.sep)

            if submodule_path == self.path.rstrip(os.sep):
                return True

        return False

    def _update_submodule(self, submodule, status):
        if status == ' ':
            # The submodule is up to date; no action necessary
            return
        elif status == '-':
            if self.offline:
                raise _AHBootstrapSystemExit(
                    "Cannot initialize the {0} submodule in --offline mode; "
                    "this requires being able to clone the submodule from an "
                    "online repository.".format(submodule))
            cmd = ['update', '--init']
            action = 'Initializing'
        elif status == '+':
            cmd = ['update']
            action = 'Updating'
            if self.offline:
                cmd.append('--no-fetch')
        elif status == 'U':
            raise _AHBootstrapSystemExit(
                'Error: Submodule {0} contains unresolved merge conflicts. '
                'Please complete or abandon any changes in the submodule so '
                'that it is in a usable state, then try again.'.format(
                    submodule))
        else:
            log.warn('Unknown status {0!r} for git submodule {1!r}. Will '
                     'attempt to use the submodule as-is, but try to ensure '
                     'that the submodule is in a clean state and contains no '
                     'conflicts or errors.\n{2}'.format(status, submodule,
                                                        _err_help_msg))
            return

        err_msg = None
        cmd = ['git', 'submodule'] + cmd + ['--', submodule]
        log.warn('{0} {1} submodule with: `{2}`'.format(
            action, submodule, ' '.join(cmd)))

        try:
            log.info('Running `{0}`; use the --no-git option to disable git '
                     'commands'.format(' '.join(cmd)))
            returncode, stdout, stderr = run_cmd(cmd)
        except OSError as e:
            err_msg = str(e)
        else:
            if returncode != 0:
                err_msg = stderr

        if err_msg is not None:
            log.warn('An unexpected error occurred updating the git submodule '
                     '{0!r}:\n{1}\n{2}'.format(submodule, err_msg,
                                               _err_help_msg))


class _CommandNotFound(OSError):
    """
    An exception raised when a command run with run_cmd is not found on the
    system.
    """


def run_cmd(cmd):
    """
    Run a command in a subprocess, given as a list of command-line arguments.
    Returns a ``(returncode, stdout, stderr)`` tuple.
    """

    try:
        p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
        # XXX: May block if either stdout or stderr fill their buffers;
        # however for the commands this is currently used for that is
        # unlikely (they should have very brief output)
        stdout, stderr = p.communicate()
    except OSError as e:
        if DEBUG:
            raise

        if e.errno == errno.ENOENT:
            msg = 'Command not found: `{0}`'.format(' '.join(cmd))
            raise _CommandNotFound(msg, cmd)
        else:
            raise _AHBootstrapSystemExit(
                'An unexpected error occurred when running the '
                '`{0}` command:\n{1}'.format(' '.join(cmd), str(e)))

    # Can fail if the default locale is not configured properly. See
    # https://github.com/astropy/astropy/issues/2749. For the purposes under
    # consideration 'latin1' is an acceptable fallback.
    try:
        stdio_encoding = locale.getdefaultlocale()[1] or 'latin1'
    except ValueError:
        # Due to an OSX oddity locale.getdefaultlocale() can also crash
        # depending on the user's locale/language settings.
        # See: http://bugs.python.org/issue18378
        stdio_encoding = 'latin1'

    # Unlikely to fail at this point but even then let's be flexible
    if not isinstance(stdout, str):
        stdout = stdout.decode(stdio_encoding, 'replace')

    if not isinstance(stderr, str):
        stderr = stderr.decode(stdio_encoding, 'replace')

    return (p.returncode, stdout, stderr)


def _next_version(version):
    """
    Given a parsed version from pkg_resources.parse_version, returns a new
    version string with the next minor version.

    Examples
    ========
    >>> _next_version(pkg_resources.parse_version('1.2.3'))
    '1.3.0'
    """

    if hasattr(version, 'base_version'):
        # New version parsing from setuptools >= 8.0
        if version.base_version:
            parts = version.base_version.split('.')
        else:
            parts = []
    else:
        parts = []
        for part in version:
            if part.startswith('*'):
                break

            parts.append(part)

    parts = [int(p) for p in parts]

    if len(parts) < 3:
        parts += [0] * (3 - len(parts))

    major, minor, micro = parts[:3]

    return '{0}.{1}.{2}'.format(major, minor + 1, 0)


class _DummyFile(object):
    """A noop writeable object."""

    errors = ''  # Required for Python 3.x
    encoding = 'utf-8'

    def write(self, s):
        pass

    def flush(self):
        pass


@contextlib.contextmanager
def _verbose():
    yield


@contextlib.contextmanager
def _silence():
    """A context manager that silences sys.stdout and sys.stderr."""

    old_stdout = sys.stdout
    old_stderr = sys.stderr
    sys.stdout = _DummyFile()
    sys.stderr = _DummyFile()
    exception_occurred = False

    try:
        yield
    except:
        exception_occurred = True
        # Go ahead and clean up so that exception handling can work normally
        sys.stdout = old_stdout
        sys.stderr = old_stderr
        raise

    if not exception_occurred:
        sys.stdout = old_stdout
        sys.stderr = old_stderr


class _AHBootstrapSystemExit(SystemExit):
    def __init__(self, *args):
        if not args:
            msg = 'An unknown problem occurred bootstrapping astropy_helpers.'
        else:
            msg = args[0]

        msg += '\n' + _err_help_msg

        super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:])


BOOTSTRAPPER = _Bootstrapper.main()


def use_astropy_helpers(**kwargs):
    """
    Ensure that the `astropy_helpers` module is available and is importable.
    This supports automatic submodule initialization if astropy_helpers is
    included in a project as a git submodule, or will download it from PyPI if
    necessary.

    Parameters
    ----------

    path : str or None, optional
        A filesystem path relative to the root of the project's source code
        that should be added to `sys.path` so that `astropy_helpers` can be
        imported from that path.

        If the path is a git submodule it will automatically be initialized
        and/or updated.

        The path may also be to a ``.tar.gz`` archive of the astropy_helpers
        source distribution. In this case the archive is automatically
        unpacked and made temporarily available on `sys.path` as a ``.egg``
        archive.

        If `None` skip straight to downloading.

    download_if_needed : bool, optional
        If the provided filesystem path is not found an attempt will be made
        to download astropy_helpers from PyPI. It will then be made
        temporarily available on `sys.path` as a ``.egg`` archive (using the
        ``setup_requires`` feature of setuptools). If the ``--offline`` option
        is given at the command line the value of this argument is overridden
        to `False`.

    index_url : str, optional
        If provided, use a different URL for the Python package index than the
        main PyPI server.

    use_git : bool, optional
        If `False` no git commands will be used--this effectively disables
        support for git submodules. If the ``--no-git`` option is given at the
        command line the value of this argument is overridden to `False`.

    auto_upgrade : bool, optional
        By default, when installing a package from a non-development source
        distribution ah_bootstrap will try to automatically check for patch
        releases to astropy-helpers on PyPI and use the patched version over
        any bundled versions. Setting this to `False` will disable that
        functionality. If the ``--offline`` option is given at the command
        line the value of this argument is overridden to `False`.

    offline : bool, optional
        If `True`, disable all actions that require an internet connection,
        including downloading packages from the package index and fetching
        updates to any git submodule. Defaults to `False`.
    """

    global BOOTSTRAPPER

    config = BOOTSTRAPPER.config
    config.update(**kwargs)

    # Create a new bootstrapper with the updated configuration and run it
    BOOTSTRAPPER = _Bootstrapper(**config)
    BOOTSTRAPPER.run()

astropy-healpix-0.5/astropy_helpers/astropy_helpers/__init__.py

try:
    from .version import version as __version__
    from .version import githash as __githash__
except ImportError:
    __version__ = ''
    __githash__ = ''


# If we've made it as far as importing astropy_helpers, we don't need
# ah_bootstrap in sys.modules anymore. Getting rid of it is actually necessary
# if the package we're installing has a setup_requires of another package that
# uses astropy_helpers (and possibly a different version at that)
# See https://github.com/astropy/astropy/issues/3541
import sys
if 'ah_bootstrap' in sys.modules:
    del sys.modules['ah_bootstrap']


# Note, this is repeated from ah_bootstrap.py, but is here too in case this
# astropy-helpers was upgraded to from an older version that did not have this
# check in its ah_bootstrap.
# matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation. See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except Exception: # Ignore if this fails for *any* reason pass import os # Ensure that all module-level code in astropy or other packages knows that # we're in setup mode: if ('__main__' in sys.modules and hasattr(sys.modules['__main__'], '__file__')): filename = os.path.basename(sys.modules['__main__'].__file__) if filename.rstrip('co') == 'setup.py': import builtins builtins._ASTROPY_SETUP_ = True del filename ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5495896 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/0000755000077000000240000000000000000000000024675 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526666202.8335035 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/__init__.py0000644000077000000240000000000000000000000026774 0ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6141853 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/_dummy.py0000644000077000000240000000531200000000000026542 0ustar00tomstaff00000000000000""" Provides a base class for a 'dummy' setup.py command that has no functionality (probably due to a missing requirement). This dummy command can raise an exception when it is run, explaining to the user what dependencies must be met to use this command.
The reason this is at all tricky is that we want the command to be able to provide this message even when the user passes arguments to the command. If we don't know ahead of time what arguments the command can take, this is difficult, because distutils does not allow unknown arguments to be passed to a setup.py command. This hacks around that restriction to provide a useful error message even when a user passes arguments to the dummy implementation of a command. Use this like: try: from some_dependency import SetupCommand except ImportError: from ._dummy import _DummyCommand class SetupCommand(_DummyCommand): description = \ 'Implementation of SetupCommand from some_dependency; ' 'some_dependency must be installed to run this command' # This is the message that will be raised when a user tries to # run this command--define it as a class attribute. error_msg = \ "The 'setup_command' command requires the some_dependency " "package to be installed and importable." """ import sys from setuptools import Command from distutils.errors import DistutilsArgError from textwrap import dedent class _DummyCommandMeta(type): """ Causes an exception to be raised on accessing attributes of a command class so that if ``./setup.py command_name`` is run with additional command-line options we can provide a useful error message instead of the default that tells users the options are unrecognized. 
""" def __init__(cls, name, bases, members): if bases == (Command, object): # This is the _DummyCommand base class, presumably return if not hasattr(cls, 'description'): raise TypeError( "_DummyCommand subclass must have a 'description' " "attribute.") if not hasattr(cls, 'error_msg'): raise TypeError( "_DummyCommand subclass must have an 'error_msg' " "attribute.") def __getattribute__(cls, attr): if attr in ('description', 'error_msg') or attr.startswith('_'): # Allow cls.description to work so that `./setup.py # --help-commands` still works return super(_DummyCommandMeta, cls).__getattribute__(attr) raise DistutilsArgError(cls.error_msg) class _DummyCommand(Command, object, metaclass=_DummyCommandMeta): pass ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574694604.615091 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/build_ext.py0000644000077000000240000002074200000000000027233 0ustar00tomstaff00000000000000import errno import os import shutil from distutils.core import Extension from distutils.ccompiler import get_default_compiler from distutils.command.build_ext import build_ext as DistutilsBuildExt from ..distutils_helpers import get_main_package_directory from ..utils import get_numpy_include_path, import_file __all__ = ['AstropyHelpersBuildExt'] def should_build_with_cython(previous_cython_version, is_release): """ Returns the previously used Cython version (or 'unknown' if not previously built) if Cython should be used to build extension modules from pyx files. """ # Only build with Cython if, of course, Cython is installed, we're in a # development version (i.e. not release) or the Cython-generated source # files haven't been created yet (cython_version == 'unknown'). 
The latter # case can happen even when release is True if checking out a release tag # from the repository have_cython = False try: from Cython import __version__ as cython_version # noqa have_cython = True except ImportError: pass if have_cython and (not is_release or previous_cython_version == 'unknown'): return cython_version else: return False class AstropyHelpersBuildExt(DistutilsBuildExt): """ A custom 'build_ext' command that allows for manipulating some of the C extension options at build time. """ _uses_cython = False _force_rebuild = False def __new__(cls, value, **kwargs): # NOTE: we need to wait until AstropyHelpersBuildExt is initialized to # import setuptools.command.build_ext because when that package is # imported, setuptools tries to import Cython - and if it's not found # it will affect the rest of the build process. This is an issue because # if we import that module at the top of this one, setup_requires won't # have been honored yet, so Cython may not yet be available - and if we # import build_ext too soon, it will think Cython is not available even # if it is then installed when setup_requires is processed. To get around # this we dynamically create a new class that inherits from the # setuptools build_ext, and by this point setup_requires has been # processed.
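The deferred-subclassing trick described in the NOTE above can be reduced to a toy sketch. The class names here are illustrative, not part of astropy-helpers: the fragile base class (setuptools' ``build_ext`` in the real code) is only combined with the command class at instantiation time, inside ``__new__``.

```python
class LateBase:
    # Stands in for setuptools' build_ext, which the real code
    # deliberately avoids touching at module import time.
    def greet(self):
        return 'base'


class Front:
    # Stands in for AstropyHelpersBuildExt: __new__ builds a combined
    # subclass on the fly, then instantiates it directly.
    def __new__(cls):
        class Final(cls, LateBase):
            pass
        return object.__new__(Final)


obj = Front()
assert isinstance(obj, Front) and isinstance(obj, LateBase)
assert obj.greet() == 'base'
```

Because the combined class is created inside ``__new__``, any setup performed before the first instantiation (such as processing ``setup_requires``) has already taken effect by the time the base class is needed.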
from setuptools.command.build_ext import build_ext as SetuptoolsBuildExt class FinalBuildExt(AstropyHelpersBuildExt, SetuptoolsBuildExt): pass new_type = type(cls.__name__, (FinalBuildExt,), dict(cls.__dict__)) obj = SetuptoolsBuildExt.__new__(new_type) obj.__init__(value) return obj def finalize_options(self): # First let's find the package folder, then we can check if the # version and cython_version are accessible self.package_dir = get_main_package_directory(self.distribution) version = import_file(os.path.join(self.package_dir, 'version.py'), name='version').version self.is_release = 'dev' not in version try: self.previous_cython_version = import_file(os.path.join(self.package_dir, 'cython_version.py'), name='cython_version').cython_version except (FileNotFoundError, ImportError): self.previous_cython_version = 'unknown' self._uses_cython = should_build_with_cython(self.previous_cython_version, self.is_release) # Add a copy of the _compiler.so module as well, but only if there # are in fact C modules to compile (otherwise there's no reason to # include a record of the compiler used). Note that self.extensions # may not be set yet, but self.distribution.ext_modules is where any # extension modules passed to setup() can be found extensions = self.distribution.ext_modules if extensions: build_py = self.get_finalized_command('build_py') package_dir = build_py.get_package_dir(self.package_dir) src_path = os.path.relpath( os.path.join(os.path.dirname(__file__), 'src')) shutil.copy(os.path.join(src_path, 'compiler.c'), os.path.join(package_dir, '_compiler.c')) ext = Extension(self.package_dir + '.compiler_version', [os.path.join(package_dir, '_compiler.c')]) extensions.insert(0, ext) super().finalize_options() # If we are using Cython, then make sure we re-build if the version # of Cython that is installed is different from the version last # used to generate the C files. 
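The rebuild decision made in ``finalize_options``, together with ``should_build_with_cython`` above, boils down to a small predicate. A self-contained restatement follows; the function name and argument order are illustrative only:

```python
def needs_cython_build(cython_available, is_release, previous_cython_version):
    # Build from .pyx sources when Cython is importable AND we are either
    # in a development version or the generated C files don't exist yet
    # (previous Cython version recorded as 'unknown').
    return cython_available and (not is_release
                                 or previous_cython_version == 'unknown')


assert needs_cython_build(True, False, '0.29.2')        # dev checkout: regenerate
assert needs_cython_build(True, True, 'unknown')        # release tag, no C files yet
assert not needs_cython_build(True, True, '0.29.2')     # release with shipped C files
assert not needs_cython_build(False, False, 'unknown')  # no Cython available
```

In other words, shipped ``.c`` files are trusted for releases, but any development checkout (or a checkout missing the generated files) regenerates them with whatever Cython is installed.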
if self._uses_cython and self._uses_cython != self.previous_cython_version: self._force_rebuild = True # Regardless of the value of the '--force' option, force a rebuild # if the debug flag changed from the last build if self._force_rebuild: self.force = True def run(self): # For extensions that require 'numpy' in their include dirs, # replace 'numpy' with the actual paths np_include = None for extension in self.extensions: if 'numpy' in extension.include_dirs: if np_include is None: np_include = get_numpy_include_path() idx = extension.include_dirs.index('numpy') extension.include_dirs.insert(idx, np_include) extension.include_dirs.remove('numpy') self._check_cython_sources(extension) # Note that setuptools automatically uses Cython to discover and # build extensions if available, so we don't have to explicitly call # e.g. cythonize. super().run() # Update cython_version.py if building with Cython if self._uses_cython and self._uses_cython != self.previous_cython_version: build_py = self.get_finalized_command('build_py') package_dir = build_py.get_package_dir(self.package_dir) cython_py = os.path.join(package_dir, 'cython_version.py') with open(cython_py, 'w') as f: f.write('# Generated file; do not modify\n') f.write('cython_version = {0!r}\n'.format(self._uses_cython)) if os.path.isdir(self.build_lib): # The build/lib directory may not exist if the build_py # command was not previously run, which may sometimes be # the case self.copy_file(cython_py, os.path.join(self.build_lib, cython_py), preserve_mode=False) def _check_cython_sources(self, extension): """ Where relevant, make sure that the .c files associated with .pyx modules are present (if building without Cython installed). 
""" # Determine the compiler we'll be using if self.compiler is None: compiler = get_default_compiler() else: compiler = self.compiler # Replace .pyx with C-equivalents, unless c files are missing for jdx, src in enumerate(extension.sources): base, ext = os.path.splitext(src) pyxfn = base + '.pyx' cfn = base + '.c' cppfn = base + '.cpp' if not os.path.isfile(pyxfn): continue if self._uses_cython: extension.sources[jdx] = pyxfn else: if os.path.isfile(cfn): extension.sources[jdx] = cfn elif os.path.isfile(cppfn): extension.sources[jdx] = cppfn else: msg = ( 'Could not find C/C++ file {0}.(c/cpp) for Cython ' 'file {1} when building extension {2}. Cython ' 'must be installed to build from a git ' 'checkout.'.format(base, pyxfn, extension.name)) raise IOError(errno.ENOENT, msg, cfn) # Cython (at least as of 0.29.2) uses deprecated Numpy API features # the use of which produces a few warnings when compiling. # These additional flags should squelch those warnings. # TODO: Feel free to remove this if/when a Cython update # removes use of the deprecated Numpy API if compiler == 'unix': extension.extra_compile_args.extend([ '-Wp,-w', '-Wno-unused-function']) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6157284 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/build_sphinx.py0000644000077000000240000002173200000000000027744 0ustar00tomstaff00000000000000 import os import pkgutil import re import shutil import subprocess import sys from distutils.version import LooseVersion from distutils import log from sphinx import __version__ as sphinx_version from sphinx.setup_command import BuildDoc as SphinxBuildDoc SPHINX_LT_16 = LooseVersion(sphinx_version) < LooseVersion('1.6') SPHINX_LT_17 = LooseVersion(sphinx_version) < LooseVersion('1.7') SUBPROCESS_TEMPLATE = """ import os import sys {build_main} os.chdir({srcdir!r}) {sys_path_inserts} for builder in {builders!r}: retcode = build_main(argv={argv!r} + 
['-b', builder, '.', os.path.join({output_dir!r}, builder)]) if retcode != 0: sys.exit(retcode) """ def ensure_sphinx_astropy_installed(): """ Make sure that sphinx-astropy is available. This returns the available version of sphinx-astropy as well as any paths that should be added to sys.path for sphinx-astropy to be available. """ # We've split out the Sphinx part of astropy-helpers into sphinx-astropy # but we want it to be auto-installed seamlessly for anyone using # build_docs. We check if it's already installed, and if not, we install # it to a local .eggs directory and add the eggs to the path (these # have to each be added to the path, we can't add them by simply adding # .eggs to the path) sys_path_inserts = [] sphinx_astropy_version = None try: from sphinx_astropy import __version__ as sphinx_astropy_version # noqa except ImportError: raise ImportError("sphinx-astropy needs to be installed to build " "the documentation.") return sphinx_astropy_version, sys_path_inserts class AstropyBuildDocs(SphinxBuildDoc): """ A version of the ``build_docs`` command that uses the version of Astropy that is built by the setup ``build`` command, rather than whatever is installed on the system. To build docs against the installed version, run ``make html`` in the ``astropy/docs`` directory. """ description = 'Build Sphinx documentation for Astropy environment' user_options = SphinxBuildDoc.user_options[:] user_options.append( ('warnings-returncode', 'w', 'Parses the sphinx output and sets the return code to 1 if there ' 'are any warnings. 
Note that this will cause the sphinx log to ' 'only update when it completes, rather than continuously as is ' 'normally the case.')) user_options.append( ('clean-docs', 'l', 'Completely clean previous builds, including ' 'automodapi-generated files before building new ones')) user_options.append( ('no-intersphinx', 'n', 'Skip intersphinx, even if conf.py says to use it')) user_options.append( ('open-docs-in-browser', 'o', 'Open the docs in a browser (using the webbrowser module) if the ' 'build finishes successfully.')) boolean_options = SphinxBuildDoc.boolean_options[:] boolean_options.append('warnings-returncode') boolean_options.append('clean-docs') boolean_options.append('no-intersphinx') boolean_options.append('open-docs-in-browser') _self_iden_rex = re.compile(r"self\.([^\d\W][\w]+)", re.UNICODE) def initialize_options(self): SphinxBuildDoc.initialize_options(self) self.clean_docs = False self.no_intersphinx = False self.open_docs_in_browser = False self.warnings_returncode = False self.traceback = False def finalize_options(self): # This has to happen before we call the parent class's finalize_options if self.build_dir is None: self.build_dir = 'docs/_build' SphinxBuildDoc.finalize_options(self) # Clear out previous sphinx builds, if requested if self.clean_docs: dirstorm = [os.path.join(self.source_dir, 'api'), os.path.join(self.source_dir, 'generated')] dirstorm.append(self.build_dir) for d in dirstorm: if os.path.isdir(d): log.info('Cleaning directory ' + d) shutil.rmtree(d) else: log.info('Not cleaning directory ' + d + ' because ' 'not present or not a directory') def run(self): # TODO: Break this method up into a few more subroutines and # document them better import webbrowser from urllib.request import pathname2url # This is used at the very end of `run` to decide if sys.exit should # be called. If it's None, it won't be. 
retcode = None # Now make sure Astropy is built and determine where it was built build_cmd = self.reinitialize_command('build') build_cmd.inplace = 0 self.run_command('build') build_cmd = self.get_finalized_command('build') build_cmd_path = os.path.abspath(build_cmd.build_lib) ah_importer = pkgutil.get_importer('astropy_helpers') if ah_importer is None: ah_path = '.' else: ah_path = os.path.abspath(ah_importer.path) if SPHINX_LT_17: build_main = 'from sphinx import build_main' else: build_main = 'from sphinx.cmd.build import build_main' # We need to make sure sphinx-astropy is installed sphinx_astropy_version, extra_paths = ensure_sphinx_astropy_installed() sys_path_inserts = [build_cmd_path, ah_path] + extra_paths sys_path_inserts = os.linesep.join(['sys.path.insert(0, {0!r})'.format(path) for path in sys_path_inserts]) argv = [] if self.warnings_returncode: argv.append('-W') if self.no_intersphinx: # Note, if sphinx_astropy_version is None, this could indicate an # old version of setuptools, but sphinx-astropy is likely ok, so # we can proceed. if sphinx_astropy_version is None or LooseVersion(sphinx_astropy_version) >= LooseVersion('1.1'): argv.extend(['-D', 'disable_intersphinx=1']) else: log.warn('The -n option to disable intersphinx requires ' 'sphinx-astropy>=1.1. 
Ignoring.') # We now need to adjust the flags based on the parent class's options if self.fresh_env: argv.append('-E') if self.all_files: argv.append('-a') if getattr(self, 'pdb', False): argv.append('-P') if getattr(self, 'nitpicky', False): argv.append('-n') if self.traceback: argv.append('-T') # The default verbosity level is 1, so in that case we just don't add a flag if self.verbose == 0: argv.append('-q') elif self.verbose > 1: argv.append('-v') if SPHINX_LT_17: argv.insert(0, 'sphinx-build') if isinstance(self.builder, str): builders = [self.builder] else: builders = self.builder subproccode = SUBPROCESS_TEMPLATE.format(build_main=build_main, srcdir=self.source_dir, sys_path_inserts=sys_path_inserts, builders=builders, argv=argv, output_dir=os.path.abspath(self.build_dir)) log.debug('Starting subprocess of {0} with python code:\n{1}\n' '[CODE END]'.format(sys.executable, subproccode)) proc = subprocess.Popen([sys.executable], stdin=subprocess.PIPE) proc.communicate(subproccode.encode('utf-8')) if proc.returncode != 0: retcode = proc.returncode if retcode is None: if self.open_docs_in_browser: if self.builder == 'html': absdir = os.path.abspath(self.builder_target_dir) index_path = os.path.join(absdir, 'index.html') fileurl = 'file://' + pathname2url(index_path) webbrowser.open(fileurl) else: log.warn('open-docs-in-browser option was given, but ' 'the builder is not html! Ignoring.') # Here we explicitly check proc.returncode since we only want to output # this for cases where the return code really wasn't 0. if proc.returncode: log.warn('Sphinx Documentation subprocess failed with return ' 'code ' + str(proc.returncode)) if retcode is not None: # this is potentially dangerous in that there might be something # after the call to `setup` in `setup.py`, and exiting here will # prevent that from running. But there's no other apparent way # to signal what the return code should be.
sys.exit(retcode) class AstropyBuildSphinx(AstropyBuildDocs): # pragma: no cover def run(self): AstropyBuildDocs.run(self) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5501132 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/src/0000755000077000000240000000000000000000000025464 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6160283 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/src/compiler.c0000644000077000000240000000524000000000000027443 0ustar00tomstaff00000000000000#include <Python.h> /*************************************************************************** * Macros for determining the compiler version. * * These are borrowed from boost, and majorly abridged to include only * the compilers we care about. ***************************************************************************/ #define STRINGIZE(X) DO_STRINGIZE(X) #define DO_STRINGIZE(X) #X #if defined __clang__ /* Clang C++ emulates GCC, so it has to appear early.
*/ # define COMPILER "Clang version " __clang_version__ #elif defined(__INTEL_COMPILER) || defined(__ICL) || defined(__ICC) || defined(__ECC) /* Intel */ # if defined(__INTEL_COMPILER) # define INTEL_VERSION __INTEL_COMPILER # elif defined(__ICL) # define INTEL_VERSION __ICL # elif defined(__ICC) # define INTEL_VERSION __ICC # elif defined(__ECC) # define INTEL_VERSION __ECC # endif # define COMPILER "Intel C compiler version " STRINGIZE(INTEL_VERSION) #elif defined(__GNUC__) /* gcc */ # define COMPILER "GCC version " __VERSION__ #elif defined(__SUNPRO_CC) /* Sun Workshop Compiler */ # define COMPILER "Sun compiler version " STRINGIZE(__SUNPRO_CC) #elif defined(_MSC_VER) /* Microsoft Visual C/C++ Must be last since other compilers define _MSC_VER for compatibility as well */ # if _MSC_VER < 1200 # define COMPILER_VERSION 5.0 # elif _MSC_VER < 1300 # define COMPILER_VERSION 6.0 # elif _MSC_VER == 1300 # define COMPILER_VERSION 7.0 # elif _MSC_VER == 1310 # define COMPILER_VERSION 7.1 # elif _MSC_VER == 1400 # define COMPILER_VERSION 8.0 # elif _MSC_VER == 1500 # define COMPILER_VERSION 9.0 # elif _MSC_VER == 1600 # define COMPILER_VERSION 10.0 # else # define COMPILER_VERSION _MSC_VER # endif # define COMPILER "Microsoft Visual C++ version " STRINGIZE(COMPILER_VERSION) #else /* Fallback */ # define COMPILER "Unknown compiler" #endif /*************************************************************************** * Module-level ***************************************************************************/ struct module_state { /* The Sun compiler can't handle empty structs */ #if defined(__SUNPRO_C) || defined(_MSC_VER) int _dummy; #endif }; static struct PyModuleDef moduledef = { PyModuleDef_HEAD_INIT, "compiler_version", NULL, sizeof(struct module_state), NULL, NULL, NULL, NULL, NULL }; #define INITERROR return NULL PyMODINIT_FUNC PyInit_compiler_version(void) { PyObject* m; m = PyModule_Create(&moduledef); if (m == NULL) INITERROR; PyModule_AddStringConstant(m, 
"compiler", COMPILER); return m; } ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574694604.616569 astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/test.py0000644000077000000240000000267200000000000026235 0ustar00tomstaff00000000000000""" Different implementations of the ``./setup.py test`` command depending on what's locally available. If Astropy v1.1 or later is available it should be possible to import AstropyTest from ``astropy.tests.command``. Otherwise there is a skeleton implementation that allows users to at least discover the ``./setup.py test`` command and learn that they need Astropy to run it. """ import os from ..utils import import_file # Previously these except statements caught only ImportErrors, but there are # some other obscure exceptional conditions that can occur when importing # astropy.tests (at least on older versions) that can cause these imports to # fail try: # If we are testing astropy itself, we need to use import_file to avoid # actually importing astropy (just the file we need). command_file = os.path.join('astropy', 'tests', 'command.py') if os.path.exists(command_file): AstropyTest = import_file(command_file, 'astropy_tests_command').AstropyTest else: import astropy # noqa from astropy.tests.command import AstropyTest except Exception: # No astropy at all--provide the dummy implementation from ._dummy import _DummyCommand class AstropyTest(_DummyCommand): command_name = 'test' description = 'Run the tests for this package' error_msg = ( "The 'test' command requires the astropy package to be " "installed and importable.") ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6168427 astropy-healpix-0.5/astropy_helpers/astropy_helpers/conftest.py0000644000077000000240000000361200000000000025275 0ustar00tomstaff00000000000000# This file contains settings for pytest that are specific to astropy-helpers. 
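``import_file`` (from ``..utils``), used by ``test.py`` above to load ``astropy/tests/command.py`` without importing the whole ``astropy`` package, can be approximated with the stdlib alone. This is an assumed rough equivalent based on how the helper is called, not its actual implementation:

```python
import importlib.util
import os
import tempfile


def load_file_as_module(path, name):
    # Load `path` as a module registered under `name`, without importing
    # the package that the file happens to live in.
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


# Tiny demonstration with a throwaway source file.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, 'snippet.py')
    with open(path, 'w') as f:
        f.write('ANSWER = 42\n')
    mod = load_file_as_module(path, 'snippet_demo')

assert mod.ANSWER == 42
```

Loading a single file this way avoids triggering the parent package's ``__init__`` side effects, which is exactly what ``test.py`` needs when testing astropy itself.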
# Since we run many of the tests in sub-processes, we need to collect coverage # data inside each subprocess and then combine it into a single .coverage file. # To do this we set up a list which run_setup appends coverage objects to. # This is not intended to be used by packages other than astropy-helpers. import os import glob try: from coverage import CoverageData except ImportError: HAS_COVERAGE = False else: HAS_COVERAGE = True if HAS_COVERAGE: SUBPROCESS_COVERAGE = [] def pytest_configure(config): if HAS_COVERAGE: SUBPROCESS_COVERAGE.clear() def pytest_unconfigure(config): if HAS_COVERAGE: # We create an empty coverage data object combined_cdata = CoverageData() # Add all files from astropy_helpers to make sure we compute the total # coverage, not just the coverage of the files that have non-zero # coverage. lines = {} for filename in glob.glob(os.path.join('astropy_helpers', '**', '*.py'), recursive=True): lines[os.path.abspath(filename)] = [] for cdata in SUBPROCESS_COVERAGE: # For each CoverageData object, we go through all the files and # change the filename from one which might be a temporary path # to the local filename. We then only keep files that actually # exist. for filename in cdata.measured_files(): try: pos = filename.rindex('astropy_helpers') except ValueError: continue short_filename = filename[pos:] if os.path.exists(short_filename): lines[os.path.abspath(short_filename)].extend(cdata.lines(filename)) combined_cdata.add_lines(lines) combined_cdata.write_file('.coverage.subprocess') ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6171172 astropy-healpix-0.5/astropy_helpers/astropy_helpers/distutils_helpers.py0000644000077000000240000001767100000000000027230 0ustar00tomstaff00000000000000""" This module contains various utilities for introspecting the distutils module and the setup process. 
Some of these utilities require the `astropy_helpers.setup_helpers.register_commands` function to be called first, as it will affect introspection of setuptools command-line arguments. Other utilities in this module do not have that restriction. """ import os import sys from distutils import ccompiler, log from distutils.dist import Distribution from distutils.errors import DistutilsError from .utils import silence # This function, and any functions that call it, require the setup in # `astropy_helpers.setup_helpers.register_commands` to be run first. def get_dummy_distribution(): """ Returns a distutils Distribution object used to instrument the setup environment before calling the actual setup() function. """ from .setup_helpers import _module_state if _module_state['registered_commands'] is None: raise RuntimeError( 'astropy_helpers.setup_helpers.register_commands() must be ' 'called before using ' 'astropy_helpers.setup_helpers.get_dummy_distribution()') # Pre-parse the Distutils command-line options and config files to check # if the option is set. dist = Distribution({'script_name': os.path.basename(sys.argv[0]), 'script_args': sys.argv[1:]}) dist.cmdclass.update(_module_state['registered_commands']) with silence(): try: dist.parse_config_files() dist.parse_command_line() except (DistutilsError, AttributeError, SystemExit): # Let distutils handle DistutilsErrors itself; AttributeErrors can # get raised for ./setup.py --help; SystemExit can be raised if a # display option was used, for example pass return dist def get_main_package_directory(distribution): """ Given a Distribution object, return the main package directory. """ return min(distribution.packages, key=len).replace('.', os.sep) def get_distutils_option(option, commands): """ Returns the value of the given distutils option.
Parameters ---------- option : str The name of the option commands : list of str The list of commands on which this option is available Returns ------- val : str or None the value of the given distutils option. If the option is not set, returns None. """ dist = get_dummy_distribution() for cmd in commands: cmd_opts = dist.command_options.get(cmd) if cmd_opts is not None and option in cmd_opts: return cmd_opts[option][1] else: return None def get_distutils_build_option(option): """ Returns the value of the given distutils build option. Parameters ---------- option : str The name of the option Returns ------- val : str or None The value of the given distutils build option. If the option is not set, returns None. """ return get_distutils_option(option, ['build', 'build_ext', 'build_clib']) def get_distutils_install_option(option): """ Returns the value of the given distutils install option. Parameters ---------- option : str The name of the option Returns ------- val : str or None The value of the given distutils build option. If the option is not set, returns None. """ return get_distutils_option(option, ['install']) def get_distutils_build_or_install_option(option): """ Returns the value of the given distutils build or install option. Parameters ---------- option : str The name of the option Returns ------- val : str or None The value of the given distutils build or install option. If the option is not set, returns None. """ return get_distutils_option(option, ['build', 'build_ext', 'build_clib', 'install']) def get_compiler_option(): """ Determines the compiler that will be used to build extension modules. Returns ------- compiler : str The compiler option specified for the build, build_ext, or build_clib command; or the default compiler for the platform if none was specified. 
""" compiler = get_distutils_build_option('compiler') if compiler is None: return ccompiler.get_default_compiler() return compiler def add_command_option(command, name, doc, is_bool=False): """ Add a custom option to a setup command. Issues a warning if the option already exists on that command. Parameters ---------- command : str The name of the command as given on the command line name : str The name of the build option doc : str A short description of the option, for the `--help` message is_bool : bool, optional When `True`, the option is a boolean option and doesn't require an associated value. """ dist = get_dummy_distribution() cmdcls = dist.get_command_class(command) if (hasattr(cmdcls, '_astropy_helpers_options') and name in cmdcls._astropy_helpers_options): return attr = name.replace('-', '_') if hasattr(cmdcls, attr): raise RuntimeError( '{0!r} already has a {1!r} class attribute, barring {2!r} from ' 'being usable as a custom option name.'.format(cmdcls, attr, name)) for idx, cmd in enumerate(cmdcls.user_options): if cmd[0] == name: log.warn('Overriding existing {0!r} option ' '{1!r}'.format(command, name)) del cmdcls.user_options[idx] if name in cmdcls.boolean_options: cmdcls.boolean_options.remove(name) break cmdcls.user_options.append((name, None, doc)) if is_bool: cmdcls.boolean_options.append(name) # Distutils' command parsing requires that a command object have an # attribute with the same name as the option (with '-' replaced with '_') # in order for that option to be recognized as valid setattr(cmdcls, attr, None) # This caches the options added through add_command_option so that if it is # run multiple times in the same interpreter repeated adds are ignored # (this way we can still raise a RuntimeError if a custom option overrides # a built-in option) if not hasattr(cmdcls, '_astropy_helpers_options'): cmdcls._astropy_helpers_options = set([name]) else: cmdcls._astropy_helpers_options.add(name) def get_distutils_display_options(): """ Returns a 
set of all the distutils display options in their long and short forms. These are the setup.py arguments such as --name or --version which print the project's metadata and then exit. Returns ------- opts : set The long and short form display option arguments, including the - or -- """ short_display_opts = set('-' + o[1] for o in Distribution.display_options if o[1]) long_display_opts = set('--' + o[0] for o in Distribution.display_options) # Include -h and --help which are not explicitly listed in # Distribution.display_options (as they are handled by optparse) short_display_opts.add('-h') long_display_opts.add('--help') # This isn't the greatest approach to hardcode these commands. # However, there doesn't seem to be a good way to determine # whether build *will be* run as part of the command at this # phase. display_commands = set([ 'clean', 'register', 'setopt', 'saveopts', 'egg_info', 'alias']) return short_display_opts.union(long_display_opts.union(display_commands)) def is_distutils_display_option(): """ Returns True if sys.argv contains any of the distutils display options such as --version or --name. """ display_options = get_distutils_display_options() return bool(set(sys.argv[1:]).intersection(display_options)) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6175387 astropy-healpix-0.5/astropy_helpers/astropy_helpers/git_helpers.py0000644000077000000240000001453700000000000025765 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utilities for retrieving revision information from a project's git repository. 
""" # Do not remove the following comment; it is used by # astropy_helpers.version_helpers to determine the beginning of the code in # this module # BEGIN import locale import os import subprocess import warnings __all__ = ['get_git_devstr'] def _decode_stdio(stream): try: stdio_encoding = locale.getdefaultlocale()[1] or 'utf-8' except ValueError: stdio_encoding = 'utf-8' try: text = stream.decode(stdio_encoding) except UnicodeDecodeError: # Final fallback text = stream.decode('latin1') return text def update_git_devstr(version, path=None): """ Updates the git revision string if and only if the path is being imported directly from a git working copy. This ensures that the revision number in the version string is accurate. """ try: # Quick way to determine if we're in git or not - returns '' if not devstr = get_git_devstr(sha=True, show_warning=False, path=path) except OSError: return version if not devstr: # Probably not in git so just pass silently return version if 'dev' in version: # update to the current git revision version_base = version.split('.dev', 1)[0] devstr = get_git_devstr(sha=False, show_warning=False, path=path) return version_base + '.dev' + devstr else: # otherwise it's already the true/release version return version def get_git_devstr(sha=False, show_warning=True, path=None): """ Determines the number of revisions in this repository. Parameters ---------- sha : bool If True, the full SHA1 hash will be returned. Otherwise, the total count of commits in the repository will be used as a "revision number". show_warning : bool If True, issue a warning if git returns an error code, otherwise errors pass silently. path : str or None If a string, specifies the directory to look in to find the git repository. If `None`, the current working directory is used, and must be the root of the git repository. If given a filename it uses the directory containing that file. 
Returns ------- devversion : str Either a string with the revision number (if `sha` is False), the SHA1 hash of the current commit (if `sha` is True), or an empty string if git version info could not be identified. """ if path is None: path = os.getcwd() if not os.path.isdir(path): path = os.path.abspath(os.path.dirname(path)) if sha: # Faster for getting just the hash of HEAD cmd = ['rev-parse', 'HEAD'] else: cmd = ['rev-list', '--count', 'HEAD'] def run_git(cmd): try: p = subprocess.Popen(['git'] + cmd, cwd=path, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE) stdout, stderr = p.communicate() except OSError as e: if show_warning: warnings.warn('Error running git: ' + str(e)) return (None, b'', b'') if p.returncode == 128: if show_warning: warnings.warn('No git repository present at {0!r}! Using ' 'default dev version.'.format(path)) return (p.returncode, b'', b'') if p.returncode == 129: if show_warning: warnings.warn('Your git looks old (does it support {0}?); ' 'consider upgrading to v1.7.2 or ' 'later.'.format(cmd[0])) return (p.returncode, stdout, stderr) elif p.returncode != 0: if show_warning: warnings.warn('Git failed while determining revision ' 'count: {0}'.format(_decode_stdio(stderr))) return (p.returncode, stdout, stderr) return p.returncode, stdout, stderr returncode, stdout, stderr = run_git(cmd) if not sha and returncode == 128: # git returns 128 if the command is not run from within a git # repository tree. In this case, a warning is produced above but we # return the default dev version of '0'. 
return '0' elif not sha and returncode == 129: # git returns 129 if a command option failed to parse; in # particular this could happen in git versions older than 1.7.2 # where the --count option is not supported # Also use --abbrev-commit and --abbrev=0 to display the minimum # number of characters needed per-commit (rather than the full hash) cmd = ['rev-list', '--abbrev-commit', '--abbrev=0', 'HEAD'] returncode, stdout, stderr = run_git(cmd) # Fall back on the old method of getting all revisions and counting # the lines if returncode == 0: return str(stdout.count(b'\n')) else: return '' elif sha: return _decode_stdio(stdout)[:40] else: return _decode_stdio(stdout).strip() # This function is tested but it is only ever executed within a subprocess when # creating a fake package, so it doesn't get picked up by coverage metrics. def _get_repo_path(pathname, levels=None): # pragma: no cover """ Given a file or directory name, determine the root of the git repository this path is under. If given, this won't look any higher than ``levels`` (that is, if ``levels=0`` then the given path must be the root of the git repository and is returned if so. Returns `None` if the given path could not be determined to belong to a git repo. 
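The line-counting fallback used for old git versions is just a newline count over the raw ``rev-list`` output; a sketch:

```python
def count_commits(rev_list_output):
    # Fallback used when 'git rev-list --count' is unavailable (git < 1.7.2):
    # each commit appears on its own line of the abbreviated rev-list
    # output, so counting newlines counts commits.
    return str(rev_list_output.count(b'\n'))
```

This is why the older method is slower: the full (abbreviated) revision list must be produced and counted rather than letting git report a single number.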
""" if os.path.isfile(pathname): current_dir = os.path.abspath(os.path.dirname(pathname)) elif os.path.isdir(pathname): current_dir = os.path.abspath(pathname) else: return None current_level = 0 while levels is None or current_level <= levels: if os.path.exists(os.path.join(current_dir, '.git')): return current_dir current_level += 1 if current_dir == os.path.dirname(current_dir): break current_dir = os.path.dirname(current_dir) return None ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6178815 astropy-healpix-0.5/astropy_helpers/astropy_helpers/openmp_helpers.py0000644000077000000240000002216100000000000026470 0ustar00tomstaff00000000000000# This module defines functions that can be used to check whether OpenMP is # available and if so what flags to use. To use this, import the # add_openmp_flags_if_available function in a setup_package.py file where you # are defining your extensions: # # from astropy_helpers.openmp_helpers import add_openmp_flags_if_available # # then call it with a single extension as the only argument: # # add_openmp_flags_if_available(extension) # # this will add the OpenMP flags if available. import os import sys import glob import time import datetime import tempfile import subprocess from distutils import log from distutils.ccompiler import new_compiler from distutils.sysconfig import customize_compiler, get_config_var from distutils.errors import CompileError, LinkError from .distutils_helpers import get_compiler_option __all__ = ['add_openmp_flags_if_available'] try: # Check if this has already been instantiated, only set the default once. _ASTROPY_DISABLE_SETUP_WITH_OPENMP_ except NameError: import builtins # It hasn't, so do so. 
builtins._ASTROPY_DISABLE_SETUP_WITH_OPENMP_ = False

CCODE = """
#include <omp.h>
#include <stdio.h>
int main(void) {
  #pragma omp parallel
  printf("nthreads=%d\\n", omp_get_num_threads());
  return 0;
}
"""


def _get_flag_value_from_var(flag, var, delim=' '):
    """
    Extract flags from an environment variable.

    Parameters
    ----------
    flag : str
        The flag to extract, for example '-I' or '-L'
    var : str
        The environment variable to extract the flag from, e.g. CFLAGS or
        LDFLAGS.
    delim : str, optional
        The delimiter separating flags inside the environment variable

    Examples
    --------
    Let's assume LDFLAGS is set to '-L/usr/local/include -customflag'. This
    function will then return the following:

        >>> _get_flag_value_from_var('-L', 'LDFLAGS')
        '/usr/local/include'

    Notes
    -----
    Environment variables are first checked in ``os.environ[var]``, then in
    ``distutils.sysconfig.get_config_var(var)``.

    This function is not supported on Windows.
    """
    if sys.platform.startswith('win'):
        return None

    # Simple input validation
    if not var or not flag:
        return None
    flag_length = len(flag)
    if not flag_length:
        return None

    # Look for var in os.environ then in get_config_var
    if var in os.environ:
        flags = os.environ[var]
    else:
        try:
            flags = get_config_var(var)
        except KeyError:
            return None

    # Extract flag from {var:value}
    if flags:
        for item in flags.split(delim):
            if item.startswith(flag):
                return item[flag_length:]


def get_openmp_flags():
    """
    Utility for returning compiler and linker flags possibly needed for
    OpenMP support.

    Returns
    -------
    result : `{'compiler_flags':<flags>, 'linker_flags':<flags>}`

    Notes
    -----
    The flags returned are not tested for validity, use
    `check_openmp_support(openmp_flags=get_openmp_flags())` to do so.
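The flag extraction described above amounts to splitting the variable's value on a delimiter and matching prefixes. A simplified sketch that takes the flags string directly instead of consulting ``os.environ``:

```python
def get_flag_value(flag, flags, delim=' '):
    # Sketch of the core of _get_flag_value_from_var: return the remainder
    # of the first token that starts with ``flag``, or None if absent.
    for item in flags.split(delim):
        if item.startswith(flag):
            return item[len(flag):]
    return None
```

For example, ``get_flag_value('-L', '-L/usr/local/lib -customflag')`` yields ``'/usr/local/lib'``.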
""" compile_flags = [] link_flags = [] if get_compiler_option() == 'msvc': compile_flags.append('-openmp') else: include_path = _get_flag_value_from_var('-I', 'CFLAGS') if include_path: compile_flags.append('-I' + include_path) lib_path = _get_flag_value_from_var('-L', 'LDFLAGS') if lib_path: link_flags.append('-L' + lib_path) link_flags.append('-Wl,-rpath,' + lib_path) compile_flags.append('-fopenmp') link_flags.append('-fopenmp') return {'compiler_flags': compile_flags, 'linker_flags': link_flags} def check_openmp_support(openmp_flags=None): """ Check whether OpenMP test code can be compiled and run. Parameters ---------- openmp_flags : dict, optional This should be a dictionary with keys ``compiler_flags`` and ``linker_flags`` giving the compiliation and linking flags respectively. These are passed as `extra_postargs` to `compile()` and `link_executable()` respectively. If this is not set, the flags will be automatically determined using environment variables. Returns ------- result : bool `True` if the test passed, `False` otherwise. """ ccompiler = new_compiler() customize_compiler(ccompiler) if not openmp_flags: # customize_compiler() extracts info from os.environ. If certain keys # exist it uses these plus those from sysconfig.get_config_vars(). # If the key is missing in os.environ it is not extracted from # sysconfig.get_config_var(). E.g. 'LDFLAGS' get left out, preventing # clang from finding libomp.dylib because -L is not passed to # linker. Call get_openmp_flags() to get flags missed by # customize_compiler(). 
openmp_flags = get_openmp_flags() compile_flags = openmp_flags.get('compiler_flags') link_flags = openmp_flags.get('linker_flags') tmp_dir = tempfile.mkdtemp() start_dir = os.path.abspath('.') try: os.chdir(tmp_dir) # Write test program with open('test_openmp.c', 'w') as f: f.write(CCODE) os.mkdir('objects') # Compile, test program ccompiler.compile(['test_openmp.c'], output_dir='objects', extra_postargs=compile_flags) # Link test program objects = glob.glob(os.path.join('objects', '*' + ccompiler.obj_extension)) ccompiler.link_executable(objects, 'test_openmp', extra_postargs=link_flags) # Run test program output = subprocess.check_output('./test_openmp') output = output.decode(sys.stdout.encoding or 'utf-8').splitlines() if 'nthreads=' in output[0]: nthreads = int(output[0].strip().split('=')[1]) if len(output) == nthreads: is_openmp_supported = True else: log.warn("Unexpected number of lines from output of test OpenMP " "program (output was {0})".format(output)) is_openmp_supported = False else: log.warn("Unexpected output from test OpenMP " "program (output was {0})".format(output)) is_openmp_supported = False except (CompileError, LinkError, subprocess.CalledProcessError): is_openmp_supported = False finally: os.chdir(start_dir) return is_openmp_supported def is_openmp_supported(): """ Determine whether the build compiler has OpenMP support. """ log_threshold = log.set_threshold(log.FATAL) ret = check_openmp_support() log.set_threshold(log_threshold) return ret def add_openmp_flags_if_available(extension): """ Add OpenMP compilation flags, if supported (if not a warning will be printed to the console and no flags will be added.) Returns `True` if the flags were added, `False` otherwise. 
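The output check above (one ``nthreads=N`` line per OpenMP thread, with the first line reporting ``N``) can be isolated as a small predicate; a sketch:

```python
def openmp_output_ok(lines):
    # Mirrors the success criterion in check_openmp_support: the test
    # program prints 'nthreads=N' once per thread, so a working OpenMP
    # build produces exactly N such lines.
    if not lines or 'nthreads=' not in lines[0]:
        return False
    nthreads = int(lines[0].strip().split('=')[1])
    return len(lines) == nthreads
```

A single-threaded fallback build would print one line claiming several threads, which this check correctly rejects.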
""" if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_: log.info("OpenMP support has been explicitly disabled.") return False openmp_flags = get_openmp_flags() using_openmp = check_openmp_support(openmp_flags=openmp_flags) if using_openmp: compile_flags = openmp_flags.get('compiler_flags') link_flags = openmp_flags.get('linker_flags') log.info("Compiling Cython/C/C++ extension with OpenMP support") extension.extra_compile_args.extend(compile_flags) extension.extra_link_args.extend(link_flags) else: log.warn("Cannot compile Cython/C/C++ extension with OpenMP, reverting " "to non-parallel code") return using_openmp _IS_OPENMP_ENABLED_SRC = """ # Autogenerated by {packagetitle}'s setup.py on {timestamp!s} def is_openmp_enabled(): \"\"\" Determine whether this package was built with OpenMP support. \"\"\" return {return_bool} """[1:] def generate_openmp_enabled_py(packagename, srcdir='.', disable_openmp=None): """ Generate ``package.openmp_enabled.is_openmp_enabled``, which can then be used to determine, post build, whether the package was built with or without OpenMP support. 
""" if packagename.lower() == 'astropy': packagetitle = 'Astropy' else: packagetitle = packagename epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time())) timestamp = datetime.datetime.utcfromtimestamp(epoch) if disable_openmp is not None: import builtins builtins._ASTROPY_DISABLE_SETUP_WITH_OPENMP_ = disable_openmp if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_: log.info("OpenMP support has been explicitly disabled.") openmp_support = False if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_ else is_openmp_supported() src = _IS_OPENMP_ENABLED_SRC.format(packagetitle=packagetitle, timestamp=timestamp, return_bool=openmp_support) package_srcdir = os.path.join(srcdir, *packagename.split('.')) is_openmp_enabled_py = os.path.join(package_srcdir, 'openmp_enabled.py') with open(is_openmp_enabled_py, 'w') as f: f.write(src) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6193264 astropy-healpix-0.5/astropy_helpers/astropy_helpers/setup_helpers.py0000644000077000000240000007055200000000000026341 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This module contains a number of utilities for use during setup/build/packaging that are useful to astropy as a whole. 
""" import collections import os import re import subprocess import sys import traceback import warnings from configparser import ConfigParser import builtins from distutils import log from distutils.errors import DistutilsOptionError, DistutilsModuleError from distutils.core import Extension from distutils.core import Command from distutils.command.sdist import sdist as DistutilsSdist from setuptools import setup as setuptools_setup from setuptools.config import read_configuration from setuptools import find_packages as _find_packages from .distutils_helpers import (add_command_option, get_compiler_option, get_dummy_distribution, get_distutils_build_option, get_distutils_build_or_install_option) from .version_helpers import get_pkg_version_module, generate_version_py from .utils import (walk_skip_hidden, import_file, extends_doc, resolve_name, AstropyDeprecationWarning) from .commands.build_ext import AstropyHelpersBuildExt from .commands.test import AstropyTest # These imports are not used in this module, but are included for backwards # compat with older versions of this module from .utils import get_numpy_include_path, write_if_different # noqa __all__ = ['register_commands', 'get_package_info'] _module_state = {'registered_commands': None, 'have_sphinx': False, 'package_cache': None, 'exclude_packages': set(), 'excludes_too_late': False} try: import sphinx # noqa _module_state['have_sphinx'] = True except ValueError as e: # This can occur deep in the bowels of Sphinx's imports by way of docutils # and an occurrence of this bug: http://bugs.python.org/issue18378 # In this case sphinx is effectively unusable if 'unknown locale' in e.args[0]: log.warn( "Possible misconfiguration of one of the environment variables " "LC_ALL, LC_CTYPES, LANG, or LANGUAGE. 
For an example of how to " "configure your system's language environment on OSX see " "http://blog.remibergsma.com/2012/07/10/" "setting-locales-correctly-on-mac-osx-terminal-application/") except ImportError: pass except SyntaxError: # occurs if markupsafe is recent version, which doesn't support Python 3.2 pass def setup(**kwargs): """ A wrapper around setuptools' setup() function that automatically sets up custom commands, generates a version file, and customizes the setup process via the ``setup_package.py`` files. """ # DEPRECATED: store the package name in a built-in variable so it's easy # to get from other parts of the setup infrastructure. We should phase this # out in packages that use it - the cookiecutter template should now be # able to put the right package name where needed. conf = read_configuration('setup.cfg') builtins._ASTROPY_PACKAGE_NAME_ = conf['metadata']['name'] # Create a dictionary with setup command overrides. Note that this gets # information about the package (name and version) from the setup.cfg file. cmdclass = register_commands() # Freeze build information in version.py. Note that this gets information # about the package (name and version) from the setup.cfg file. version = generate_version_py() # Get configuration information from all of the various subpackages. # See the docstring for setup_helpers.update_package_files for more # details. package_info = get_package_info() package_info['cmdclass'] = cmdclass package_info['version'] = version # Override using any specified keyword arguments package_info.update(kwargs) setuptools_setup(**package_info) def adjust_compiler(package): warnings.warn( 'The adjust_compiler function in setup.py is ' 'deprecated and can be removed from your setup.py.', AstropyDeprecationWarning) def get_debug_option(packagename): """ Determines if the build is in debug mode. Returns ------- debug : bool True if the current build was started with the debug option, False otherwise. 
""" try: current_debug = get_pkg_version_module(packagename, fromlist=['debug'])[0] except (ImportError, AttributeError): current_debug = None # Only modify the debug flag if one of the build commands was explicitly # run (i.e. not as a sub-command of something else) dist = get_dummy_distribution() if any(cmd in dist.commands for cmd in ['build', 'build_ext']): debug = bool(get_distutils_build_option('debug')) else: debug = bool(current_debug) if current_debug is not None and current_debug != debug: build_ext_cmd = dist.get_command_class('build_ext') build_ext_cmd._force_rebuild = True return debug def add_exclude_packages(excludes): if _module_state['excludes_too_late']: raise RuntimeError( "add_package_excludes must be called before all other setup helper " "functions in order to properly handle excluded packages") _module_state['exclude_packages'].update(set(excludes)) def register_commands(package=None, version=None, release=None, srcdir='.'): """ This function generates a dictionary containing customized commands that can then be passed to the ``cmdclass`` argument in ``setup()``. """ if package is not None: warnings.warn('The package argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the package name in setup.cfg instead', AstropyDeprecationWarning) if version is not None: warnings.warn('The version argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the version number in setup.cfg instead', AstropyDeprecationWarning) if release is not None: warnings.warn('The release argument to generate_version_py has ' 'been deprecated and will be removed in future. We now ' 'use the presence of the "dev" string in the version to ' 'determine whether this is a release', AstropyDeprecationWarning) # We use ConfigParser instead of read_configuration here because the latter # only reads in keys recognized by setuptools, but we need to access # package_name below. 
conf = ConfigParser() conf.read('setup.cfg') if conf.has_option('metadata', 'name'): package = conf.get('metadata', 'name') elif conf.has_option('metadata', 'package_name'): # The package-template used package_name instead of name for a while warnings.warn('Specifying the package name using the "package_name" ' 'option in setup.cfg is deprecated - use the "name" ' 'option instead.', AstropyDeprecationWarning) package = conf.get('metadata', 'package_name') elif package is not None: # deprecated pass else: sys.stderr.write('ERROR: Could not read package name from setup.cfg\n') sys.exit(1) if _module_state['registered_commands'] is not None: return _module_state['registered_commands'] if _module_state['have_sphinx']: try: from .commands.build_sphinx import (AstropyBuildSphinx, AstropyBuildDocs) except ImportError: AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx else: AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx _module_state['registered_commands'] = registered_commands = { 'test': generate_test_command(package), # Use distutils' sdist because it respects package_data. # setuptools/distributes sdist requires duplication of information in # MANIFEST.in 'sdist': DistutilsSdist, 'build_ext': AstropyHelpersBuildExt, 'build_sphinx': AstropyBuildSphinx, 'build_docs': AstropyBuildDocs } # Need to override the __name__ here so that the commandline options are # presented as being related to the "build" command, for example; normally # this wouldn't be necessary since commands also have a command_name # attribute, but there is a bug in distutils' help display code that it # uses __name__ instead of command_name. Yay distutils! 
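The lookup order above (``name`` first, then the deprecated ``package_name``) can be sketched against an in-memory configuration:

```python
from configparser import ConfigParser


def read_package_name(cfg_text):
    # Sketch of the resolution order in register_commands: prefer the
    # standard [metadata] name key, fall back to the deprecated
    # package_name key used by older package templates.
    conf = ConfigParser()
    conf.read_string(cfg_text)
    if conf.has_option('metadata', 'name'):
        return conf.get('metadata', 'name')
    if conf.has_option('metadata', 'package_name'):
        return conf.get('metadata', 'package_name')
    return None
```

Returning ``None`` here corresponds to the error path in the real code, which writes to stderr and exits.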
for name, cls in registered_commands.items(): cls.__name__ = name # Add a few custom options; more of these can be added by specific packages # later for option in [ ('use-system-libraries', "Use system libraries whenever possible", True)]: add_command_option('build', *option) add_command_option('install', *option) add_command_hooks(registered_commands, srcdir=srcdir) return registered_commands def add_command_hooks(commands, srcdir='.'): """ Look through setup_package.py modules for functions with names like ``pre__hook`` and ``post__hook`` where ```` is the name of a ``setup.py`` command (e.g. build_ext). If either hook is present this adds a wrapped version of that command to the passed in ``commands`` `dict`. ``commands`` may be pre-populated with other custom distutils command classes that should be wrapped if there are hooks for them (e.g. `AstropyBuildPy`). """ hook_re = re.compile(r'^(pre|post)_(.+)_hook$') # Distutils commands have a method of the same name, but it is not a # *classmethod* (which probably didn't exist when distutils was first # written) def get_command_name(cmdcls): if hasattr(cmdcls, 'command_name'): return cmdcls.command_name else: return cmdcls.__name__ packages = find_packages(srcdir) dist = get_dummy_distribution() hooks = collections.defaultdict(dict) for setuppkg in iter_setup_packages(srcdir, packages): for name, obj in vars(setuppkg).items(): match = hook_re.match(name) if not match: continue hook_type = match.group(1) cmd_name = match.group(2) if hook_type not in hooks[cmd_name]: hooks[cmd_name][hook_type] = [] hooks[cmd_name][hook_type].append((setuppkg.__name__, obj)) for cmd_name, cmd_hooks in hooks.items(): commands[cmd_name] = generate_hooked_command( cmd_name, dist.get_command_class(cmd_name), cmd_hooks) def generate_hooked_command(cmd_name, cmd_cls, hooks): """ Returns a generated subclass of ``cmd_cls`` that runs the pre- and post-command hooks for that command before and after the ``cmd_cls.run`` method. 
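Hook discovery hinges on the ``hook_re`` pattern; a sketch of how it splits a hook function's name into its phase and target command:

```python
import re

# Same pattern as hook_re in add_command_hooks.
hook_re = re.compile(r'^(pre|post)_(.+)_hook$')


def parse_hook(name):
    # Returns (phase, command) for names like 'pre_build_ext_hook',
    # or None for names that are not hooks.
    m = hook_re.match(name)
    return (m.group(1), m.group(2)) if m else None
```

So a function ``pre_build_ext_hook`` in a ``setup_package.py`` registers as a pre-phase hook for the ``build_ext`` command.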
""" def run(self, orig_run=cmd_cls.run): self.run_command_hooks('pre_hooks') orig_run(self) self.run_command_hooks('post_hooks') return type(cmd_name, (cmd_cls, object), {'run': run, 'run_command_hooks': run_command_hooks, 'pre_hooks': hooks.get('pre', []), 'post_hooks': hooks.get('post', [])}) def run_command_hooks(cmd_obj, hook_kind): """Run hooks registered for that command and phase. *cmd_obj* is a finalized command object; *hook_kind* is either 'pre_hook' or 'post_hook'. """ hooks = getattr(cmd_obj, hook_kind, None) if not hooks: return for modname, hook in hooks: if isinstance(hook, str): try: hook_obj = resolve_name(hook) except ImportError as exc: raise DistutilsModuleError( 'cannot find hook {0}: {1}'.format(hook, exc)) else: hook_obj = hook if not callable(hook_obj): raise DistutilsOptionError('hook {0!r} is not callable' % hook) log.info('running {0} from {1} for {2} command'.format( hook_kind.rstrip('s'), modname, cmd_obj.get_command_name())) try: hook_obj(cmd_obj) except Exception: log.error('{0} command hook {1} raised an exception: %s\n'.format( hook_obj.__name__, cmd_obj.get_command_name())) log.error(traceback.format_exc()) sys.exit(1) def generate_test_command(package_name): """ Creates a custom 'test' command for the given package which sets the command's ``package_name`` class attribute to the name of the package being tested. """ return type(package_name.title() + 'Test', (AstropyTest,), {'package_name': package_name}) def update_package_files(srcdir, extensions, package_data, packagenames, package_dirs): """ This function is deprecated and maintained for backward compatibility with affiliated packages. Affiliated packages should update their setup.py to use `get_package_info` instead. 
""" info = get_package_info(srcdir) extensions.extend(info['ext_modules']) package_data.update(info['package_data']) packagenames = list(set(packagenames + info['packages'])) package_dirs.update(info['package_dir']) def get_package_info(srcdir='.', exclude=()): """ Collates all of the information for building all subpackages and returns a dictionary of keyword arguments that can be passed directly to `distutils.setup`. The purpose of this function is to allow subpackages to update the arguments to the package's ``setup()`` function in its setup.py script, rather than having to specify all extensions/package data directly in the ``setup.py``. See Astropy's own ``setup.py`` for example usage and the Astropy development docs for more details. This function obtains that information by iterating through all packages in ``srcdir`` and locating a ``setup_package.py`` module. This module can contain the following functions: ``get_extensions()``, ``get_package_data()``, ``get_build_options()``, and ``get_external_libraries()``. Each of those functions take no arguments. - ``get_extensions`` returns a list of `distutils.extension.Extension` objects. - ``get_package_data()`` returns a dict formatted as required by the ``package_data`` argument to ``setup()``. - ``get_build_options()`` returns a list of tuples describing the extra build options to add. - ``get_external_libraries()`` returns a list of libraries that can optionally be built using external dependencies. """ ext_modules = [] packages = [] package_dir = {} # Read in existing package data, and add to it below setup_cfg = os.path.join(srcdir, 'setup.cfg') if os.path.exists(setup_cfg): conf = read_configuration(setup_cfg) if 'options' in conf and 'package_data' in conf['options']: package_data = conf['options']['package_data'] else: package_data = {} else: package_data = {} if exclude: warnings.warn( "Use of the exclude parameter is no longer supported since it does " "not work as expected. 
Use add_exclude_packages instead. Note that " "it must be called prior to any other calls from setup helpers.", AstropyDeprecationWarning) # Use the find_packages tool to locate all packages and modules packages = find_packages(srcdir, exclude=exclude) # Update package_dir if the package lies in a subdirectory if srcdir != '.': package_dir[''] = srcdir # For each of the setup_package.py modules, extract any # information that is needed to install them. The build options # are extracted first, so that their values will be available in # subsequent calls to `get_extensions`, etc. for setuppkg in iter_setup_packages(srcdir, packages): if hasattr(setuppkg, 'get_build_options'): options = setuppkg.get_build_options() for option in options: add_command_option('build', *option) if hasattr(setuppkg, 'get_external_libraries'): libraries = setuppkg.get_external_libraries() for library in libraries: add_external_library(library) for setuppkg in iter_setup_packages(srcdir, packages): # get_extensions must include any Cython extensions by their .pyx # filename. if hasattr(setuppkg, 'get_extensions'): ext_modules.extend(setuppkg.get_extensions()) if hasattr(setuppkg, 'get_package_data'): package_data.update(setuppkg.get_package_data()) # Locate any .pyx files not already specified, and add their extensions in. # The default include dirs include numpy to facilitate numerical work. ext_modules.extend(get_cython_extensions(srcdir, packages, ext_modules, ['numpy'])) # Now remove extensions that have the special name 'skip_cython', as they # exist Only to indicate that the cython extensions shouldn't be built for i, ext in reversed(list(enumerate(ext_modules))): if ext.name == 'skip_cython': del ext_modules[i] # On Microsoft compilers, we need to pass the '/MANIFEST' # commandline argument. This was the default on MSVC 9.0, but is # now required on MSVC 10.0, but it doesn't seem to hurt to add # it unconditionally. 
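The ``skip_cython`` filtering mentioned above deletes in reverse index order so that earlier deletions cannot shift the indices still pending; a sketch with a stub extension class:

```python
class StubExtension:
    # Minimal stand-in for distutils.core.Extension.
    def __init__(self, name):
        self.name = name


def drop_skip_cython(ext_modules):
    # Mirrors the loop in get_package_info: extensions named 'skip_cython'
    # exist only to signal that Cython extensions should not be built, so
    # they are removed from the list (in reverse, to keep indices valid).
    for i, ext in reversed(list(enumerate(ext_modules))):
        if ext.name == 'skip_cython':
            del ext_modules[i]
    return ext_modules
```

Iterating forward while deleting would skip the element after each removal; the reversed enumeration avoids that classic pitfall.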
if get_compiler_option() == 'msvc': for ext in ext_modules: ext.extra_link_args.append('/MANIFEST') return { 'ext_modules': ext_modules, 'packages': packages, 'package_dir': package_dir, 'package_data': package_data, } def iter_setup_packages(srcdir, packages): """ A generator that finds and imports all of the ``setup_package.py`` modules in the source packages. Returns ------- modgen : generator A generator that yields (modname, mod), where `mod` is the module and `modname` is the module name for the ``setup_package.py`` modules. """ for packagename in packages: package_parts = packagename.split('.') package_path = os.path.join(srcdir, *package_parts) setup_package = os.path.relpath( os.path.join(package_path, 'setup_package.py')) if os.path.isfile(setup_package): module = import_file(setup_package, name=packagename + '.setup_package') yield module def iter_pyx_files(package_dir, package_name): """ A generator that yields Cython source files (ending in '.pyx') in the source packages. Returns ------- pyxgen : generator A generator that yields (extmod, fullfn) where `extmod` is the full name of the module that the .pyx file would live in based on the source directory structure, and `fullfn` is the path to the .pyx file. """ for dirpath, dirnames, filenames in walk_skip_hidden(package_dir): for fn in filenames: if fn.endswith('.pyx'): fullfn = os.path.relpath(os.path.join(dirpath, fn)) # Package must match file name extmod = '.'.join([package_name, fn[:-4]]) yield (extmod, fullfn) break # Don't recurse into subdirectories def get_cython_extensions(srcdir, packages, prevextensions=tuple(), extincludedirs=None): """ Looks for Cython files and generates Extensions if needed. Parameters ---------- srcdir : str Path to the root of the source directory to search. prevextensions : list of `~distutils.core.Extension` objects The extensions that are already defined. Any .pyx files already here will be ignored. 
extincludedirs : list of str or None Directories to include as the `include_dirs` argument to the generated `~distutils.core.Extension` objects. Returns ------- exts : list of `~distutils.core.Extension` objects The new extensions that are needed to compile all .pyx files (does not include any already in `prevextensions`). """ # Vanilla setuptools and old versions of distribute include Cython files # as .c files in the sources, not .pyx, so we cannot simply look for # existing .pyx sources in the previous sources, but we should also check # for .c files with the same remaining filename. So we look for .pyx and # .c files, and we strip the extension. prevsourcepaths = [] ext_modules = [] for ext in prevextensions: for s in ext.sources: if s.endswith(('.pyx', '.c', '.cpp')): sourcepath = os.path.realpath(os.path.splitext(s)[0]) prevsourcepaths.append(sourcepath) for package_name in packages: package_parts = package_name.split('.') package_path = os.path.join(srcdir, *package_parts) for extmod, pyxfn in iter_pyx_files(package_path, package_name): sourcepath = os.path.realpath(os.path.splitext(pyxfn)[0]) if sourcepath not in prevsourcepaths: ext_modules.append(Extension(extmod, [pyxfn], include_dirs=extincludedirs)) return ext_modules class DistutilsExtensionArgs(collections.defaultdict): """ A special dictionary whose default values are the empty list. This is useful for building up a set of arguments for `distutils.Extension` without worrying whether the entry is already present. """ def __init__(self, *args, **kwargs): def default_factory(): return [] super(DistutilsExtensionArgs, self).__init__( default_factory, *args, **kwargs) def update(self, other): for key, val in other.items(): self[key].extend(val) def pkg_config(packages, default_libraries, executable='pkg-config'): """ Uses pkg-config to update a set of distutils Extension arguments to include the flags necessary to link against the given packages. 
If the pkg-config lookup fails, default_libraries is applied to libraries. Parameters ---------- packages : list of str A list of pkg-config packages to look up. default_libraries : list of str A list of library names to use if the pkg-config lookup fails. Returns ------- config : dict A dictionary containing keyword arguments to `distutils.Extension`. These entries include: - ``include_dirs``: A list of include directories - ``library_dirs``: A list of library directories - ``libraries``: A list of libraries - ``define_macros``: A list of macro defines - ``undef_macros``: A list of macros to undefine - ``extra_compile_args``: A list of extra arguments to pass to the compiler """ flag_map = {'-I': 'include_dirs', '-L': 'library_dirs', '-l': 'libraries', '-D': 'define_macros', '-U': 'undef_macros'} command = "{0} --libs --cflags {1}".format(executable, ' '.join(packages)) result = DistutilsExtensionArgs() try: pipe = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE) output = pipe.communicate()[0].strip() except subprocess.CalledProcessError as e: lines = [ ("{0} failed. This may cause the build to fail below." .format(executable)), " command: {0}".format(e.cmd), " returncode: {0}".format(e.returncode), " output: {0}".format(e.output) ] log.warn('\n'.join(lines)) result['libraries'].extend(default_libraries) else: if pipe.returncode != 0: lines = [ "pkg-config could not look up package(s) {0}.".format( ", ".join(packages)), "This may cause the build to fail below." ] log.warn('\n'.join(lines)) result['libraries'].extend(default_libraries) else: for token in output.split(): # It's not clear what encoding the output of # pkg-config will come to us in. It will probably be # some combination of pure ASCII (for the compiler # flags) and the filesystem encoding (for any argument # that includes directories or filenames), but this is # just conjecture, as the pkg-config documentation # doesn't seem to address it.
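The flag-to-keyword translation used here can be illustrated with a standalone sketch. The pkg-config tokens below are hypothetical and already decoded to `str`; only the mapping logic mirrors the code above:

```python
# Standalone sketch of the flag_map translation used in pkg_config().
# The tokens are made-up pkg-config output, already decoded to str.
flag_map = {'-I': 'include_dirs', '-L': 'library_dirs', '-l': 'libraries',
            '-D': 'define_macros', '-U': 'undef_macros'}

result = {}
for token in ['-I/usr/include/foo', '-L/usr/lib', '-lfoo', '-DFOO_API=2']:
    arg, value = token[:2], token[2:]
    if arg == '-D':
        # '-DNAME=VALUE' becomes a ('NAME', 'VALUE') macro tuple
        value = tuple(value.split('=', 1))
    result.setdefault(flag_map[arg], []).append(value)
```

After the loop, `result` holds `Extension`-style keyword arguments: the include directory under `include_dirs`, the library directory under `library_dirs`, `'foo'` under `libraries`, and the macro tuple under `define_macros`.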
arg = token[:2].decode('ascii') value = token[2:].decode(sys.getfilesystemencoding()) if arg in flag_map: if arg == '-D': value = tuple(value.split('=', 1)) result[flag_map[arg]].append(value) else: result['extra_compile_args'].append(value) return result def add_external_library(library): """ Add a build option for selecting the internal or system copy of a library. Parameters ---------- library : str The name of the library. If the library is `foo`, the build option will be called `--use-system-foo`. """ for command in ['build', 'build_ext', 'install']: add_command_option(command, str('use-system-' + library), 'Use the system {0} library'.format(library), is_bool=True) def use_system_library(library): """ Returns `True` if the build configuration indicates that the given library should use the system copy of the library rather than the internal one. For the given library `foo`, this will be `True` if `--use-system-foo` or `--use-system-libraries` was provided at the commandline or in `setup.cfg`. Parameters ---------- library : str The name of the library Returns ------- use_system : bool `True` if the build should use the system copy of the library. """ return ( get_distutils_build_or_install_option('use_system_{0}'.format(library)) or get_distutils_build_or_install_option('use_system_libraries')) @extends_doc(_find_packages) def find_packages(where='.', exclude=(), invalidate_cache=False): """ This version of ``find_packages`` caches previous results to speed up subsequent calls. Use ``invalidate_cache=True`` to ignore cached results from previous ``find_packages`` calls, and repeat the package search. """ if exclude: warnings.warn( "Use of the exclude parameter is no longer supported since it does " "not work as expected. Use add_exclude_packages instead.
Note that " "it must be called prior to any other calls from setup helpers.", AstropyDeprecationWarning) # Calling add_exclude_packages after this point will have no effect _module_state['excludes_too_late'] = True if not invalidate_cache and _module_state['package_cache'] is not None: return _module_state['package_cache'] packages = _find_packages( where=where, exclude=list(_module_state['exclude_packages'])) _module_state['package_cache'] = packages return packages class FakeBuildSphinx(Command): """ A dummy build_sphinx command that is called if Sphinx is not installed and displays a relevant error message """ # user options inherited from sphinx.setup_command.BuildDoc user_options = [ ('fresh-env', 'E', ''), ('all-files', 'a', ''), ('source-dir=', 's', ''), ('build-dir=', None, ''), ('config-dir=', 'c', ''), ('builder=', 'b', ''), ('project=', None, ''), ('version=', None, ''), ('release=', None, ''), ('today=', None, ''), ('link-index', 'i', '')] # user options appended in astropy.setup_helpers.AstropyBuildSphinx user_options.append(('warnings-returncode', 'w', '')) user_options.append(('clean-docs', 'l', '')) user_options.append(('no-intersphinx', 'n', '')) user_options.append(('open-docs-in-browser', 'o', '')) def initialize_options(self): try: raise RuntimeError("Sphinx and its dependencies must be installed " "for build_docs.") except: log.error('error: Sphinx and its dependencies must be installed ' 'for build_docs.') sys.exit(1) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5516844 astropy-healpix-0.5/astropy_helpers/astropy_helpers/sphinx/0000755000077000000240000000000000000000000024405 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574694604.619747 astropy-healpix-0.5/astropy_helpers/astropy_helpers/sphinx/__init__.py0000644000077000000240000000000000000000000026504 
0ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6200445 astropy-healpix-0.5/astropy_helpers/astropy_helpers/sphinx/conf.py0000644000077000000240000000023400000000000025703 0ustar00tomstaff00000000000000import warnings from sphinx_astropy.conf import * warnings.warn("Note that astropy_helpers.sphinx.conf is deprecated - use sphinx_astropy.conf instead") ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6246252 astropy-healpix-0.5/astropy_helpers/astropy_helpers/utils.py0000644000077000000240000002056700000000000024620 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst import contextlib import imp import os import sys import glob from importlib import machinery as import_machinery # Note: The following Warning subclasses are simply copies of the Warnings in # Astropy of the same names. class AstropyWarning(Warning): """ The base warning class from which all Astropy warnings should inherit. Any warning inheriting from this class is handled by the Astropy logger. """ class AstropyDeprecationWarning(AstropyWarning): """ A warning class to indicate a deprecated feature. """ class AstropyPendingDeprecationWarning(PendingDeprecationWarning, AstropyWarning): """ A warning class to indicate a soon-to-be deprecated feature. """ def _get_platlib_dir(cmd): """ Given a build command, return the name of the appropriate platform-specific build subdirectory directory (e.g. build/lib.linux-x86_64-2.7) """ plat_specifier = '.{0}-{1}'.format(cmd.plat_name, sys.version[0:3]) return os.path.join(cmd.build_base, 'lib' + plat_specifier) def get_numpy_include_path(): """ Gets the path to the numpy headers. 
""" # We need to go through this nonsense in case setuptools # downloaded and installed Numpy for us as part of the build or # install, since Numpy may still think it's in "setup mode", when # in fact we're ready to use it to build astropy now. import builtins if hasattr(builtins, '__NUMPY_SETUP__'): del builtins.__NUMPY_SETUP__ import imp import numpy imp.reload(numpy) try: numpy_include = numpy.get_include() except AttributeError: numpy_include = numpy.get_numpy_include() return numpy_include class _DummyFile(object): """A noop writeable object.""" errors = '' def write(self, s): pass def flush(self): pass @contextlib.contextmanager def silence(): """A context manager that silences sys.stdout and sys.stderr.""" old_stdout = sys.stdout old_stderr = sys.stderr sys.stdout = _DummyFile() sys.stderr = _DummyFile() exception_occurred = False try: yield except: exception_occurred = True # Go ahead and clean up so that exception handling can work normally sys.stdout = old_stdout sys.stderr = old_stderr raise if not exception_occurred: sys.stdout = old_stdout sys.stderr = old_stderr if sys.platform == 'win32': import ctypes def _has_hidden_attribute(filepath): """ Returns True if the given filepath has the hidden attribute on MS-Windows. Based on a post here: http://stackoverflow.com/questions/284115/cross-platform-hidden-file-detection """ if isinstance(filepath, bytes): filepath = filepath.decode(sys.getfilesystemencoding()) try: attrs = ctypes.windll.kernel32.GetFileAttributesW(filepath) assert attrs != -1 result = bool(attrs & 2) except (AttributeError, AssertionError): result = False return result else: def _has_hidden_attribute(filepath): return False def is_path_hidden(filepath): """ Determines if a given file or directory is hidden. 
Parameters ---------- filepath : str The path to a file or directory Returns ------- hidden : bool Returns `True` if the file is hidden """ name = os.path.basename(os.path.abspath(filepath)) if isinstance(name, bytes): is_dotted = name.startswith(b'.') else: is_dotted = name.startswith('.') return is_dotted or _has_hidden_attribute(filepath) def walk_skip_hidden(top, onerror=None, followlinks=False): """ A wrapper for `os.walk` that skips hidden files and directories. This function does not have the parameter `topdown` from `os.walk`: the directories must always be recursed top-down when using this function. See also -------- os.walk : For a description of the parameters """ for root, dirs, files in os.walk( top, topdown=True, onerror=onerror, followlinks=followlinks): # These lists must be updated in-place so os.walk will skip # hidden directories dirs[:] = [d for d in dirs if not is_path_hidden(d)] files[:] = [f for f in files if not is_path_hidden(f)] yield root, dirs, files def write_if_different(filename, data): """Write `data` to `filename`, if the content of the file is different. Parameters ---------- filename : str The file name to be written to. data : bytes The data to be written to `filename`. """ assert isinstance(data, bytes) if os.path.exists(filename): with open(filename, 'rb') as fd: original_data = fd.read() else: original_data = None if original_data != data: with open(filename, 'wb') as fd: fd.write(data) def import_file(filename, name=None): """ Imports a module from a single file as if it doesn't belong to a particular package. The returned module will have the optional ``name`` if given, or else a name generated from the filename. """ # Specifying a traditional dot-separated fully qualified name here # results in a number of "Parent module 'astropy' not found while # handling absolute import" warnings. Using the same name, the # namespaces of the modules get merged together. 
So, this # generates an underscore-separated name which is more likely to # be unique, and it doesn't really matter because the name isn't # used directly here anyway. mode = 'r' if name is None: basename = os.path.splitext(filename)[0] name = '_'.join(os.path.relpath(basename).split(os.sep)[1:]) if not os.path.exists(filename): raise ImportError('Could not import file {0}'.format(filename)) if import_machinery: loader = import_machinery.SourceFileLoader(name, filename) mod = loader.load_module() else: with open(filename, mode) as fd: mod = imp.load_module(name, fd, filename, ('.py', mode, 1)) return mod def resolve_name(name): """Resolve a name like ``module.object`` to an object and return it. Raise `ImportError` if the module or name is not found. """ parts = name.split('.') cursor = len(parts) - 1 module_name = parts[:cursor] attr_name = parts[-1] while cursor > 0: try: ret = __import__('.'.join(module_name), fromlist=[attr_name]) break except ImportError: if cursor == 0: raise cursor -= 1 module_name = parts[:cursor] attr_name = parts[cursor] ret = '' for part in parts[cursor:]: try: ret = getattr(ret, part) except AttributeError: raise ImportError(name) return ret def extends_doc(extended_func): """ A function decorator for use when wrapping an existing function but adding additional functionality. This copies the docstring from the original function, and appends to it (along with a newline) the docstring of the wrapper function. Examples -------- >>> def foo(): ... '''Hello.''' ... >>> @extends_doc(foo) ... def bar(): ... '''Goodbye.''' ... >>> print(bar.__doc__) Hello. Goodbye. """ def decorator(func): if not (extended_func.__doc__ is None or func.__doc__ is None): func.__doc__ = '\n\n'.join([extended_func.__doc__.rstrip('\n'), func.__doc__.lstrip('\n')]) return func return decorator def find_data_files(package, pattern): """ Include files matching ``pattern`` inside ``package``. 
Parameters ---------- package : str The package inside which to look for data files pattern : str Pattern (glob-style) to match for the data files (e.g. ``*.dat``). This supports the ``**`` recursive syntax. For example, ``**/*.fits`` matches all files ending with ``.fits`` recursively. Only one instance of ``**`` can be included in the pattern. """ return glob.glob(os.path.join(package, pattern), recursive=True) ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574696077.659762 astropy-healpix-0.5/astropy_helpers/astropy_helpers/version.py0000644000077000000240000000057000000000000025135 0ustar00tomstaff00000000000000# Autogenerated by Astropy-affiliated package astropy_helpers's setup.py on 2019-11-25 15:34:37 UTC import datetime version = "3.2.2" githash = "ce42e6e238c200a4715785ef8c9d233f612d0c75" major = 3 minor = 2 bugfix = 2 version_info = (major, minor, bugfix) release = True timestamp = datetime.datetime(2019, 11, 25, 15, 34, 37) debug = False astropy_helpers_version = "" ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574694604.6253676 astropy-healpix-0.5/astropy_helpers/astropy_helpers/version_helpers.py0000644000077000000240000003062600000000000026664 0ustar00tomstaff00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utilities for generating the version string for Astropy (or an affiliated package) and the version.py module, which contains version info for the package. Within the generated astropy.version module, the `major`, `minor`, and `bugfix` variables hold the respective parts of the version number (bugfix is '0' if absent). The `release` variable is True if this is a release, and False if this is a development version of astropy.
For the actual version string, use:: from astropy.version import version or:: from astropy import __version__ """ import datetime import os import pkgutil import sys import time import warnings from distutils import log from configparser import ConfigParser import pkg_resources from . import git_helpers from .distutils_helpers import is_distutils_display_option from .git_helpers import get_git_devstr from .utils import AstropyDeprecationWarning, import_file __all__ = ['generate_version_py'] def _version_split(version): """ Split a version string into major, minor, and bugfix numbers. If any of those numbers are missing the default is zero. Any pre/post release modifiers are ignored. Examples ======== >>> _version_split('1.2.3') (1, 2, 3) >>> _version_split('1.2') (1, 2, 0) >>> _version_split('1.2rc1') (1, 2, 0) >>> _version_split('1') (1, 0, 0) >>> _version_split('') (0, 0, 0) """ parsed_version = pkg_resources.parse_version(version) if hasattr(parsed_version, 'base_version'): # New version parsing for setuptools >= 8.0 if parsed_version.base_version: parts = [int(part) for part in parsed_version.base_version.split('.')] else: parts = [] else: parts = [] for part in parsed_version: if part.startswith('*'): # Ignore any .dev, a, b, rc, etc. break parts.append(int(part)) if len(parts) < 3: parts += [0] * (3 - len(parts)) # In principle a version could have more parts (like 1.2.3.4) but we only # support <major>.<minor>.<bugfix> return tuple(parts[:3]) # This is used by setup.py to create a new version.py - see that file for # details. Note that the imports have to be absolute, since this is also used # by affiliated packages.
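The frozen version.py template below interpolates the same timestamp twice, once with the `!s` conversion (for the human-readable header comment) and once with `!r` (so the written file contains a valid Python expression). A small standalone reminder of how the two conversions differ for a `datetime`:

```python
import datetime

# str.format conversion flags: {0!s} applies str(), {0!r} applies repr().
ts = datetime.datetime(2019, 11, 25, 15, 34, 37)
as_str = '{0!s}'.format(ts)   # human-readable, for a header comment
as_repr = '{0!r}'.format(ts)  # valid Python source, for a variable assignment
# as_str  == '2019-11-25 15:34:37'
# as_repr == 'datetime.datetime(2019, 11, 25, 15, 34, 37)'
```

Because `repr()` of a `datetime` round-trips, the generated version.py can assign `timestamp = {timestamp!r}` and get back an equal object when the file is imported.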
_FROZEN_VERSION_PY_TEMPLATE = """ # Autogenerated by {packagetitle}'s setup.py on {timestamp!s} UTC import datetime {header} major = {major} minor = {minor} bugfix = {bugfix} version_info = (major, minor, bugfix) release = {rel} timestamp = {timestamp!r} debug = {debug} astropy_helpers_version = "{ahver}" """[1:] _FROZEN_VERSION_PY_WITH_GIT_HEADER = """ {git_helpers} _packagename = "{packagename}" _last_generated_version = "{verstr}" _last_githash = "{githash}" # Determine where the source code for this module # lives. If __file__ is not a filesystem path then # it is assumed not to live in a git repo at all. if _get_repo_path(__file__, levels=len(_packagename.split('.'))): version = update_git_devstr(_last_generated_version, path=__file__) githash = get_git_devstr(sha=True, show_warning=False, path=__file__) or _last_githash else: # The file does not appear to live in a git repo so don't bother # invoking git version = _last_generated_version githash = _last_githash """[1:] _FROZEN_VERSION_PY_STATIC_HEADER = """ version = "{verstr}" githash = "{githash}" """[1:] def _get_version_py_str(packagename, version, githash, release, debug, uses_git=True): try: from astropy_helpers import __version__ as ahver except ImportError: ahver = "unknown" epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time())) timestamp = datetime.datetime.utcfromtimestamp(epoch) major, minor, bugfix = _version_split(version) if packagename.lower() == 'astropy': packagetitle = 'Astropy' else: packagetitle = 'Astropy-affiliated package ' + packagename header = '' if uses_git: header = _generate_git_header(packagename, version, githash) elif not githash: # _generate_git_header will already generate a new git hash for us, but # for creating a new version.py for a release (even if uses_git=False) # we still need to get the githash to include in the version.py # See https://github.com/astropy/astropy-helpers/issues/141 githash = git_helpers.get_git_devstr(sha=True, show_warning=True) if not header:
# If _generate_git_header fails it returns an empty string header = _FROZEN_VERSION_PY_STATIC_HEADER.format(verstr=version, githash=githash) return _FROZEN_VERSION_PY_TEMPLATE.format(packagetitle=packagetitle, timestamp=timestamp, header=header, major=major, minor=minor, bugfix=bugfix, ahver=ahver, rel=release, debug=debug) def _generate_git_header(packagename, version, githash): """ Generates a header to the version.py module that includes utilities for probing the git repository for updates (to the current git hash, etc.) These utilities should only be available in development versions, and not in release builds. If this fails for any reason an empty string is returned. """ loader = pkgutil.get_loader(git_helpers) source = loader.get_source(git_helpers.__name__) or '' source_lines = source.splitlines() if not source_lines: log.warn('Cannot get source code for astropy_helpers.git_helpers; ' 'git support disabled.') return '' idx = 0 for idx, line in enumerate(source_lines): if line.startswith('# BEGIN'): break git_helpers_py = '\n'.join(source_lines[idx + 1:]) verstr = version new_githash = git_helpers.get_git_devstr(sha=True, show_warning=False) if new_githash: githash = new_githash return _FROZEN_VERSION_PY_WITH_GIT_HEADER.format( git_helpers=git_helpers_py, packagename=packagename, verstr=verstr, githash=githash) def generate_version_py(packagename=None, version=None, release=None, debug=None, uses_git=None, srcdir='.'): """ Generate a version.py file in the package with version information, and update developer version strings. This function should normally be called without any arguments. In this case the package name and version is read in from the ``setup.cfg`` file (from the ``name`` or ``package_name`` entry and the ``version`` entry in the ``[metadata]`` section). If the version is a developer version (of the form ``3.2.dev``), the version string will automatically be expanded to include a sequential number as a suffix (e.g. 
``3.2.dev13312``), and the updated version string will be returned by this function. Based on this updated version string, a ``version.py`` file will be generated inside the package, containing the version string as well as more detailed information (for example the major, minor, and bugfix version numbers, a ``release`` flag indicating whether the current version is a stable or developer version, and so on). """ if packagename is not None: warnings.warn('The packagename argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the package name in setup.cfg instead', AstropyDeprecationWarning) if version is not None: warnings.warn('The version argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the version number in setup.cfg instead', AstropyDeprecationWarning) if release is not None: warnings.warn('The release argument to generate_version_py has ' 'been deprecated and will be removed in future. We now ' 'use the presence of the "dev" string in the version to ' 'determine whether this is a release', AstropyDeprecationWarning) # We use ConfigParser instead of read_configuration here because the latter # only reads in keys recognized by setuptools, but we need to access # package_name below.
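The setup.cfg parsing that follows can be sketched standalone. The file contents here are hypothetical; the `ConfigParser` calls and the dev-string release check mirror the logic in `generate_version_py`:

```python
from configparser import ConfigParser

# Standalone sketch: read name/version from a [metadata] section the same
# way generate_version_py does.  The cfg contents below are hypothetical.
conf = ConfigParser()
conf.read_string("""
[metadata]
name = example-package
version = 3.2.dev
""")

name = conf.get('metadata', 'name')
version = conf.get('metadata', 'version')
# Presence of 'dev' in the version marks a development (non-release) build.
release = 'dev' not in version
```

With these values, `release` is `False`, so a git-derived suffix would be appended to the version string.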
conf = ConfigParser() conf.read('setup.cfg') if conf.has_option('metadata', 'name'): packagename = conf.get('metadata', 'name') elif conf.has_option('metadata', 'package_name'): # The package-template used package_name instead of name for a while warnings.warn('Specifying the package name using the "package_name" ' 'option in setup.cfg is deprecated - use the "name" ' 'option instead.', AstropyDeprecationWarning) packagename = conf.get('metadata', 'package_name') elif packagename is not None: # deprecated pass else: sys.stderr.write('ERROR: Could not read package name from setup.cfg\n') sys.exit(1) if conf.has_option('metadata', 'version'): version = conf.get('metadata', 'version') add_git_devstr = True elif version is not None: # deprecated add_git_devstr = False else: sys.stderr.write('ERROR: Could not read package version from setup.cfg\n') sys.exit(1) if release is None: release = 'dev' not in version if not release and add_git_devstr: version += get_git_devstr(False) if uses_git is None: uses_git = not release # In some cases, packages have a - but this is a _ in the module. 
Since we # are only interested in the module here, we replace - by _ packagename = packagename.replace('-', '_') try: version_module = get_pkg_version_module(packagename) try: last_generated_version = version_module._last_generated_version except AttributeError: last_generated_version = version_module.version try: last_githash = version_module._last_githash except AttributeError: last_githash = version_module.githash current_release = version_module.release current_debug = version_module.debug except ImportError: version_module = None last_generated_version = None last_githash = None current_release = None current_debug = None if release is None: # Keep whatever the current value is, if it exists release = bool(current_release) if debug is None: # Likewise, keep whatever the current value is, if it exists debug = bool(current_debug) package_srcdir = os.path.join(srcdir, *packagename.split('.')) version_py = os.path.join(package_srcdir, 'version.py') if (last_generated_version != version or current_release != release or current_debug != debug): if '-q' not in sys.argv and '--quiet' not in sys.argv: log.set_threshold(log.INFO) if is_distutils_display_option(): # Always silence unnecessary log messages when display options are # being used log.set_threshold(log.WARN) log.info('Freezing version number to {0}'.format(version_py)) with open(version_py, 'w') as f: # This overwrites the actual version.py f.write(_get_version_py_str(packagename, version, last_githash, release, debug, uses_git=uses_git)) return version def get_pkg_version_module(packagename, fromlist=None): """Returns the package's .version module generated by `astropy_helpers.version_helpers.generate_version_py`. Raises an ImportError if the version module is not found. If ``fromlist`` is an iterable, return a tuple of the members of the version module corresponding to the member names given in ``fromlist``. Raises an `AttributeError` if any of these module members are not found. 
""" version = import_file(os.path.join(packagename, 'version.py'), name='version') if fromlist: return tuple(getattr(version, member) for member in fromlist) else: return version ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5448494 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/0000755000077000000240000000000000000000000024566 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696077.7393768 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/PKG-INFO0000644000077000000240000000503100000000000025662 0ustar00tomstaff00000000000000Metadata-Version: 2.1 Name: astropy-helpers Version: 3.2.2 Summary: Utilities for building and installing packages in the Astropy ecosystem Home-page: https://github.com/astropy/astropy-helpers Author: The Astropy Developers Author-email: astropy.team@gmail.com License: BSD 3-Clause License Description: astropy-helpers =============== .. image:: https://travis-ci.org/astropy/astropy-helpers.svg :target: https://travis-ci.org/astropy/astropy-helpers .. image:: https://ci.appveyor.com/api/projects/status/rt9161t9mhx02xp7/branch/master?svg=true :target: https://ci.appveyor.com/project/Astropy/astropy-helpers .. image:: https://codecov.io/gh/astropy/astropy-helpers/branch/master/graph/badge.svg :target: https://codecov.io/gh/astropy/astropy-helpers The **astropy-helpers** package includes many build, installation, and documentation-related tools used by the Astropy project, but packaged separately for use by other projects that wish to leverage this work. The motivation behind this package and details of its implementation are in the accepted `Astropy Proposal for Enhancement (APE) 4 `_. Astropy-helpers is not a traditional package in the sense that it is not intended to be installed directly by users or developers. 
Instead, it is meant to be accessed when the ``setup.py`` command is run - see the "Using astropy-helpers in a package" section in the documentation for how to do this. For a real-life example of how to implement astropy-helpers in a project, see the ``setup.py`` and ``setup.cfg`` files of the `Affiliated package template `_. For more information, see the documentation at http://astropy-helpers.readthedocs.io Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: Framework :: Setuptools Plugin Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Build Tools Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Topic :: System :: Archiving :: Packaging Provides: astropy_helpers Requires-Python: >=3.5 Provides-Extra: docs ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696077.8182936 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/SOURCES.txt0000644000077000000240000000162300000000000026454 0ustar00tomstaff00000000000000CHANGES.rst LICENSE.rst MANIFEST.in README.rst ah_bootstrap.py setup.cfg setup.py astropy_helpers/__init__.py astropy_helpers/conftest.py astropy_helpers/distutils_helpers.py astropy_helpers/git_helpers.py astropy_helpers/openmp_helpers.py astropy_helpers/setup_helpers.py astropy_helpers/utils.py astropy_helpers/version.py astropy_helpers/version_helpers.py astropy_helpers.egg-info/PKG-INFO astropy_helpers.egg-info/SOURCES.txt astropy_helpers.egg-info/dependency_links.txt astropy_helpers.egg-info/not-zip-safe astropy_helpers.egg-info/requires.txt astropy_helpers.egg-info/top_level.txt astropy_helpers/commands/__init__.py astropy_helpers/commands/_dummy.py astropy_helpers/commands/build_ext.py 
astropy_helpers/commands/build_sphinx.py astropy_helpers/commands/test.py astropy_helpers/commands/src/compiler.c astropy_helpers/sphinx/__init__.py astropy_helpers/sphinx/conf.py licenses/LICENSE_ASTROSCRAPPY.rst././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696077.7403238 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/dependency_links.txt0000644000077000000240000000000100000000000030634 0ustar00tomstaff00000000000000 ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696077.7397695 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/not-zip-safe0000644000077000000240000000000100000000000027014 0ustar00tomstaff00000000000000 ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696077.7409554 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/requires.txt0000644000077000000240000000002700000000000027165 0ustar00tomstaff00000000000000 [docs] sphinx-astropy ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574696077.741411 astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/top_level.txt0000644000077000000240000000002000000000000027310 0ustar00tomstaff00000000000000astropy_helpers ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5523944 astropy-healpix-0.5/astropy_helpers/licenses/0000755000077000000240000000000000000000000021456 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1526666202.894688 astropy-healpix-0.5/astropy_helpers/licenses/LICENSE_ASTROSCRAPPY.rst0000644000077000000240000000315400000000000025307 0ustar00tomstaff00000000000000# The OpenMP helpers include code heavily adapted from astroscrappy, released # under the following license: # # Copyright (c) 2015, Curtis McCully # All rights 
reserved. # # Redistribution and use in source and binary forms, with or without modification, # are permitted provided that the following conditions are met: # # * Redistributions of source code must retain the above copyright notice, this # list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, this # list of conditions and the following disclaimer in the documentation and/or # other materials provided with the distribution. # * Neither the name of the Astropy Team nor the names of its contributors may be # used to endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND # ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR # ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES # (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON # ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS # SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.5540156 astropy-healpix-0.5/cextern/0000755000077000000240000000000000000000000016076 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8794773 astropy-healpix-0.5/cextern/.gitignore0000644000077000000240000000006300000000000020065 0ustar00tomstaff00000000000000astrometry.net/test_healpix astrometry.net/example ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8804672 astropy-healpix-0.5/cextern/README.md0000644000077000000240000000110500000000000017352 0ustar00tomstaff00000000000000# astropy-healpix/cextern/ The `astropy-healpix` Python package is a wrapper around a C library. See http://astropy-healpix.readthedocs.io/en/latest/about.html This README gives some technical details on the C code here. - The main files are `healpix.h` and `healpix.c`; start reading there first. - For the Python `astropy-healpix` package, the C code is built via `setup.py`. - However, to help work on the C code and test it directly, a `Makefile` is included here.
- For testing, a copy of `CuTest.h` and `CuTest.c` is bundled from https://github.com/asimjalis/cutest ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.6209083 astropy-healpix-0.5/cextern/astrometry.net/0000755000077000000240000000000000000000000021074 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8810105 astropy-healpix-0.5/cextern/astrometry.net/CuTest.c0000644000077000000240000001773600000000000022455 0ustar00tomstaff00000000000000#include <assert.h> #include <setjmp.h> #include <stdlib.h> #include <stdio.h> #include <string.h> #include <math.h> #include "CuTest.h" /*-------------------------------------------------------------------------* * CuStr *-------------------------------------------------------------------------*/ char* CuStrAlloc(int size) { char* newStr = (char*) malloc( sizeof(char) * (size) ); return newStr; } char* CuStrCopy(const char* old) { int len = strlen(old); char* newStr = CuStrAlloc(len + 1); strcpy(newStr, old); return newStr; } /*-------------------------------------------------------------------------* * CuString *-------------------------------------------------------------------------*/ void CuStringInit(CuString* str) { str->length = 0; str->size = STRING_MAX; str->buffer = (char*) malloc(sizeof(char) * str->size); str->buffer[0] = '\0'; } CuString* CuStringNew(void) { CuString* str = (CuString*) malloc(sizeof(CuString)); str->length = 0; str->size = STRING_MAX; str->buffer = (char*) malloc(sizeof(char) * str->size); str->buffer[0] = '\0'; return str; } void CuStringDelete(CuString *str) { if (!str) return; free(str->buffer); free(str); } void CuStringResize(CuString* str, int newSize) { str->buffer = (char*) realloc(str->buffer, sizeof(char) * newSize); str->size = newSize; } void CuStringAppend(CuString* str, const char* text) { int length; if (text == NULL) { text = "NULL"; } length = strlen(text); if (str->length + length + 1 >= str->size)
CuStringResize(str, str->length + length + 1 + STRING_INC); str->length += length; strcat(str->buffer, text); } void CuStringAppendChar(CuString* str, char ch) { char text[2]; text[0] = ch; text[1] = '\0'; CuStringAppend(str, text); } void CuStringAppendFormat(CuString* str, const char* format, ...) { va_list argp; char buf[HUGE_STRING_LEN]; va_start(argp, format); vsprintf(buf, format, argp); va_end(argp); CuStringAppend(str, buf); } void CuStringInsert(CuString* str, const char* text, int pos) { int length = strlen(text); if (pos > str->length) pos = str->length; if (str->length + length + 1 >= str->size) CuStringResize(str, str->length + length + 1 + STRING_INC); memmove(str->buffer + pos + length, str->buffer + pos, (str->length - pos) + 1); str->length += length; memcpy(str->buffer + pos, text, length); } /*-------------------------------------------------------------------------* * CuTest *-------------------------------------------------------------------------*/ void CuTestInit(CuTest* t, const char* name, TestFunction function) { t->name = CuStrCopy(name); t->failed = 0; t->ran = 0; t->message = NULL; t->function = function; t->jumpBuf = NULL; } CuTest* CuTestNew(const char* name, TestFunction function) { CuTest* tc = CU_ALLOC(CuTest); CuTestInit(tc, name, function); return tc; } void CuTestDelete(CuTest *t) { if (!t) return; free(t->name); free(t); } void CuTestRun(CuTest* tc) { jmp_buf buf; tc->jumpBuf = &buf; if (setjmp(buf) == 0) { tc->ran = 1; (tc->function)(tc); } tc->jumpBuf = 0; } static void CuFailInternal(CuTest* tc, const char* file, int line, CuString* string) { char buf[HUGE_STRING_LEN]; sprintf(buf, "%s:%d: ", file, line); CuStringInsert(string, buf, 0); tc->failed = 1; tc->message = string->buffer; if (tc->jumpBuf != 0) longjmp(*(tc->jumpBuf), 0); } void CuFail_Line(CuTest* tc, const char* file, int line, const char* message2, const char* message) { CuString string; CuStringInit(&string); if (message2 != NULL) { CuStringAppend(&string, 
message2); CuStringAppend(&string, ": "); } CuStringAppend(&string, message); CuFailInternal(tc, file, line, &string); } void CuAssert_Line(CuTest* tc, const char* file, int line, const char* message, int condition) { if (condition) return; CuFail_Line(tc, file, line, NULL, message); } void CuAssertStrEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, const char* expected, const char* actual) { CuString string; if ((expected == NULL && actual == NULL) || (expected != NULL && actual != NULL && strcmp(expected, actual) == 0)) { return; } CuStringInit(&string); if (message != NULL) { CuStringAppend(&string, message); CuStringAppend(&string, ": "); } CuStringAppend(&string, "expected <"); CuStringAppend(&string, expected); CuStringAppend(&string, "> but was <"); CuStringAppend(&string, actual); CuStringAppend(&string, ">"); CuFailInternal(tc, file, line, &string); } void CuAssertIntEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, int expected, int actual) { char buf[STRING_MAX]; if (expected == actual) return; sprintf(buf, "expected <%d> but was <%d>", expected, actual); CuFail_Line(tc, file, line, message, buf); } void CuAssertDblEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, double expected, double actual, double delta) { char buf[STRING_MAX]; if (fabs(expected - actual) <= delta) return; /* sprintf(buf, "expected <%lf> but was <%lf>", expected, actual); */ sprintf(buf, "expected <%f> but was <%f>", expected, actual); CuFail_Line(tc, file, line, message, buf); } void CuAssertPtrEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, void* expected, void* actual) { char buf[STRING_MAX]; if (expected == actual) return; sprintf(buf, "expected pointer <0x%p> but was <0x%p>", expected, actual); CuFail_Line(tc, file, line, message, buf); } /*-------------------------------------------------------------------------* * CuSuite 
*-------------------------------------------------------------------------*/ void CuSuiteInit(CuSuite* testSuite) { testSuite->count = 0; testSuite->failCount = 0; memset(testSuite->list, 0, sizeof(testSuite->list)); } CuSuite* CuSuiteNew(void) { CuSuite* testSuite = CU_ALLOC(CuSuite); CuSuiteInit(testSuite); return testSuite; } void CuSuiteDelete(CuSuite *testSuite) { unsigned int n; for (n=0; n < MAX_TEST_CASES; n++) { if (testSuite->list[n]) { CuTestDelete(testSuite->list[n]); } } free(testSuite); } void CuSuiteAdd(CuSuite* testSuite, CuTest *testCase) { assert(testSuite->count < MAX_TEST_CASES); testSuite->list[testSuite->count] = testCase; testSuite->count++; } void CuSuiteAddSuite(CuSuite* testSuite, CuSuite* testSuite2) { int i; for (i = 0 ; i < testSuite2->count ; ++i) { CuTest* testCase = testSuite2->list[i]; CuSuiteAdd(testSuite, testCase); } } void CuSuiteRun(CuSuite* testSuite) { int i; for (i = 0 ; i < testSuite->count ; ++i) { CuTest* testCase = testSuite->list[i]; CuTestRun(testCase); if (testCase->failed) { testSuite->failCount += 1; } } } void CuSuiteSummary(CuSuite* testSuite, CuString* summary) { int i; for (i = 0 ; i < testSuite->count ; ++i) { CuTest* testCase = testSuite->list[i]; CuStringAppend(summary, testCase->failed ? "F" : "."); } CuStringAppend(summary, "\n\n"); } void CuSuiteDetails(CuSuite* testSuite, CuString* details) { int i; int failCount = 0; if (testSuite->failCount == 0) { int passCount = testSuite->count - testSuite->failCount; const char* testWord = passCount == 1 ? 
"test" : "tests"; CuStringAppendFormat(details, "OK (%d %s)\n", passCount, testWord); } else { if (testSuite->failCount == 1) CuStringAppend(details, "There was 1 failure:\n"); else CuStringAppendFormat(details, "There were %d failures:\n", testSuite->failCount); for (i = 0 ; i < testSuite->count ; ++i) { CuTest* testCase = testSuite->list[i]; if (testCase->failed) { failCount++; CuStringAppendFormat(details, "%d) %s: %s\n", failCount, testCase->name, testCase->message); } } CuStringAppend(details, "\n!!!FAILURES!!!\n"); CuStringAppendFormat(details, "Runs: %d ", testSuite->count); CuStringAppendFormat(details, "Passes: %d ", testSuite->count - testSuite->failCount); CuStringAppendFormat(details, "Fails: %d\n", testSuite->failCount); } } ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1537003272.881539 astropy-healpix-0.5/cextern/astrometry.net/CuTest.h0000644000077000000240000000773300000000000022462 0ustar00tomstaff00000000000000#ifndef CU_TEST_H #define CU_TEST_H #include <setjmp.h> #include <stdarg.h> /* CuString */ char* CuStrAlloc(int size); char* CuStrCopy(const char* old); #define CU_ALLOC(TYPE) ((TYPE*) malloc(sizeof(TYPE))) #define HUGE_STRING_LEN 8192 #define STRING_MAX 256 #define STRING_INC 256 typedef struct { int length; int size; char* buffer; } CuString; void CuStringInit(CuString* str); CuString* CuStringNew(void); void CuStringRead(CuString* str, const char* path); void CuStringAppend(CuString* str, const char* text); void CuStringAppendChar(CuString* str, char ch); void CuStringAppendFormat(CuString* str, const char* format, ...); void CuStringInsert(CuString* str, const char* text, int pos); void CuStringResize(CuString* str, int newSize); void CuStringDelete(CuString* str); /* CuTest */ typedef struct CuTest CuTest; typedef void (*TestFunction)(CuTest *); struct CuTest { char* name; TestFunction function; int failed; int ran; const char* message; jmp_buf *jumpBuf; }; void CuTestInit(CuTest* t, const char* name,
TestFunction function); CuTest* CuTestNew(const char* name, TestFunction function); void CuTestRun(CuTest* tc); void CuTestDelete(CuTest *t); /* Internal versions of assert functions -- use the public versions */ void CuFail_Line(CuTest* tc, const char* file, int line, const char* message2, const char* message); void CuAssert_Line(CuTest* tc, const char* file, int line, const char* message, int condition); void CuAssertStrEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, const char* expected, const char* actual); void CuAssertIntEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, int expected, int actual); void CuAssertDblEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, double expected, double actual, double delta); void CuAssertPtrEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message, void* expected, void* actual); /* public assert functions */ #define CuFail(tc, ms) CuFail_Line( (tc), __FILE__, __LINE__, NULL, (ms)) #define CuAssert(tc, ms, cond) CuAssert_Line((tc), __FILE__, __LINE__, (ms), (cond)) #define CuAssertTrue(tc, cond) CuAssert_Line((tc), __FILE__, __LINE__, "assert failed", (cond)) #define CuAssertStrEquals(tc,ex,ac) CuAssertStrEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac)) #define CuAssertStrEquals_Msg(tc,ms,ex,ac) CuAssertStrEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac)) #define CuAssertIntEquals(tc,ex,ac) CuAssertIntEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac)) #define CuAssertIntEquals_Msg(tc,ms,ex,ac) CuAssertIntEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac)) #define CuAssertDblEquals(tc,ex,ac,dl) CuAssertDblEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac),(dl)) #define CuAssertDblEquals_Msg(tc,ms,ex,ac,dl) CuAssertDblEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac),(dl)) #define CuAssertPtrEquals(tc,ex,ac) CuAssertPtrEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac)) #define CuAssertPtrEquals_Msg(tc,ms,ex,ac) 
CuAssertPtrEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac)) #define CuAssertPtrNotNull(tc,p) CuAssert_Line((tc),__FILE__,__LINE__,"null pointer unexpected",(p != NULL)) #define CuAssertPtrNotNullMsg(tc,msg,p) CuAssert_Line((tc),__FILE__,__LINE__,(msg),(p != NULL)) /* CuSuite */ #define MAX_TEST_CASES 1024 #define SUITE_ADD_TEST(SUITE,TEST) CuSuiteAdd(SUITE, CuTestNew(#TEST, TEST)) typedef struct { int count; CuTest* list[MAX_TEST_CASES]; int failCount; } CuSuite; void CuSuiteInit(CuSuite* testSuite); CuSuite* CuSuiteNew(void); void CuSuiteDelete(CuSuite *testSuite); void CuSuiteAdd(CuSuite* testSuite, CuTest *testCase); void CuSuiteAddSuite(CuSuite* testSuite, CuSuite* testSuite2); void CuSuiteRun(CuSuite* testSuite); void CuSuiteSummary(CuSuite* testSuite, CuString* summary); void CuSuiteDetails(CuSuite* testSuite, CuString* details); #endif /* CU_TEST_H */ ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8835669 astropy-healpix-0.5/cextern/astrometry.net/Makefile0000644000077000000240000000050300000000000022532 0ustar00tomstaff00000000000000 all: test_healpix OBJS = healpix-utils.o healpix.o starutil.o permutedsort.o mathutil.o bl.o qsort_reentrant.o HEADERS = healpix-utils.h healpix.h $(OBJS): %.o: %.c $(HEADERS) $(CC) -o $@ -c $< %.o: %.c $(CC) -o $@ -c $< test_healpix: test_healpix-main.c test_healpix.c $(OBJS) CuTest.o example: example.c $(OBJS) ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1526665132.262006 astropy-healpix-0.5/cextern/astrometry.net/an-bool.h0000644000077000000240000000070700000000000022600 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. 
# Licensed under a 3-clause BSD style license - see LICENSE */ #ifndef AN_BOOL_H #define AN_BOOL_H #ifdef _MSC_VER #if _MSC_VER >= 1600 #include #else #include #endif #else #include #endif #ifndef TRUE #define TRUE 1 #endif #ifndef FALSE #define FALSE 0 #endif // This helps unconfuse SWIG; it doesn't seem to like uint8_t typedef unsigned char anbool; #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8857179 astropy-healpix-0.5/cextern/astrometry.net/bl-nl.c0000644000077000000240000002500700000000000022250 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ /** Defined: --nl --number --NL_PRINT(x) prints number 'x' Note: --You can't declare multiple "number" variables like this: number n1, n2; Instead, do: number n1; number n2; This is because "number" may be a pointer type. */ #include "bl-nl.ph" #define NODE_NUMDATA(node) ((number*)NODE_DATA(node)) number* NLF(to_array)(nl* list) { number* arr; size_t N; if (!list) return NULL; N = NLF(size)(list); arr = malloc(N * sizeof(number)); bl_copy(list, 0, N, arr); return arr; } #define InlineDefine InlineDefineC #include "bl-nl.inc" #undef InlineDefine static int NLF(compare_ascending)(const void* v1, const void* v2) { number i1 = *(number*)v1; number i2 = *(number*)v2; if (i1 > i2) return 1; else if (i1 < i2) return -1; else return 0; } static int NLF(compare_descending)(const void* v1, const void* v2) { number i1 = *(number*)v1; number i2 = *(number*)v2; if (i1 > i2) return -1; else if (i1 < i2) return 1; else return 0; } void NLF(reverse)(nl* list) { bl_reverse(list); } void NLF(append_array)(nl* list, const number* data, size_t ndata) { size_t i; for (i=0; iblocksize); N1 = NLF(size)(list1); N2 = NLF(size)(list2); i1 = i2 = 0; getv1 = getv2 = 1; while (i1 < N1 && i2 < N2) { if (getv1) { v1 = NLF(get)(list1, i1); getv1 = 0; } if (getv2) { getv2 = 0; v2 = 
NLF(get)(list2, i2); } if (v1 <= v2) { NLF(append)(res, v1); i1++; getv1 = 1; } else { NLF(append)(res, v2); i2++; getv2 = 1; } } for (; i1N-1); bl_remove_index(nlist, nlist->N-1); return ret; } nl* NLF(dupe)(nl* nlist) { nl* ret = NLF(new)(nlist->blocksize); size_t i; for (i=0; iN; i++) NLF(push)(ret, NLF(get)(nlist, i)); return ret; } ptrdiff_t NLF(remove_value)(nl* nlist, const number value) { bl* list = nlist; bl_node *node, *prev; size_t istart = 0; for (node=list->head, prev=NULL; node; prev=node, node=node->next) { int i; number* idat; idat = NODE_DATA(node); for (i=0; iN; i++) if (idat[i] == value) { bl_remove_from_node(list, node, prev, i); list->last_access = prev; list->last_access_n = istart; return istart + i; } istart += node->N; } return BL_NOT_FOUND; } void NLF(remove_all)(nl* list) { bl_remove_all(list); } void NLF(remove_index_range)(nl* list, size_t start, size_t length) { bl_remove_index_range(list, start, length); } void NLF(set)(nl* list, size_t index, const number value) { bl_set(list, index, &value); } /* void dl_set(dl* list, int index, double value) { int i; int nadd = (index+1) - list->N; if (nadd > 0) { // enlarge the list to hold 'nadd' more entries. for (i=0; iN; ptrdiff_t mid; while (lower < (upper-1)) { mid = (upper + lower) / 2; if (n >= iarray[mid]) lower = mid; else upper = mid; } return lower; } // find the first node for which n <= the last element. static bl_node* NLF(findnodecontainingsorted)(const nl* list, const number n, size_t* p_nskipped) { bl_node *node; size_t nskipped; //bl_node *prev; //int prevnskipped; // check if we can use the jump accessor or if we have to start at // the beginning... if (list->last_access && list->last_access->N && // is the value we're looking for >= the first element? (n >= *NODE_NUMDATA(list->last_access))) { node = list->last_access; nskipped = list->last_access_n; } else { node = list->head; nskipped = 0; } /* // find the first node for which n < the first element. 
The // previous node will contain the value (if it exists). for (prev=node, prevnskipped=nskipped; node && (n < *NODE_NUMDATA(node));) { prev=node; prevnskipped=nskipped; nskipped+=node->N; node=node->next; } if (prev && n <= NODE_NUMDATA(prev)[prev->N-1]) { if (p_nskipped) *p_nskipped = prevnskipped; return prev; } if (node && n <= NODE_NUMDATA(node)[node->N-1]) { if (p_nskipped) *p_nskipped = nskipped; return node; } return NULL; */ /* if (!node && prev && n > NODE_NUMDATA(prev)[prev->N-1]) return NULL; if (p_nskipped) *p_nskipped = prevnskipped; return prev; */ for (; node && (n > NODE_NUMDATA(node)[node->N-1]); node=node->next) nskipped += node->N; if (p_nskipped) *p_nskipped = nskipped; return node; } static ptrdiff_t NLF(insertascending)(nl* list, const number n, int unique) { bl_node *node; size_t ind; size_t nskipped; node = NLF(findnodecontainingsorted)(list, n, &nskipped); if (!node) { NLF(append)(list, n); return list->N-1; } /* for (; node && (n > NODE_NUMDATA(node)[node->N-1]); node=node->next) nskipped += node->N; if (!node) { // either we're adding the first element, or we're appending since // n is bigger than the largest element in the list. NLF(append)(list, n); return list->N-1; } */ // find where in the node it should be inserted... ind = 1 + NLF(binarysearch)(node, n); // check if it's a duplicate... if (unique && ind > 0 && (n == NODE_NUMDATA(node)[ind-1])) return BL_NOT_FOUND; // set the jump accessors... list->last_access = node; list->last_access_n = nskipped; // ... so that this runs in O(1). 
bl_insert(list, nskipped + ind, &n); return nskipped + ind; } size_t NLF(insert_ascending)(nl* list, const number n) { return NLF(insertascending)(list, n, 0); } ptrdiff_t NLF(insert_unique_ascending)(nl* list, const number n) { return NLF(insertascending)(list, n, 1); } size_t NLF(insert_descending)(nl* list, const number n) { return bl_insert_sorted(list, &n, NLF(compare_descending)); } void NLF(insert)(nl* list, size_t indx, const number data) { bl_insert(list, indx, &data); } void NLF(copy)(nl* list, size_t start, size_t length, number* vdest) { bl_copy(list, start, length, vdest); } void NLF(print)(nl* list) { bl_node* n; for (n=list->head; n; n=n->next) { int i; printf("[ "); for (i=0; iN; i++) { if (i > 0) printf(", "); NL_PRINT(NODE_NUMDATA(n)[i]); } printf("] "); } } ptrdiff_t NLF(index_of)(nl* list, const number data) { bl_node* n; number* idata; size_t npast = 0; for (n=list->head; n; n=n->next) { int i; idata = NODE_NUMDATA(n); for (i=0; iN; i++) if (idata[i] == data) return npast + i; npast += n->N; } return BL_NOT_FOUND; } int NLF(contains)(nl* list, const number data) { return (NLF(index_of)(list, data) != BL_NOT_FOUND); } int NLF(sorted_contains)(nl* list, const number n) { return NLF(sorted_index_of)(list, n) != BL_NOT_FOUND; } ptrdiff_t NLF(sorted_index_of)(nl* list, const number n) { bl_node *node; ptrdiff_t lower; size_t nskipped; node = NLF(findnodecontainingsorted)(list, n, &nskipped); if (!node) return BL_NOT_FOUND; //if (!node && (n > NODE_NUMDATA(prev)[prev->N-1])) //return -1; //node = prev; /* // find the first node for which n <= the last element. That node // will contain the value (if it exists) for (; node && (n > NODE_NUMDATA(node)[node->N-1]); node=node->next) nskipped += node->N; if (!node) return -1; */ // update jump accessors... list->last_access = node; list->last_access_n = nskipped; // find within the node... 
lower = NLF(binarysearch)(node, n); if (lower == BL_NOT_FOUND) return BL_NOT_FOUND; if (n == NODE_NUMDATA(node)[lower]) return nskipped + lower; return BL_NOT_FOUND; } #undef NLF #undef NODE_NUMDATA ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2625923 astropy-healpix-0.5/cextern/astrometry.net/bl-nl.h0000644000077000000240000000543700000000000022262 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ /** Common header for lists of numerical types. Expects "nl" to be #defined to the list type. Expects "number" to be #defined to the numerical type. */ #include "bl-nl.ph" typedef bl nl; // The "const number"s in here are mostly for pl. Malloc nl* NLF(new)(int blocksize); Pure InlineDeclare size_t NLF(size)(const nl* list); void NLF(new_existing)(nl* list, int blocksize); void NLF(init)(nl* list, int blocksize); void NLF(reverse)(nl* list); void NLF(remove_all)(nl* list); void NLF(remove_all_reuse)(nl* list); void NLF(free)(nl* list); number* NLF(append)(nl* list, const number data); void NLF(append_list)(nl* list, nl* list2); void NLF(append_array)(nl* list, const number* data, size_t ndata); void NLF(merge_lists)(nl* list1, nl* list2); void NLF(push)(nl* list, const number data); number NLF(pop)(nl* list); int NLF(contains)(nl* list, const number data); // Assuming the list is sorted in ascending order, // does it contain the given number? int NLF(sorted_contains)(nl* list, const number data); // Or -1 if not found. ptrdiff_t NLF(sorted_index_of)(nl* list, const number data); #if DEFINE_SORT void NLF(sort)(nl* list, int ascending); #endif Malloc number* NLF(to_array)(nl* list); // Returns the index in the list of the given number, or -1 if it // is not found. 
ptrdiff_t NLF(index_of)(nl* list, const number data); InlineDeclare number NLF(get)(nl* list, size_t n); InlineDeclare number NLF(get_const)(const nl* list, size_t n); InlineDeclare number* NLF(access)(nl* list, size_t n); /** Copy from the list, starting at index "start" for length "length", into the provided array. */ void NLF(copy)(nl* list, size_t start, size_t length, number* vdest); nl* NLF(dupe)(nl* list); void NLF(print)(nl* list); void NLF(insert)(nl* list, size_t indx, const number data); size_t NLF(insert_ascending)(nl* list, const number n); size_t NLF(insert_descending)(nl* list, const number n); // Returns the index at which the element was added, or -1 if it's a duplicate. ptrdiff_t NLF(insert_unique_ascending)(nl* list, const number p); void NLF(set)(nl* list, size_t ind, const number value); void NLF(remove)(nl* list, size_t ind); void NLF(remove_index_range)(nl* list, size_t start, size_t length); // See also sorted_index_of, which should be faster. // Or -1 if not found ptrdiff_t NLF(find_index_ascending)(nl* list, const number value); nl* NLF(merge_ascending)(nl* list1, nl* list2); // returns the index of the removed value, or -1 if it didn't // exist in the list. ptrdiff_t NLF(remove_value)(nl* list, const number value); int NLF(check_consistency)(nl* list); int NLF(check_sorted_ascending)(nl* list, int isunique); int NLF(check_sorted_descending)(nl* list, int isunique); ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2628565 astropy-healpix-0.5/cextern/astrometry.net/bl-nl.inc0000644000077000000240000000105200000000000022571 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. 
# Licensed under a 3-clause BSD style license - see LICENSE */ #include "bl-nl.ph" InlineDefine number NLF(get)(nl* list, size_t n) { number* ptr = (number*)bl_access(list, n); return *ptr; } InlineDefine number NLF(get_const)(const nl* list, size_t n) { number* ptr = (number*)bl_access_const(list, n); return *ptr; } InlineDefine size_t NLF(size)(const nl* list) { return bl_size(list); } InlineDefine number* NLF(access)(nl* list, size_t j) { return (number*)bl_access(list, j); } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2631168 astropy-healpix-0.5/cextern/astrometry.net/bl-nl.ph0000644000077000000240000000042700000000000022434 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ //#define NODE_NUMDATA(node) ((number*)NODE_DATA(node)) #define NLFGLUE2(n,f) n ## _ ## f #define NLFGLUE(n,f) NLFGLUE2(n,f) #define NLF(func) NLFGLUE(nl, func) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.5103812 astropy-healpix-0.5/cextern/astrometry.net/bl.c0000644000077000000240000007762000000000000021651 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #define _GNU_SOURCE /* for GNU extension vasprintf() */ #include #include #include #include #include #include "bl.h" #include "keywords.h" #include "bl.ph" static bl_node* bl_new_node(bl* list); static void bl_remove_from_node(bl* list, bl_node* node, bl_node* prev, int index_in_node); // NOTE: this should be replaced by a proper implementation! 
#ifdef _MSC_VER int vasprintf(char **strp, const char *fmt, va_list ap) {return -1;} #endif // Defined in bl.ph (private header): // free_node // NODE_DATA // NODE_CHARDATA // NODE_INTDATA // NODE_DOUBLEDATA // Defined in bl.inc (inlined functions): // bl_size // bl_access // il_size // il_get // NOTE!, if you make changes here, also see bl-sort.c ! //#define DEFINE_SORT 1 #define DEFINE_SORT 0 #define nl il #define number int #define NL_PRINT(x) printf("%i", x) #include "bl-nl.c" #undef nl #undef number #undef NL_PRINT #define nl ll #define number int64_t #define NL_PRINT(x) printf("%lli", (long long int)x) #include "bl-nl.c" #undef nl #undef number #undef NL_PRINT #define nl fl #define number float #define NL_PRINT(x) printf("%f", (float)x) #include "bl-nl.c" #undef nl #undef number #undef NL_PRINT #define nl dl #define number double #define NL_PRINT(x) printf("%g", x) #include "bl-nl.c" #undef nl #undef number #undef NL_PRINT #undef DEFINE_SORT #define DEFINE_SORT 0 #define nl pl #define number void* #define NL_PRINT(x) printf("%p", x) #include "bl-nl.c" #undef nl #undef number #undef NL_PRINT #undef DEFINE_SORT Pure int bl_datasize(const bl* list) { if (!list) return 0; return list->datasize; } void bl_split(bl* src, bl* dest, size_t split) { bl_node* node; size_t nskipped; size_t ind; size_t ntaken = src->N - split; node = find_node(src, split, &nskipped); ind = split - nskipped; if (ind == 0) { // this whole node belongs to "dest". if (split) { // we need to get the previous node... bl_node* last = find_node(src, split-1, NULL); last->next = NULL; src->tail = last; } else { // we've removed everything from "src". src->head = NULL; src->tail = NULL; } } else { // create a new node to hold the second half of the items in "node". 
bl_node* newnode = bl_new_node(dest); newnode->N = (node->N - ind); newnode->next = node->next; memcpy(NODE_CHARDATA(newnode), NODE_CHARDATA(node) + (ind * src->datasize), newnode->N * src->datasize); node->N -= (node->N - ind); node->next = NULL; src->tail = node; // to make the code outside this block work... node = newnode; } // append it to "dest". if (dest->tail) { dest->tail->next = node; dest->N += ntaken; } else { dest->head = node; dest->tail = node; dest->N += ntaken; } // adjust "src". src->N -= ntaken; src->last_access = NULL; } void bl_init(bl* list, int blocksize, int datasize) { list->head = NULL; list->tail = NULL; list->N = 0; list->blocksize = blocksize; list->datasize = datasize; list->last_access = NULL; list->last_access_n = 0; } bl* bl_new(int blocksize, int datasize) { bl* rtn; rtn = malloc(sizeof(bl)); if (!rtn) { printf("Couldn't allocate memory for a bl.\n"); return NULL; } bl_init(rtn, blocksize, datasize); return rtn; } void bl_free(bl* list) { if (!list) return; bl_remove_all(list); free(list); } void bl_remove_all(bl* list) { bl_node *n, *lastnode; lastnode = NULL; for (n=list->head; n; n=n->next) { if (lastnode) bl_free_node(lastnode); lastnode = n; } if (lastnode) bl_free_node(lastnode); list->head = NULL; list->tail = NULL; list->N = 0; list->last_access = NULL; list->last_access_n = 0; } void bl_remove_all_but_first(bl* list) { bl_node *n, *lastnode; lastnode = NULL; if (list->head) { for (n=list->head->next; n; n=n->next) { if (lastnode) bl_free_node(lastnode); lastnode = n; } if (lastnode) bl_free_node(lastnode); list->head->next = NULL; list->head->N = 0; list->tail = list->head; } else { list->head = NULL; list->tail = NULL; } list->N = 0; list->last_access = NULL; list->last_access_n = 0; } static void bl_remove_from_node(bl* list, bl_node* node, bl_node* prev, int index_in_node) { // if we're removing the last element at this node, then // remove this node from the linked list. 
if (node->N == 1) { // if we're removing the first node... if (prev == NULL) { list->head = node->next; // if it's the first and only node... if (list->head == NULL) { list->tail = NULL; } } else { // if we're removing the last element from // the tail node... if (node == list->tail) { list->tail = prev; } prev->next = node->next; } bl_free_node(node); } else { int ncopy; // just remove this element... ncopy = node->N - index_in_node - 1; if (ncopy > 0) { memmove(NODE_CHARDATA(node) + index_in_node * list->datasize, NODE_CHARDATA(node) + (index_in_node+1) * list->datasize, ncopy * list->datasize); } node->N--; } list->N--; } void bl_remove_index(bl* list, size_t index) { // find the node (and previous node) at which element 'index' // can be found. bl_node *node, *prev; size_t nskipped = 0; for (node=list->head, prev=NULL; node; prev=node, node=node->next) { if (index < (nskipped + node->N)) break; nskipped += node->N; } assert(node); bl_remove_from_node(list, node, prev, index-nskipped); list->last_access = NULL; list->last_access_n = 0; } void bl_remove_index_range(bl* list, size_t start, size_t length) { // find the node (and previous node) at which element 'start' // can be found. bl_node *node, *prev; size_t nskipped = 0; list->last_access = NULL; list->last_access_n = 0; for (node=list->head, prev=NULL; node; prev=node, node=node->next) { if (start < (nskipped + node->N)) break; nskipped += node->N; } // begin by removing any indices that are at the end of a block. if (start > nskipped) { // we're not removing everything at this node. size_t istart; size_t n; istart = start - nskipped; if ((istart + length) < node->N) { // we're removing a chunk of elements from the middle of this // block. move elements from the end into the removed chunk. memmove(NODE_CHARDATA(node) + istart * list->datasize, NODE_CHARDATA(node) + (istart + length) * list->datasize, (node->N - (istart + length)) * list->datasize); // we're done! 
node->N -= length;
            list->N -= length;
            return;
        } else {
            // we're removing everything from 'istart' to the end of this
            // block.  just change the "N" values.
            n = (node->N - istart);
            node->N -= n;
            list->N -= n;
            length -= n;
            start += n;
            nskipped = start;
            prev = node;
            node = node->next;
        }
    }
    // remove complete blocks.
    for (;;) {
        size_t n;
        bl_node* todelete;
        if (length == 0 || length < node->N)
            break;
        // we're skipping this whole block.
        n = node->N;
        length -= n;
        start += n;
        list->N -= n;
        nskipped += n;
        todelete = node;
        node = node->next;
        bl_free_node(todelete);
    }
    if (prev)
        prev->next = node;
    else
        list->head = node;
    if (!node)
        list->tail = prev;
    // remove indices from the beginning of the last block.
    // note that we may have removed everything from the tail of the list,
    // so "node" may be null.
    if (node && length>0) {
        //printf("removing %i from end.\n", length);
        memmove(NODE_CHARDATA(node),
                NODE_CHARDATA(node) + length * list->datasize,
                (node->N - length) * list->datasize);
        node->N -= length;
        list->N -= length;
    }
}

static void clear_list(bl* list) {
    list->head = NULL;
    list->tail = NULL;
    list->N = 0;
    list->last_access = NULL;
    list->last_access_n = 0;
}

void bl_append_list(bl* list1, bl* list2) {
    list1->last_access = NULL;
    list1->last_access_n = 0;
    if (list1->datasize != list2->datasize) {
        printf("Error: cannot append bls with different data sizes!\n");
        assert(0);
        exit(0);
    }
    if (list1->blocksize != list2->blocksize) {
        printf("Error: cannot append bls with different block sizes!\n");
        assert(0);
        exit(0);
    }
    // if list1 is empty, then just copy over list2's head and tail.
    if (list1->head == NULL) {
        list1->head = list2->head;
        list1->tail = list2->tail;
        list1->N = list2->N;
        // remove everything from list2 (to avoid sharing nodes)
        clear_list(list2);
        return;
    }
    // if list2 is empty, then do nothing.
    if (list2->head == NULL)
        return;
    // otherwise, append list2's head to list1's tail.
list1->tail->next = list2->head; list1->tail = list2->tail; list1->N += list2->N; // remove everything from list2 (to avoid sharing nodes) clear_list(list2); } static bl_node* bl_new_node(bl* list) { bl_node* rtn; // merge the mallocs for the node and its data into one malloc. rtn = malloc(sizeof(bl_node) + list->datasize * list->blocksize); if (!rtn) { printf("Couldn't allocate memory for a bl node!\n"); return NULL; } //rtn->data = (char*)rtn + sizeof(bl_node); rtn->N = 0; rtn->next = NULL; return rtn; } static void bl_append_node(bl* list, bl_node* node) { node->next = NULL; if (!list->head) { // first node to be added. list->head = node; list->tail = node; } else { list->tail->next = node; list->tail = node; } list->N += node->N; } /* * Append an item to this bl node. If this node is full, then create a new * node and insert it into the list. * * Returns the location where the new item was copied. */ void* bl_node_append(bl* list, bl_node* node, const void* data) { void* dest; if (node->N == list->blocksize) { // create a new node and insert it after the current node. bl_node* newnode; newnode = bl_new_node(list); newnode->next = node->next; node->next = newnode; if (list->tail == node) list->tail = newnode; node = newnode; } // space remains at this node. add item. dest = NODE_CHARDATA(node) + node->N * list->datasize; if (data) memcpy(dest, data, list->datasize); node->N++; list->N++; return dest; } void* bl_append(bl* list, const void* data) { if (!list->tail) // empty list; create a new node. bl_append_node(list, bl_new_node(list)); // append the item to the tail. if the tail node is full, a new tail node may be created. 
return bl_node_append(list, list->tail, data); } void* bl_push(bl* list, const void* data) { return bl_append(list, data); } void bl_pop(bl* list, void* into) { assert(list->N > 0); bl_get(list, list->N-1, into); bl_remove_index(list, list->N-1); } void bl_print_structure(bl* list) { bl_node* n; printf("bl: head %p, tail %p, N %zu\n", list->head, list->tail, list->N); for (n=list->head; n; n=n->next) { printf("[N=%i] ", n->N); } printf("\n"); } void bl_get(bl* list, size_t n, void* dest) { char* src; assert(list->N > 0); src = bl_access(list, n); memcpy(dest, src, list->datasize); } static void bl_find_ind_and_element(bl* list, const void* data, int (*compare)(const void* v1, const void* v2), void** presult, ptrdiff_t* pindex) { ptrdiff_t lower, upper; int cmp = -2; void* result; lower = -1; upper = list->N; while (lower < (upper-1)) { ptrdiff_t mid; mid = (upper + lower) / 2; cmp = compare(data, bl_access(list, mid)); if (cmp >= 0) { lower = mid; } else { upper = mid; } } if (lower == -1 || compare(data, (result = bl_access(list, lower)))) { *presult = NULL; if (pindex) *pindex = -1; return; } *presult = result; if (pindex) *pindex = lower; } /** * Finds a node for which the given compare() function * returns zero when passed the given 'data' pointer * and elements from the list. 
*/ void* bl_find(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)) { void* rtn; bl_find_ind_and_element(list, data, compare, &rtn, NULL); return rtn; } ptrdiff_t bl_find_index(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)) { void* val; ptrdiff_t ind; bl_find_ind_and_element(list, data, compare, &val, &ind); return ind; } size_t bl_insert_sorted(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)) { ptrdiff_t lower, upper; lower = -1; upper = list->N; while (lower < (upper-1)) { ptrdiff_t mid; int cmp; mid = (upper + lower) / 2; cmp = compare(data, bl_access(list, mid)); if (cmp >= 0) { lower = mid; } else { upper = mid; } } bl_insert(list, lower+1, data); return lower+1; } ptrdiff_t bl_insert_unique_sorted(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)) { // This is just straightforward binary search - really should // use the block structure... ptrdiff_t lower, upper; lower = -1; upper = list->N; while (lower < (upper-1)) { ptrdiff_t mid; int cmp; mid = (upper + lower) / 2; cmp = compare(data, bl_access(list, mid)); if (cmp >= 0) { lower = mid; } else { upper = mid; } } if (lower >= 0) { if (compare(data, bl_access(list, lower)) == 0) { return BL_NOT_FOUND; } } bl_insert(list, lower+1, data); return lower+1; } void bl_set(bl* list, size_t index, const void* data) { bl_node* node; size_t nskipped; void* dataloc; node = find_node(list, index, &nskipped); dataloc = NODE_CHARDATA(node) + (index - nskipped) * list->datasize; memcpy(dataloc, data, list->datasize); // update the last_access member... list->last_access = node; list->last_access_n = nskipped; } /** * Insert the element "data" into the list, such that its index is "index". * All elements that previously had indices "index" and above are moved * one position to the right. 
*/ void bl_insert(bl* list, size_t index, const void* data) { bl_node* node; size_t nskipped; if (list->N == index) { bl_append(list, data); return; } node = find_node(list, index, &nskipped); list->last_access = node; list->last_access_n = nskipped; // if the node is full: // if we're inserting at the end of this node, then create a new node. // else, shift all but the last element, add in this element, and // add the last element to a new node. if (node->N == list->blocksize) { int localindex, nshift; bl_node* next = node->next; bl_node* destnode; localindex = index - nskipped; // if the next node exists and is not full, then insert the overflowing // element at the front. otherwise, create a new node. if (next && (next->N < list->blocksize)) { // shift the existing elements up by one position... memmove(NODE_CHARDATA(next) + list->datasize, NODE_CHARDATA(next), next->N * list->datasize); destnode = next; } else { // create and insert a new node. bl_node* newnode = bl_new_node(list); newnode->next = next; node->next = newnode; if (!newnode->next) list->tail = newnode; destnode = newnode; } if (localindex == node->N) { // the new element becomes the first element in the destination node. memcpy(NODE_CHARDATA(destnode), data, list->datasize); } else { // the last element in this node is added to the destination node. memcpy(NODE_CHARDATA(destnode), NODE_CHARDATA(node) + (node->N-1)*list->datasize, list->datasize); // shift the end portion of this node up by one... nshift = node->N - localindex - 1; memmove(NODE_CHARDATA(node) + (localindex+1) * list->datasize, NODE_CHARDATA(node) + localindex * list->datasize, nshift * list->datasize); // insert the new element... memcpy(NODE_CHARDATA(node) + localindex * list->datasize, data, list->datasize); } destnode->N++; list->N++; } else { // shift... 
int localindex, nshift; localindex = index - nskipped; nshift = node->N - localindex; memmove(NODE_CHARDATA(node) + (localindex+1) * list->datasize, NODE_CHARDATA(node) + localindex * list->datasize, nshift * list->datasize); // insert... memcpy(NODE_CHARDATA(node) + localindex * list->datasize, data, list->datasize); node->N++; list->N++; } } void* bl_access_const(const bl* list, size_t n) { bl_node* node; size_t nskipped; node = find_node(list, n, &nskipped); // grab the element. return NODE_CHARDATA(node) + (n - nskipped) * list->datasize; } void bl_copy(bl* list, size_t start, size_t length, void* vdest) { bl_node* node; size_t nskipped; char* dest; if (length <= 0) return; node = find_node(list, start, &nskipped); // we've found the node containing "start". keep copying elements and // moving down the list until we've copied all "length" elements. dest = vdest; while (length > 0) { size_t take, avail; char* src; // number of elements we want to take. take = length; // number of elements available at this node. avail = node->N - (start - nskipped); if (take > avail) take = avail; src = NODE_CHARDATA(node) + (start - nskipped) * list->datasize; memcpy(dest, src, take * list->datasize); dest += take * list->datasize; start += take; length -= take; nskipped += node->N; node = node->next; } // update the last_access member... list->last_access = node; list->last_access_n = nskipped; } int bl_check_consistency(bl* list) { bl_node* node; size_t N; int tailok = 1; int nempty = 0; int nnull = 0; // if one of head or tail is NULL, they had both better be NULL! if (!list->head) nnull++; if (!list->tail) nnull++; if (nnull == 1) { fprintf(stderr, "bl_check_consistency: head is %p, and tail is %p.\n", list->head, list->tail); return 1; } N = 0; for (node=list->head; node; node=node->next) { N += node->N; if (!node->N) { // this block is empty. nempty++; } // are we at the last node? if (!node->next) { tailok = (list->tail == node) ? 
1 : 0;
        }
    }
    if (!tailok) {
        fprintf(stderr, "bl_check_consistency: tail pointer is wrong.\n");
        return 1;
    }
    if (nempty) {
        fprintf(stderr, "bl_check_consistency: %i empty blocks.\n", nempty);
        return 1;
    }
    if (N != list->N) {
        fprintf(stderr, "bl_check_consistency: list->N is %zu, but sum of blocks is %zu.\n",
                list->N, N);
        return 1;
    }
    return 0;
}

int bl_check_sorted(bl* list,
                    int (*compare)(const void* v1, const void* v2),
                    int isunique) {
    size_t i, N;
    size_t nbad = 0;
    void* v2 = NULL;
    N = bl_size(list);
    if (N)
        v2 = bl_access(list, 0);
    for (i=1; i<N; i++) {
        void* v1;
        int cmp;
        v1 = v2;
        v2 = bl_access(list, i);
        cmp = compare(v1, v2);
        if (isunique) {
            if (cmp >= 0) {
                nbad++;
            }
        } else {
            if (cmp > 0) {
                nbad++;
            }
        }
    }
    if (nbad) {
        fprintf(stderr, "bl_check_sorted: %zu are out of order.\n", nbad);
        return 1;
    }
    return 0;
}

static void memswap(void* v1, void* v2, int len) {
    unsigned char tmp;
    unsigned char* c1 = v1;
    unsigned char* c2 = v2;
    int i;
    for (i=0; i<len; i++) {
        tmp = *c1;
        *c1 = *c2;
        *c2 = tmp;
        c1++;
        c2++;
    }
}

void bl_reverse(bl* list) {
    // reverse each block, and reverse the order of the blocks.
    pl* blocks;
    bl_node* node;
    bl_node* lastnode;
    int i;
    // reverse each block, remembering the block order.
    blocks = pl_new(256);
    for (node=list->head; node; node=node->next) {
        for (i=0; i<(node->N/2); i++) {
            memswap(NODE_CHARDATA(node) + i * list->datasize,
                    NODE_CHARDATA(node) + (node->N - 1 - i) * list->datasize,
                    list->datasize);
        }
        pl_append(blocks, node);
    }
    // reverse the blocks
    lastnode = NULL;
    for (i=pl_size(blocks)-1; i>=0; i--) {
        node = pl_get(blocks, i);
        if (lastnode)
            lastnode->next = node;
        lastnode = node;
    }
    if (lastnode)
        lastnode->next = NULL;
    pl_free(blocks);
    // swap head and tail
    node = list->head;
    list->head = list->tail;
    list->tail = node;
    list->last_access = NULL;
    list->last_access_n = 0;
}

void* bl_extend(bl* list) {
    return bl_append(list, NULL);
}

// special-case pointer list accessors...
int bl_compare_pointers_ascending(const void* v1, const void* v2) {
    void* p1 = *(void**)v1;
    void* p2 = *(void**)v2;
    if (p1 > p2) return 1;
    else if (p1 < p2) return -1;
    else return 0;
}

void pl_free_elements(pl* list) {
    size_t i;
    for (i=0; i<pl_size(list); i++) {
        free(pl_get(list, i));
    }
}

size_t pl_insert_sorted(pl* list, const void* data,
                        int (*compare)(const void* v1, const void* v2)) {
    ptrdiff_t lower, upper;
    lower = -1;
    upper = list->N;
    while (lower < (upper-1)) {
        ptrdiff_t mid;
        int cmp;
        mid = (upper + lower) / 2;
        cmp = compare(data, pl_get(list, mid));
        if (cmp >= 0) {
            lower = mid;
        } else {
            upper = mid;
        }
    }
    bl_insert(list, lower+1, &data);
    return lower+1;
}

/*
void pl_set(pl* list, int index, void* data) {
    int i;
    int nadd = (index+1) - list->N;
    if (nadd > 0) {
        // enlarge the list to hold 'nadd' more entries.
        for (i=0; i<nadd; i++)
            pl_append(list, NULL);
    }
    bl_set(list, index, &data);
}
*/

sl* sl_new(int blocksize) {
    return bl_new(blocksize, sizeof(char*));
}

void sl_init2(sl* list, int blocksize) {
    bl_init(list, blocksize, sizeof(char*));
}

void sl_free_nonrecursive(sl* list) {
    bl_free(list);
}

void sl_free2(sl* list) {
    if (!list) return;
    sl_remove_all(list);
    bl_free(list);
}

void sl_remove_all(sl* list) {
    size_t i;
    for (i=0; i<sl_size(list); i++)
        free(sl_get(list, i));
    bl_remove_all(list);
}

ptrdiff_t sl_index_of(sl* lst, const char* str) {
    size_t i;
    for (i=0; i<sl_size(lst); i++) {
        char* s = sl_get(lst, i);
        if (strcmp(s, str) == 0)
            return i;
    }
    return BL_NOT_FOUND;
}

ptrdiff_t sl_last_index_of(sl* lst, const char* str) {
    ptrdiff_t i;
    for (i=sl_size(lst)-1; i>=0; i--) {
        char* s = sl_get(lst, i);
        if (strcmp(s, str) == 0)
            return i;
    }
    return BL_NOT_FOUND;
}

// Returns 0 if the string is not in the sl, 1 otherwise.
// (same as sl_index_of(lst, str) > -1)
int sl_contains(sl* lst, const char* str) {
    return (sl_index_of(lst, str) > -1);
}

void sl_reverse(sl* list) {
    bl_reverse(list);
}

char* sl_append(sl* list, const char* data) {
    char* copy;
    if (data) {
        copy = strdup(data);
        assert(copy);
    } else
        copy = NULL;
    pl_append(list, copy);
    return copy;
}

void sl_append_array(sl* list, const char** strings, size_t n) {
    size_t i;
    for (i=0; i<n; i++)
        sl_append(list, strings[i]);
}

void sl_append_nocopy(sl* list, const char* string) {
    pl_append(list, (char*)string);
}

char* sl_push(sl* list, const char* data) {
    return sl_append(list, data);
}

char* sl_pop(sl* list) {
    char* rtn;
    assert(list->N);
    bl_get(list, list->N-1, &rtn);
    bl_remove_index(list, list->N-1);
    return rtn;
}

char* sl_get(sl* list, size_t n) {
    char* rtn;
    assert(n < list->N);
    bl_get(list, n, &rtn);
    return rtn;
}

char* sl_get_const(const sl* list, size_t n) {
    char* rtn;
    memcpy(&rtn, bl_access_const(list, n), sizeof(char*));
    return rtn;
}

char* sl_set(sl* list, size_t index, const char* value) {
    char* copy;
    assert(index >= 0);
    copy = strdup(value);
    if (index < list->N) {
        // we're replacing an existing value - free it!
        free(sl_get(list, index));
        bl_set(list, index, &copy);
    } else {
        // pad
        size_t i;
        for (i=list->N; i<index; i++)
            sl_append(list, NULL);
        // add the new value.
        sl_append_nocopy(list, copy);
    }
    return copy;
}

void sl_remove_index_range(sl* list, size_t start, size_t length) {
    size_t i;
    assert(start + length <= list->N);
    assert(start >= 0);
    assert(length >= 0);
    for (i=0; i<length; i++)
        free(sl_get(list, start + i));
    bl_remove_index_range(list, start, length);
}

void sl_remove(sl* list, size_t index) {
    sl_remove_index_range(list, index, 1);
}

ptrdiff_t sl_remove_string(sl* list, const char* string) {
    size_t i;
    for (i=0; i<sl_size(list); i++) {
        char* str = sl_get(list, i);
        // pointer match, not strcmp().
        if (str == string) {
            bl_remove_index(list, i);
            return i;
        }
    }
    return BL_NOT_FOUND;
}

char* sl_remove_string_bycaseval(sl* list, const char* string) {
    size_t i;
    for (i=0; i<sl_size(list); i++) {
        char* str = sl_get(list, i);
        if (strcasecmp(str, string) == 0) {
            bl_remove_index(list, i);
            return str;
        }
    }
    return NULL;
}

ptrdiff_t sl_remove_string_byval(sl* list, const char* string) {
    size_t i;
    for (i=0; i<sl_size(list); i++) {
        char* str = sl_get(list, i);
        if (strcmp(str, string) == 0) {
            free(str);
            bl_remove_index(list, i);
            return i;
        }
    }
    return BL_NOT_FOUND;
}

void sl_remove_from(sl* list, size_t start) {
    if (start >= sl_size(list))
        return;
    sl_remove_index_range(list, start, sl_size(list) - start);
}

void sl_merge_lists(sl* list1, sl* list2) {
    bl_append_list(list1, list2);
}

void sl_print(sl* list) {
    bl_node* n;
    size_t i;
    for (n=list->head; n; n=n->next) {
        printf("[\n");
        for (i=0; i<n->N; i++)
            printf("  \"%s\"\n", ((char**)NODE_DATA(n))[i]);
        printf("]\n");
    }
}

static char* sljoin(sl* list, const char* join, int forward) {
    size_t start, end, inc;
    size_t len = 0;
    size_t i, N;
    char* rtn;
    size_t offset;
    size_t JL;
    if (sl_size(list) == 0)
        return strdup("");
    // step through the list forward or backward?
if (forward) {
        start = 0;
        end = sl_size(list);
        inc = 1;
    } else {
        start = sl_size(list) - 1;
        end = -1;
        inc = -1;
    }
    JL = strlen(join);
    N = sl_size(list);
    // first pass: total length of the strings plus the separators.
    for (i=0; i<N; i++)
        len += strlen(sl_get(list, i));
    len += ((N - 1) * JL);
    rtn = malloc(len + 1);
    if (!rtn)
        return rtn;
    // second pass: copy the strings (and separators) into place.
    offset = 0;
    for (i=start; i!=end; i+=inc) {
        char* str = sl_get(list, i);
        size_t L = strlen(str);
        if (i != start) {
            memcpy(rtn + offset, join, JL);
            offset += JL;
        }
        memcpy(rtn + offset, str, L);
        offset += L;
    }
    rtn[offset] = '\0';
    return rtn;
}

char* sl_implode(sl* list, const char* join) {
    return sljoin(list, join, 1);
}

char* sl_join(sl* list, const char* join) {
    return sljoin(list, join, 1);
}

char* sl_join_reverse(sl* list, const char* join) {
    return sljoin(list, join, 0);
}

char* sl_appendvf(sl* list, const char* format, va_list va) {
    char* str;
    int len;
    va_list vb;
    va_copy(vb, va);
    len = vsnprintf(NULL, 0, format, va);
    if (len < 0) {
        va_end(vb);
        return NULL;
    }
    str = malloc(len + 1);
    if (!str) {
        va_end(vb);
        return NULL;
    }
    vsnprintf(str, len + 1, format, vb);
    va_end(vb);
    sl_append_nocopy(list, str);
    return str;
}

char* sl_appendf(sl* list, const char* format, ...) {
    char* str;
    va_list lst;
    va_start(lst, format);
    str = sl_appendvf(list, format, lst);
    va_end(lst);
    return str;
}

/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
#ifndef BL_H
#define BL_H

#include <stddef.h>
#include <stdarg.h>
#include <string.h>

#ifdef _MSC_VER
#define strcasecmp _stricmp
#endif

#ifdef _MSC_VER
#if _MSC_VER >= 1600
#include <stdint.h>
#else
#include
#endif
#else
#include <stdint.h>
#endif

#include "keywords.h"

struct bl_node {
    // number of elements filled.
    int N;
    struct bl_node* next;
    // (data block implicitly follows this struct).
};
typedef struct bl_node bl_node;

// the top-level data structure of a blocklist.
typedef struct {
    bl_node* head;
    bl_node* tail;
    // the total number of data elements
    size_t N;
    // the number of elements per block
    int blocksize;
    // the size in bytes of each data element
    int datasize;
    // rapid accessors for "jumping in" at the last block accessed
    bl_node* last_access;
    size_t last_access_n;
} bl;

#define BL_NOT_FOUND (ptrdiff_t)(-1)

Malloc bl* bl_new(int blocksize, int datasize);
void bl_init(bl* l, int blocksize, int datasize);
void bl_free(bl* list);
void bl_remove_all(bl* list);
Pure InlineDeclare size_t bl_size(const bl* list);
Pure int bl_datasize(const bl* list);
/** Appends an element, returning the location whereto it was copied. */
void* bl_append(bl* list, const void* data);
// Copies the nth element into the destination location.
void bl_get(bl* list, size_t n, void* dest);
// Returns a pointer to the nth element.
InlineDeclare void* bl_access(bl* list, size_t n);

void* bl_access_const(const bl* list, size_t n);

void* bl_push(bl* list, const void* data);
// Pops a data item into the given "into" memory.
void bl_pop(bl* list, void* into);

// allocates space for a new object and returns a pointer to it
void* bl_extend(bl* list);

/**
 Removes elements from \c split
 to the end of the list from \c src and appends them to \c dest.
 */
void bl_split(bl* src, bl* dest, size_t split);

void bl_reverse(bl* list);

/*
 * Appends "list2" to the end of "list1", and removes all elements
 * from "list2".
*/ void bl_append_list(bl* list1, bl* list2); void bl_insert(bl* list, size_t indx, const void* data); void bl_set(bl* list, size_t indx, const void* data); void bl_print_structure(bl* list); void bl_copy(bl* list, size_t start, size_t length, void* vdest); /** * Inserts the given datum into the list in such a way that the list * stays sorted in ascending order according to the given comparison * function (assuming it was sorted to begin with!). * * The inserted element will be placed _after_ existing elements with * the same value. * * The comparison function is the same as qsort's: it should return * 1 if the first arg is greater than the second arg * 0 if they're equal * -1 if the first arg is smaller. * * The index where the element was inserted is returned. */ size_t bl_insert_sorted(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)); /** If the item already existed in the list (ie, the compare function returned zero), then -1 is returned. Otherwise, the index at which the item was inserted is returned. */ ptrdiff_t bl_insert_unique_sorted(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)); /* Removes all the elements, but doesn't free the first block, which makes it slightly faster for the case when you're going to add more elements right away, since you don't have to free() the old block then immediately malloc() a new block. */ void bl_remove_all_but_first(bl* list); void bl_remove_index(bl* list, size_t indx); void bl_remove_index_range(bl* list, size_t start, size_t length); void* bl_find(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)); ptrdiff_t bl_find_index(bl* list, const void* data, int (*compare)(const void* v1, const void* v2)); // returns 0 if okay, 1 if an error is detected. int bl_check_consistency(bl* list); // returns 0 if okay, 1 if an error is detected. 
int bl_check_sorted(bl* list,
                    int (*compare)(const void* v1, const void* v2),
                    int isunique);

///////////////////////////////////////////////
// special-case functions for string lists.  //
///////////////////////////////////////////////

/*
 sl makes a copy of the string using strdup().
 It will be freed when the string is removed from the list or the list is freed.
 */
typedef bl sl;

sl* sl_new(int blocksize);

/*
 The functions:
   sl_init() --->  sl_init2()
   sl_free() --->  sl_free2()
   sl_add()  --->  sl_add2()
   sl_find() --->  sl_find2()
 are defined by BSD, where they live in libc.  We therefore avoid these
 names, which breaks the principle of least surprise, but makes life a
 bit easier.
 */
void sl_init2(sl* list, int blocksize);

// free this list and all the strings it contains.
void sl_free2(sl* list);

void sl_append_contents(sl* dest, sl* src);

// Searches the sl for the given string.  Comparisons use strcmp().
// Returns -1 if the string is not found, or the first index where it was found.
ptrdiff_t sl_index_of(sl* lst, const char* str);
ptrdiff_t sl_last_index_of(sl* lst, const char* str);

// Returns 0 if the string is not in the sl, 1 otherwise.
// (same as sl_index_of(lst, str) > -1)
int sl_contains(sl* lst, const char* str);

// just free the list structure, not the strings in it.
void sl_free_nonrecursive(sl* list);

Pure InlineDeclare size_t sl_size(const sl* list);

// copies the string and enqueues it; returns the newly-allocated string.
char* sl_append(sl* list, const char* string);
// appends the string; doesn't copy it.
void sl_append_nocopy(sl* list, const char* string);

void sl_append_array(sl* list, const char** strings, size_t n);

// copies the string and pushes the copy.  Returns the copy.
char* sl_push(sl* list, const char* data);
// returns the last string: it's your responsibility to free it.
char* sl_pop(sl* list);

char* sl_get(sl* list, size_t n);
char* sl_get_const(const sl* list, size_t n);

// sets the string at the given index to the given value.
// if there is already a string at that index, frees it. char* sl_set(sl* list, size_t n, const char* val); int sl_check_consistency(sl* list); // inserts a copy of the given string. char* sl_insert(sl* list, size_t indx, const char* str); // inserts the given string. void sl_insert_nocopy(sl* list, size_t indx, const char* str); // frees all the strings and removes them from the list. void sl_remove_all(sl* list); // inserts the string; doesn't copy it. void sl_insert_sorted_nocopy(sl* list, const char* string); // inserts a copy of the string; returns the newly-allocated string. char* sl_insert_sorted(sl* list, const char* string); // Inserts the (newly-allocated) formatted string and returns it. char* #ifdef __GNUC__ ATTRIB_FORMAT(printf,2,3) #endif sl_insert_sortedf(sl* list, const char* format, ...); void sl_remove_index_range(sl* list, size_t start, size_t length); void sl_remove(sl* list, size_t index); // Removes "string" if it is found in the list. // Note that this checks pointer match, not a strcmp() match. // Returns the index where the string was found, or -1 if it wasn't found. ptrdiff_t sl_remove_string(sl* list, const char* string); // Removes "string" if it is found in the list, using strcasecmp(). // Returns the string or NULL if not found. char* sl_remove_string_bycaseval(sl* list, const char* string); // Removes "string" if it is found in the list, using strcmp(). // Returns the index where the string was found, or -1 if it wasn't found. ptrdiff_t sl_remove_string_byval(sl* list, const char* string); // remove all elements starting from "start" to the end of the list. void sl_remove_from(sl* list, size_t start); void sl_merge_lists(sl* list1, sl* list2); void sl_print(sl* list); /* Removes duplicate entries, using strcmp(). */ void sl_remove_duplicates(sl* lst); /* Splits the given string 'str' into substrings separated by 'sepstring'. (Some of the substrings may be empty, for example if the 'sepstring' appears consecutively.) 
Adds them to 'lst', if non-NULL.  Allocates and fills a new sl* if 'lst' is NULL.

 The original string can be reconstructed by calling "sl_implode(lst, sepstring)"
 */
sl* sl_split(sl* lst, const char* str, const char* sepstring);

// Like the PHP function implode(), joins each element in the list with the given
// "join" string.  The result is a newly-allocated string containing:
//   sl_get(list, 0) + join + sl_get(list, 1) + join + ... + join + sl_get(list, N-1)
// -AKA sl_join.
char* sl_implode(sl* list, const char* join);

// like Python's joinstring.join(list)
// -AKA sl_implode
char* sl_join(sl* list, const char* joinstring);

// same as sl_join(reverse(list), str)
char* sl_join_reverse(sl* list, const char* join);

// Appends the (newly-allocated) formatted string and returns it.
char*
#ifdef __GNUC__
ATTRIB_FORMAT(printf,2,3)
#endif
sl_appendf(sl* list, const char* format, ...);

// Appends the (newly-allocated) formatted string and returns it.
char* sl_appendvf(sl* list, const char* format, va_list va);

// Inserts the (newly-allocated) formatted string and returns it.
char*
#ifdef __GNUC__
ATTRIB_FORMAT(printf,3,4)
#endif
sl_insertf(sl* list, size_t index, const char* format, ...);

#define DEFINE_SORT 1
#define nl il
#define number int
#include "bl-nl.h"
#undef nl
#undef number

#define nl ll
#define number int64_t
#include "bl-nl.h"
#undef nl
#undef number

#define nl dl
#define number double
#include "bl-nl.h"
#undef nl
#undef number

#define nl fl
#define number float
#include "bl-nl.h"
#undef nl
#undef number
#undef DEFINE_SORT

#define DEFINE_SORT 0
#define nl pl
#define number void*
#include "bl-nl.h"
#undef nl
#undef number
#undef DEFINE_SORT

//////// Special functions ////////
void pl_free_elements(pl* list);
size_t pl_insert_sorted(pl* list, const void* data,
                        int (*compare)(const void* v1, const void* v2));

#ifdef INCLUDE_INLINE_SOURCE
#define InlineDefine InlineDefineH
#include "bl.inc"

#define nl il
#define number int
#include "bl-nl.inc"
#undef nl
#undef number

#define nl ll
#define number int64_t
#include "bl-nl.inc"
#undef nl
#undef number

#define nl pl
#define number void*
#include "bl-nl.inc"
#undef nl
#undef number

#define nl dl
#define number double
#include "bl-nl.inc"
#undef nl
#undef number

#define nl fl
#define number float
#include "bl-nl.inc"
#undef nl
#undef number

#undef InlineDefine
#endif

#endif

astropy-healpix-0.5/cextern/astrometry.net/bl.inc

/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
#include <assert.h>
#include "bl.ph"

/* find the node in which element "n" can be found. */
InlineDefine
bl_node* find_node(const bl* list, size_t n, size_t* p_nskipped) {
    bl_node* node;
    size_t nskipped;
    if (list->last_access && n >= list->last_access_n) {
        // take the shortcut!
nskipped = list->last_access_n; node = list->last_access; } else { node = list->head; nskipped = 0; } for (; node;) { if (n < (nskipped + node->N)) break; nskipped += node->N; node = node->next; } assert(node); if (p_nskipped) *p_nskipped = nskipped; return node; } InlineDefine void* bl_access(bl* list, size_t n) { void* rtn; bl_node* node; size_t nskipped; node = find_node(list, n, &nskipped); // grab the element. rtn = NODE_CHARDATA(node) + (n - nskipped) * list->datasize; // update the last_access member... list->last_access = node; list->last_access_n = nskipped; return rtn; } InlineDefine size_t bl_size(const bl* list) { return list->N; } InlineDefine size_t sl_size(const sl* list) { return bl_size(list); } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2647088 astropy-healpix-0.5/cextern/astrometry.net/bl.ph0000644000077000000240000000105600000000000022024 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ /// Private header file shared between bl.inc and bl.c InlineDeclare bl_node* find_node(const bl* list, size_t n, size_t* rtn_nskipped); // data follows the bl_node*. 
#define NODE_DATA(node)       ((void*)(((bl_node*)(node)) + 1))
#define NODE_CHARDATA(node)   ((char*)(((bl_node*)(node)) + 1))
#define NODE_INTDATA(node)    ((int*)(((bl_node*)(node)) + 1))
#define NODE_DOUBLEDATA(node) ((double*)(((bl_node*)(node)) + 1))

#define bl_free_node free

astropy-healpix-0.5/cextern/astrometry.net/example.c

// Simple hello world example
#include <stdio.h>
#include "healpix.h"

int main(int argc, char const *argv[]) {
    double side_length = healpix_side_length_arcmin(4);
    printf("side_length: %f\n", side_length);
    return 0;
}

astropy-healpix-0.5/cextern/astrometry.net/healpix-utils.c

/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE */ #include "bl.h" #include "healpix.h" #include "mathutil.h" #include "starutil.h" #include ll* healpix_region_search(int seed, ll* seeds, int Nside, ll* accepted, ll* rejected, int (*accept)(int hp, void* token), void* token, int depth) { ll* frontier; anbool allocd_rej = FALSE; int d; if (!accepted) accepted = ll_new(256); if (!rejected) { rejected = ll_new(256); allocd_rej = TRUE; } if (seeds) //frontier = seeds; frontier = ll_dupe(seeds); else { frontier = ll_new(256); ll_append(frontier, seed); } for (d=0; !depth || dN, *indices); } else { fprintf(stderr, "malloc failed\n"); n_pixels = -1; } ll_free(hps); return n_pixels; } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.5114274 astropy-healpix-0.5/cextern/astrometry.net/healpix-utils.h0000644000077000000240000000312500000000000024036 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #ifndef HEALPIX_UTILS_H #define HEALPIX_UTILS_H #include "bl.h" /** Returns healpixes that are / may be within range of the given point, resp. */ il* healpix_rangesearch_xyz(const double* xyz, double radius, int Nside, il* hps); il* healpix_rangesearch_xyz_approx(const double* xyz, double radius, int Nside, il* hps); il* healpix_rangesearch_radec_approx(double ra, double dec, double radius, int Nside, il* hps); il* healpix_rangesearch_radec(double ra, double dec, double radius, int Nside, il* hps); int64_t healpix_rangesearch_radec_simple(double ra, double dec, double radius, int Nside, int approx, int64_t **indices); /** Starting from a "seed" or list of "seeds" healpixes, grows a region by looking at healpix neighbours. Accepts healpixes for which the "accept" function returns 1. Returns the healpixes that are accepted. The accepted results are placed in "accepted", if non-NULL, or in a newly-allocated list. 
If "rejected" is non-NULL, the healpixes that are rejected will be put there.

 If "depth" is non-zero, that number of neighbour steps will be taken.
 Zero means no limit.

 NOTE that any existing entries in the "accepted" list will be treated
 as having already been accepted: when the search reaches them, their
 neighbours will not be added to the frontier to explore.
 */
il* healpix_region_search(int seed, il* seeds, int Nside,
                          il* accepted, il* rejected,
                          int (*accept)(int hp, void* token),
                          void* token,
                          int depth);

#endif

astropy-healpix-0.5/cextern/astrometry.net/healpix.c

/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
#include <math.h>
#include <stdio.h>
#include <string.h>
#include <assert.h>

#include "os-features.h"
#include "healpix.h"
#include "mathutil.h"
#include "starutil.h"
#include "keywords.h"
#include "permutedsort.h"

#ifndef M_PI
# define M_PI 3.14159265358979323846
#endif

// Internal type
struct hp_s {
    int bighp;
    int x;
    int y;
};
typedef struct hp_s hp_t;

static int64_t hptointl(hp_t hp, int Nside) {
    return healpixl_compose_xy(hp.bighp, hp.x, hp.y, Nside);
}

static void intltohp(int64_t pix, hp_t* hp, int Nside) {
    healpixl_decompose_xy(pix, &hp->bighp, &hp->x, &hp->y, Nside);
}

static void hp_decompose(hp_t* hp, int* php, int* px, int* py) {
    if (php)
        *php = hp->bighp;
    if (px)
        *px = hp->x;
    if (py)
        *py = hp->y;
}

// I've had troubles with rounding functions being declared properly
// in other contexts...  Declare it here so the compiler complains if
// something is wrong.
#ifdef _MSC_VER double round(double x) { return floor(x + 0.5); } #else double round(double x); #endif Const static Inline double mysquare(double d) { return d*d; } Const int64_t healpixl_xy_to_nested(int64_t hp, int Nside) { int bighp,x,y; int64_t index; int i; int64_t ns2 = (int64_t)Nside * (int64_t)Nside; if (hp < 0 || Nside < 0) return -1; healpixl_decompose_xy(hp, &bighp, &x, &y, Nside); if (!is_power_of_two(Nside)) { fprintf(stderr, "healpix_xy_to_nested: Nside must be a power of two.\n"); return -1; } // We construct the index called p_n' in the healpix paper, whose bits // are taken from the bits of x and y: // x = ... b4 b2 b0 // y = ... b5 b3 b1 // We go through the bits of x,y, building up "index": index = 0; for (i=0; i<(8*sizeof(int64_t)/2); i++) { index |= (int64_t)(((y & 1) << 1) | (x & 1)) << (i*2); y >>= 1; x >>= 1; if (!x && !y) break; } return index + (int64_t)bighp * ns2; } Const int64_t healpixl_nested_to_xy(int64_t hp, int Nside) { int bighp, x, y; int64_t index; int64_t ns2 = (int64_t)Nside * (int64_t)Nside; int i; if (hp < 0 || Nside < 0) return -1; if (!is_power_of_two(Nside)) { fprintf(stderr, "healpix_xy_to_nested: Nside must be a power of two.\n"); return -1; } bighp = (int)(hp / ns2); index = hp % ns2; x = y = 0; for (i=0; i<(8*sizeof(int64_t)/2); i++) { x |= (index & 0x1) << i; index >>= 1; y |= (index & 0x1) << i; index >>= 1; if (!index) break; } return healpixl_compose_xy(bighp, x, y, Nside); } Const int64_t healpixl_compose_ring(int ring, int longind, int Nside) { if (ring <= Nside) // north polar return (int64_t)ring * ((int64_t)ring-1) * 2 + (int64_t)longind; if (ring < 3*Nside) // equatorial return (int64_t)Nside*((int64_t)Nside-1)*2 + (int64_t)Nside*4*((int64_t)ring-(int64_t)Nside) + (int64_t)longind; { int64_t ri; ri = 4*(int64_t)Nside - (int64_t)ring; return 12*(int64_t)Nside*(int64_t)Nside-1 - ( ri*(ri-1)*2 + (ri*4 - 1 - (int64_t)longind) ); } } void healpixl_decompose_ring(int64_t hp, int Nside, int* p_ring, int* p_longind) 
{ int64_t longind; int64_t offset = 0; int64_t Nside64; int64_t ns2; int ring; double x; Nside64 = (int64_t)Nside; ns2 = Nside64 * Nside64; if (hp < 2 * ns2) { ring = (int)(0.5 + sqrt(0.25 + 0.5 * hp)); offset = 2 * (int64_t)ring * ((int64_t)ring - 1); // The sqrt above can introduce precision issues that can cause ring to // be off by 1, so we check whether the offset is now larger than the HEALPix // value, and if so we need to adjust ring and offset accordingly if (offset > hp) { ring -= 1; offset = 2 * (int64_t)ring * ((int64_t)ring - 1); } longind = hp - offset; } else { offset = 2 * Nside64 * (Nside64 - 1); if (hp < 10 * ns2) { ring = (int)((hp - offset) / ((int64_t)Nside * 4) + (int64_t)Nside); offset += 4 * (ring - Nside64) * Nside64; longind = hp - offset; } else { offset += 8 * ns2; x = (2 * Nside64 + 1 - sqrt((2 * Nside64 + 1) * (2 * Nside64 + 1) - 2 * (hp - offset)))*0.5; ring = (int)x; offset += 2 * (int64_t)ring * (2 * Nside64 + 1 - (int64_t)ring); // The sqrt above can introduce precision issues that can cause ring to // be off by 1, so we check whether the offset is now larger than the HEALPix // value, and if so we need to adjust ring and offset accordingly if (offset > hp) { ring -= 1; offset -= 4 * Nside64 - 4 * (int64_t)ring; } longind = (int)(hp - offset); ring += 3 * Nside; } } if (p_ring) *p_ring = ring; if (p_longind) *p_longind = (int)longind; } Const int64_t healpixl_ring_to_xy(int64_t ring, int Nside) { int bighp, x, y; int ringind, longind; healpixl_decompose_ring(ring, Nside, &ringind, &longind); if (ring < 0 || Nside < 0) { return -1; } else if (ringind <= Nside) { int64_t ind; int v; int F1; int frow; bighp = longind / ringind; ind = (int64_t)longind - (int64_t)bighp * (int64_t)ringind; y = (Nside - 1) - (int)ind; frow = bighp / 4; F1 = frow + 2; v = F1*Nside - ringind - 1; x = v - y; return healpixl_compose_xy(bighp, x, y, Nside); } else if (ringind < (int64_t)3*Nside) { int panel; int ind; int bottomleft; int topleft; int frow, F1, 
F2, s, v, h; int bighp = -1; int x, y; int R = 0; panel = longind / Nside; ind = longind % Nside; bottomleft = ind < (ringind - Nside + 1) / 2; topleft = ind < ((int64_t)3*Nside - ringind + 1)/2; if (!bottomleft && topleft) { // top row. bighp = panel; } else if (bottomleft && !topleft) { // bottom row. bighp = 8 + panel; } else if (bottomleft && topleft) { // left side. bighp = 4 + panel; } else if (!bottomleft && !topleft) { // right side. bighp = 4 + (panel + 1) % 4; if (bighp == 4) { longind -= ((int64_t)4*Nside - 1); // Gah! Wacky hack - it seems that since // "longind" is negative in this case, the // rounding behaves differently, so we end up // computing the wrong "h" and have to correct // for it. R = 1; } } frow = bighp / 4; F1 = frow + 2; F2 = 2*(bighp % 4) - (frow % 2) + 1; s = (ringind - Nside) % 2; v = F1*Nside - ringind - 1; h = 2*longind - s - F2*Nside; if (R) h--; x = (v + h) / 2; y = (v - h) / 2; //fprintf(stderr, "bighp=%i, frow=%i, F1=%i, F2=%i, s=%i, v=%i, h=%i, x=%i, y=%i.\n", bighp, frow, F1, F2, s, v, h, x, y); if ((v != (x+y)) || (h != (x-y))) { h++; x = (v + h) / 2; y = (v - h) / 2; //fprintf(stderr, "tweak h=%i, x=%i, y=%i\n", h, x, y); if ((v != (x+y)) || (h != (x-y))) { //fprintf(stderr, "still not right.\n"); } } return healpixl_compose_xy(bighp, x, y, Nside); } else { int ind; int v; int F1; int frow; int ri; ri = 4*Nside - ringind; bighp = 8 + longind / ri; ind = longind - (bighp%4) * ri; y = (ri-1) - ind; frow = bighp / 4; F1 = frow + 2; v = F1*Nside - ringind - 1; x = v - y; return healpixl_compose_xy(bighp, x, y, Nside); } } Const int64_t healpixl_xy_to_ring(int64_t hp, int Nside) { int bighp,x,y; int frow; int F1; int v; int ring; int64_t index; healpixl_decompose_xy(hp, &bighp, &x, &y, Nside); frow = bighp / 4; F1 = frow + 2; v = x + y; // "ring" starts from 1 at the north pole and goes to 4Nside-1 at // the south pole; the pixels in each ring have the same latitude. 
ring = F1*Nside - v - 1; /* ring: [1, Nside] : n pole (Nside, 2Nside] : n equatorial (2Nside+1, 3Nside) : s equat [3Nside, 4Nside-1] : s pole */ // this probably can't happen... if ((ring < 1) || (ring >= (int64_t)4*Nside)) { //fprintf(stderr, "Invalid ring index: %i %i\n", ring, 4*Nside); return -1; } if (ring <= Nside) { // north polar. // left-to-right coordinate within this healpix index = (Nside - 1 - y); // offset from the other big healpixes index += ((bighp % 4) * ring); // offset from the other rings index += (int64_t)ring*(ring-1)*2; } else if (ring >= (int64_t)3*Nside) { // south polar. // Here I first flip everything so that we label the pixels // at zero starting in the southeast corner, increasing to the // west and north, then subtract that from the total number of // healpixels. int ri = (int64_t)4*Nside - ring; // index within this healpix index = (ri-1) - x; // big healpixes index += ((3-(bighp % 4)) * ri); // other rings index += (int64_t)ri*(ri-1)*2; // flip! index = 12*(int64_t)Nside*Nside - 1 - index; } else { // equatorial. 
int64_t s, F2, h; s = (ring - Nside) % 2; F2 = 2*((int)bighp % 4) - (frow % 2) + 1; h = x - y; index = (F2 * Nside + h + s) / 2; // offset from the north polar region: index += (int64_t)Nside * (Nside - 1) * 2; // offset within the equatorial region: index += (int64_t)Nside * 4 * (ring - Nside); // handle healpix #4 wrap-around if ((bighp == 4) && (y > x)) index += (4 * Nside - 1); //fprintf(stderr, "frow=%i, F1=%i, v=%i, ringind=%i, s=%i, F2=%i, h=%i, longind=%i.\n", frow, F1, v, ring, s, F2, h, (F2*(int)Nside+h+s)/2); } return index; } Const double healpix_side_length_arcmin(int Nside) { return sqrt((4.0 * M_PI * mysquare(180.0 * 60.0 / M_PI)) / (12.0 * Nside * Nside)); } double healpix_nside_for_side_length_arcmin(double arcmin) { return sqrt(4.0*M_PI / (mysquare(arcmin2rad(arcmin)) * 12.0)); } static Inline void swap(int* i1, int* i2) { int tmp; tmp = *i1; *i1 = *i2; *i2 = tmp; } static Inline void swap_double(double* i1, double* i2) { double tmp; tmp = *i1; *i1 = *i2; *i2 = tmp; } static Inline anbool ispolar(int healpix) { // the north polar healpixes are 0,1,2,3 // the south polar healpixes are 8,9,10,11 return (healpix <= 3) || (healpix >= 8); } static Inline anbool isequatorial(int healpix) { // the north polar healpixes are 0,1,2,3 // the south polar healpixes are 8,9,10,11 return (healpix >= 4) && (healpix <= 7); } static Inline anbool isnorthpolar(int healpix) { return (healpix <= 3); } static Inline anbool issouthpolar(int healpix) { return (healpix >= 8); } int64_t healpixl_compose_xy(int bighp, int x, int y, int Nside) { int64_t ns = Nside; assert(Nside > 0); assert(bighp >= 0); assert(bighp < 12); assert(x >= 0); assert(x < Nside); assert(y >= 0); assert(y < Nside); return ((((int64_t)bighp * ns) + x) * ns) + y; } void healpixl_convert_nside(int64_t hp, int nside, int outnside, int64_t* outhp) { int basehp, x, y; int ox, oy; healpixl_decompose_xy(hp, &basehp, &x, &y, nside); healpixl_convert_xy_nside(x, y, nside, outnside, &ox, &oy); *outhp = 
healpixl_compose_xy(basehp, ox, oy, outnside); } void healpixl_convert_xy_nside(int x, int y, int nside, int outnside, int* outx, int* outy) { double fx, fy; int ox, oy; assert(x >= 0); assert(x < nside); assert(y >= 0); assert(y < nside); // MAGIC 0.5: assume center of pixel... fx = (x + 0.5) / (double)nside; fy = (y + 0.5) / (double)nside; ox = floor(fx * outnside); oy = floor(fy * outnside); if (outx) *outx = ox; if (outy) *outy = oy; } void healpixl_decompose_xy(int64_t finehp, int* pbighp, int* px, int* py, int Nside) { int64_t hp; int64_t ns2 = (int64_t)Nside * (int64_t)Nside; assert(Nside > 0); assert(finehp < ((int64_t)12 * ns2)); assert(finehp >= 0); if (pbighp) { int bighp = (int)(finehp / ns2); assert(bighp >= 0); assert(bighp < 12); *pbighp = bighp; } hp = finehp % ns2; if (px) { *px = (int)(hp / Nside); assert(*px >= 0); assert(*px < Nside); } if (py) { *py = hp % Nside; assert(*py >= 0); assert(*py < Nside); } } /** Given a large-scale healpix number, computes its neighbour in the direction (dx,dy). Returns -1 if there is no such neighbour. 
*/ static int healpix_get_neighbour(int hp, int dx, int dy) { if (isnorthpolar(hp)) { if ((dx == 1) && (dy == 0)) return (hp + 1) % 4; if ((dx == 0) && (dy == 1)) return (hp + 3) % 4; if ((dx == 1) && (dy == 1)) return (hp + 2) % 4; if ((dx == -1) && (dy == 0)) return (hp + 4); if ((dx == 0) && (dy == -1)) return 4 + ((hp + 1) % 4); if ((dx == -1) && (dy == -1)) return hp + 8; return -1; } else if (issouthpolar(hp)) { if ((dx == 1) && (dy == 0)) return 4 + ((hp + 1) % 4); if ((dx == 0) && (dy == 1)) return hp - 4; if ((dx == -1) && (dy == 0)) return 8 + ((hp + 3) % 4); if ((dx == 0) && (dy == -1)) return 8 + ((hp + 1) % 4); if ((dx == -1) && (dy == -1)) return 8 + ((hp + 2) % 4); if ((dx == 1) && (dy == 1)) return hp - 8; return -1; } else { if ((dx == 1) && (dy == 0)) return hp - 4; if ((dx == 0) && (dy == 1)) return (hp + 3) % 4; if ((dx == -1) && (dy == 0)) return 8 + ((hp + 3) % 4); if ((dx == 0) && (dy == -1)) return hp + 4; if ((dx == 1) && (dy == -1)) return 4 + ((hp + 1) % 4); if ((dx == -1) && (dy == 1)) return 4 + ((hp - 1) % 4); return -1; } return -1; } static void get_neighbours(hp_t hp, hp_t* neighbour, int Nside) { int base; int x, y; int nbase; int nx, ny; base = hp.bighp; x = hp.x; y = hp.y; // ( + , 0 ) nx = (x + 1) % Nside; ny = y; if (x == (Nside - 1)) { nbase = healpix_get_neighbour(base, 1, 0); if (isnorthpolar(base)) { nx = x; swap(&nx, &ny); } } else nbase = base; neighbour[0].bighp = nbase; neighbour[0].x = nx; neighbour[0].y = ny; // ( + , + ) nx = (x + 1) % Nside; ny = (y + 1) % Nside; if ((x == Nside - 1) && (y == Nside - 1)) { if (ispolar(base)) nbase = healpix_get_neighbour(base, 1, 1); else nbase = -1; } else if (x == (Nside - 1)) nbase = healpix_get_neighbour(base, 1, 0); else if (y == (Nside - 1)) nbase = healpix_get_neighbour(base, 0, 1); else nbase = base; if (isnorthpolar(base)) { if (x == (Nside - 1)) nx = Nside - 1; if (y == (Nside - 1)) ny = Nside - 1; if ((x == (Nside - 1)) || (y == (Nside - 1))) swap(&nx, &ny); } 
//printf("(+ +): nbase=%i, nx=%i, ny=%i, pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[1].bighp = nbase; neighbour[1].x = nx; neighbour[1].y = ny; // ( 0 , + ) nx = x; ny = (y + 1) % Nside; if (y == (Nside - 1)) { nbase = healpix_get_neighbour(base, 0, 1); if (isnorthpolar(base)) { ny = y; swap(&nx, &ny); } } else nbase = base; //printf("(0 +): nbase=%i, nx=%i, ny=%i, pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[2].bighp = nbase; neighbour[2].x = nx; neighbour[2].y = ny; // ( - , + ) nx = (x + Nside - 1) % Nside; ny = (y + 1) % Nside; if ((x == 0) && (y == (Nside - 1))) { if (isequatorial(base)) nbase = healpix_get_neighbour(base, -1, 1); else nbase = -1; } else if (x == 0) { nbase = healpix_get_neighbour(base, -1, 0); if (issouthpolar(base)) { nx = 0; swap(&nx, &ny); } } else if (y == (Nside - 1)) { nbase = healpix_get_neighbour(base, 0, 1); if (isnorthpolar(base)) { ny = y; swap(&nx, &ny); } } else nbase = base; //printf("(- +): nbase=%i, nx=%i, ny=%i, pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[3].bighp = nbase; neighbour[3].x = nx; neighbour[3].y = ny; // ( - , 0 ) nx = (x + Nside - 1) % Nside; ny = y; if (x == 0) { nbase = healpix_get_neighbour(base, -1, 0); if (issouthpolar(base)) { nx = 0; swap(&nx, &ny); } } else nbase = base; //printf("(- 0): nbase=%i, nx=%i, ny=%i, pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[4].bighp = nbase; neighbour[4].x = nx; neighbour[4].y = ny; // ( - , - ) nx = (x + Nside - 1) % Nside; ny = (y + Nside - 1) % Nside; if ((x == 0) && (y == 0)) { if (ispolar(base)) nbase = healpix_get_neighbour(base, -1, -1); else nbase = -1; } else if (x == 0) nbase = healpix_get_neighbour(base, -1, 0); else if (y == 0) nbase = healpix_get_neighbour(base, 0, -1); else nbase = base; if (issouthpolar(base)) { if (x == 0) nx = 0; if (y == 0) ny = 0; if ((x == 0) || (y == 0)) swap(&nx, &ny); } //printf("(- -): nbase=%i, nx=%i, ny=%i, 
pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[5].bighp = nbase; neighbour[5].x = nx; neighbour[5].y = ny; // ( 0 , - ) ny = (y + Nside - 1) % Nside; nx = x; if (y == 0) { nbase = healpix_get_neighbour(base, 0, -1); if (issouthpolar(base)) { ny = y; swap(&nx, &ny); } } else nbase = base; //printf("(0 -): nbase=%i, nx=%i, ny=%i, pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[6].bighp = nbase; neighbour[6].x = nx; neighbour[6].y = ny; // ( + , - ) nx = (x + 1) % Nside; ny = (y + Nside - 1) % Nside; if ((x == (Nside - 1)) && (y == 0)) { if (isequatorial(base)) { nbase = healpix_get_neighbour(base, 1, -1); } else nbase = -1; } else if (x == (Nside - 1)) { nbase = healpix_get_neighbour(base, 1, 0); if (isnorthpolar(base)) { nx = x; swap(&nx, &ny); } } else if (y == 0) { nbase = healpix_get_neighbour(base, 0, -1); if (issouthpolar(base)) { ny = y; swap(&nx, &ny); } } else nbase = base; //printf("(+ -): nbase=%i, nx=%i, ny=%i, pix=%i\n", nbase, nx, ny, nbase*Ns2+xy_to_pnprime(nx,ny,Nside)); neighbour[7].bighp = nbase; neighbour[7].x = nx; neighbour[7].y = ny; } void healpixl_get_neighbours(int64_t pix, int64_t* neighbour, int Nside) { hp_t neigh[8]; hp_t hp; int i; intltohp(pix, &hp, Nside); get_neighbours(hp, neigh, Nside); for (i=0; i<8; i++) if (neigh[i].bighp < 0) { neighbour[i] = -1; } else { neighbour[i] = hptointl(neigh[i], Nside); } } static hp_t xyztohp(double vx, double vy, double vz, double coz, int Nside, double* p_dx, double* p_dy) { double phi; double twothirds = 2.0 / 3.0; double pi = M_PI; double twopi = 2.0 * M_PI; double halfpi = 0.5 * M_PI; double root3 = sqrt(3.0); double dx, dy; int basehp; int x, y; double sector; int offset; double phi_t; hp_t hp; // only used in asserts() VarUnused double EPS = 1e-8; assert(Nside > 0); /* Convert our point into cylindrical coordinates for middle ring */ phi = atan2(vy, vx); if (phi < 0.0) phi += twopi; phi_t = fmod(phi, halfpi); assert (phi_t >= 0.0); // North 
or south polar cap. if ((vz >= twothirds) || (vz <= -twothirds)) { anbool north; int column; double xx, yy, kx, ky; // Which pole? if (vz >= twothirds) { north = TRUE; } else { north = FALSE; vz *= -1.0; } // if not passed, compute coz if (coz == 0.0) coz = hypot(vx, vy); // solve eqn 20: k = Ns - xx (in the northern hemi) kx = (coz / sqrt(1.0 + vz)) * root3 * fabs(Nside * (2.0 * phi_t - pi) / pi); // solve eqn 19 for k = Ns - yy ky = (coz / sqrt(1.0 + vz)) * root3 * Nside * 2.0 * phi_t / pi; if (north) { xx = Nside - kx; yy = Nside - ky; } else { xx = ky; yy = kx; } // xx, yy should be in [0, Nside]. x = MIN(Nside-1, floor(xx)); assert(x >= 0); assert(x < Nside); y = MIN(Nside-1, floor(yy)); assert(y >= 0); assert(y < Nside); dx = xx - x; dy = yy - y; sector = (phi - phi_t) / (halfpi); offset = (int)round(sector); assert(fabs(sector - offset) < EPS); offset = ((offset % 4) + 4) % 4; assert(offset >= 0); assert(offset <= 3); column = offset; if (north) basehp = column; else basehp = 8 + column; } else { // could be polar or equatorial. double sector; int offset; double u1, u2; double zunits, phiunits; double xx, yy; // project into the unit square z=[-2/3, 2/3], phi=[0, pi/2] zunits = (vz + twothirds) / (4.0 / 3.0); phiunits = phi_t / halfpi; // convert into diagonal units // (add 1 to u2 so that they both cover the range [0,2]. u1 = zunits + phiunits; u2 = zunits - phiunits + 1.0; assert(u1 >= 0.); assert(u1 <= 2.); assert(u2 >= 0.); assert(u2 <= 2.); // x is the northeast direction, y is the northwest. xx = u1 * Nside; yy = u2 * Nside; // now compute which big healpix it's in. // (note that we subtract off the modded portion used to // compute the position within the healpix, so this should be // very close to one of the boundaries.) 
sector = (phi - phi_t) / (halfpi); offset = (int)round(sector); assert(fabs(sector - offset) < EPS); offset = ((offset % 4) + 4) % 4; assert(offset >= 0); assert(offset <= 3); // we're looking at a square in z,phi space with an X dividing it. // we want to know which section we're in. // xx ranges from 0 in the bottom-left to 2Nside in the top-right. // yy ranges from 0 in the bottom-right to 2Nside in the top-left. // (of the phi,z unit box) if (xx >= Nside) { xx -= Nside; if (yy >= Nside) { // north polar. yy -= Nside; basehp = offset; } else { // right equatorial. basehp = ((offset + 1) % 4) + 4; } } else { if (yy >= Nside) { // left equatorial. yy -= Nside; basehp = offset + 4; } else { // south polar. basehp = 8 + offset; } } assert(xx >= -EPS); assert(xx < (Nside+EPS)); x = MAX(0, MIN(Nside-1, floor(xx))); assert(x >= 0); assert(x < Nside); dx = xx - x; assert(yy >= -EPS); assert(yy < (Nside+EPS)); y = MAX(0, MIN(Nside-1, floor(yy))); assert(y >= 0); assert(y < Nside); dy = yy - y; } hp.bighp = basehp; hp.x = x; hp.y = y; if (p_dx) *p_dx = dx; if (p_dy) *p_dy = dy; return hp; } int64_t xyztohealpixl(double x, double y, double z, int Nside) { return xyztohealpixlf(x, y, z, Nside, NULL, NULL); } int64_t xyztohealpixlf(double x, double y, double z, int Nside, double* p_dx, double* p_dy) { hp_t hp = xyztohp(x,y,z, 0., Nside, p_dx,p_dy); return hptointl(hp, Nside); } int64_t radec_to_healpixl(double ra, double dec, int Nside) { return xyztohealpixl(radec2x(ra,dec), radec2y(ra,dec), radec2z(ra,dec), Nside); } int64_t radec_to_healpixlf(double ra, double dec, int Nside, double* dx, double* dy) { hp_t hp = xyztohp(radec2x(ra,dec), radec2y(ra,dec), radec2z(ra,dec), cos(dec), Nside, dx, dy); return hptointl(hp, Nside); } Const int64_t radecdegtohealpixl(double ra, double dec, int Nside) { return radec_to_healpixl(deg2rad(ra), deg2rad(dec), Nside); } int64_t radecdegtohealpixlf(double ra, double dec, int Nside, double* dx, double* dy) { return 
radec_to_healpixlf(deg2rad(ra), deg2rad(dec), Nside, dx, dy); } int64_t xyzarrtohealpixl(const double* xyz, int Nside) { return xyztohealpixl(xyz[0], xyz[1], xyz[2], Nside); } int64_t xyzarrtohealpixlf(const double* xyz, int Nside, double* dx, double* dy) { return xyztohealpixlf(xyz[0], xyz[1], xyz[2], Nside, dx, dy); } static void hp_to_xyz(hp_t* hp, int Nside, double dx, double dy, double* rx, double *ry, double *rz) { int chp; anbool equatorial = TRUE; double zfactor = 1.0; int xp, yp; double x, y, z; double pi = M_PI, phi; double rad; hp_decompose(hp, &chp, &xp, &yp); // this is x,y position in the healpix reference frame x = xp + dx; y = yp + dy; if (isnorthpolar(chp)) { if ((x + y) > Nside) { equatorial = FALSE; zfactor = 1.0; } } if (issouthpolar(chp)) { if ((x + y) < Nside) { equatorial = FALSE; zfactor = -1.0; } } if (equatorial) { double zoff=0; double phioff=0; x /= (double)Nside; y /= (double)Nside; if (chp <= 3) { // north phioff = 1.0; } else if (chp <= 7) { // equator zoff = -1.0; chp -= 4; } else if (chp <= 11) { // south phioff = 1.0; zoff = -2.0; chp -= 8; } else { // should never get here assert(0); } z = 2.0/3.0*(x + y + zoff); phi = pi/4*(x - y + phioff + 2*chp); rad = sqrt(1.0 - z*z); } else { /* Rearrange eqns (19) and (20) to find phi_t in terms of x,y. 
       y = Ns - k in eq(19)
       x = Ns - k in eq(20)

       (Ns - y)^2 / (Ns - x)^2 = (2 phi_t)^2 / (2 phi_t - pi)^2

       Recall that y <= Ns, x <= Ns and 0 <= phi_t < pi/2.
    */
    /* ... (lost in extraction: the remainder of hp_to_xyz, the
       healpixl_to_xyz / healpixl_to_xyzarr / healpixl_to_radec wrappers,
       and the declarations opening healpix_get_neighbours_within_range) ... */
    assert(Nside > 0);
    if (Nside <= 0) {
        printf("healpix_get_neighbours_within_range: Nside must be > 0.\n");
        return -1;
    }
    hp = xyzarrtohealpixlf(xyz, Nside, &fx, &fy);
    healpixes[nhp] = hp;
    nhp++;
    {
        struct neighbour_dirn dirs[] = {
            // edges
            { fx, 0,  0, -1 },
            { fx, 1,  0,  1 },
            { 0,  fy, -1, 0 },
            { 1,  fy,  1, 0 },
            // bottom corner
            { 0, 0, -1,  1 },
            { 0, 0, -1,  0 },
            { 0, 0, -1, -1 },
            { 0, 0,  0, -1 },
            { 0, 0,  1, -1 },
            // right corner
            { 1, 0,  1,  1 },
            { 1, 0,  1,  0 },
            { 1, 0,  1, -1 },
            { 1, 0,  0, -1 },
            { 1, 0, -1, -1 },
            // left corner
            { 0, 1,  1,  1 },
            { 0, 1,  0,  1 },
            { 0, 1, -1,  1 },
            { 0, 1, -1,  0 },
            { 0, 1, -1, -1 },
            // top corner
            { 1, 1, -1,  1 },
            { 1, 1,  0,  1 },
            { 1, 1,  1,  1 },
            { 1, 1,  1,  0 },
            { 1, 1,  1, -1 },
        };
        int ndirs = sizeof(dirs) / sizeof(struct neighbour_dirn);
        double ptx, pty, ptdx, ptdy;
        int64_t pthp;
        for (i=0; i<ndirs; i++) {
            struct neighbour_dirn* dir = dirs + i;
            ptx = dir->x;
            pty = dir->y;
            ptdx = dir->dx;
            ptdy = dir->dy;
            // pt = point on the edge nearest to the query point.
            // FIXME -- check that this is true, esp in the polar regions!
            healpixl_to_xyzarr(hp, Nside, ptx, pty, pt);
            d2 = distsq(pt, xyz, 3);
            // delta vector should be outside the healpix
            assert((ptx+step*ptdx < 0) || (ptx+step*ptdx > 1) ||
                   (pty+step*ptdy < 0) || (pty+step*ptdy > 1));
            if (d2 > range*range)
                continue;
            // compute dx and dy directions that are toward the interior of
            // the healpix.
            stepdirx = (ptx < step) ? 1 : -1;
            stepdiry = (pty < step) ? 1 : -1;
            // take steps in those directions.
            healpixl_to_xyzarr(hp, Nside, ptx + stepdirx * step, pty, ptstepx);
            healpixl_to_xyzarr(hp, Nside, ptx, pty + stepdiry * step, ptstepy);
            // convert the steps into dx,dy vectors.
            for (j=0; j<3; j++) {
                ptstepx[j] = stepdirx * (ptstepx[j] - pt[j]);
                ptstepy[j] = stepdiry * (ptstepy[j] - pt[j]);
            }
            // take a small step in the specified direction.
            for (j=0; j<3; j++)
                across[j] = pt[j] + ptdx * ptstepx[j] + ptdy * ptstepy[j];
            // see which healpix is at the end of the step.
            normalize_3(across);
            pthp = xyzarrtohealpixl(across, Nside);
            healpixes[nhp] = pthp;
            nhp++;
        }
    }

    // Remove duplicates...
    /* ... (lost in extraction: the de-duplication loop and return of nhp,
       plus the opening of healpix_distance_to_xyz -- its corner-distance
       setup and the start of its bisection loop) ... */
        if ((dist2mid >= dist2A) && (dist2mid >= dist2B))
            break;
        if (dist2A < dist2B) {
            dist2B = dist2mid;
            dxB = dxmid;
            dyB = dymid;
        } else {
            dist2A = dist2mid;
            dxA = dxmid;
            dyA = dymid;
        }
    }

    // Check whether endpoint A is actually closer.
    dist2A = cdists[corder[0]];
    if (dist2A < dist2mid) {
        dxA = cdx[corder[0]];
        dyA = cdy[corder[0]];
        healpixl_to_xyzarr(hp, Nside, dxA, dyA, midxyz);
        dist2mid = dist2A;
    }

    if (closestxyz)
        memcpy(closestxyz, midxyz, 3*sizeof(double));
    return distsq2deg(dist2mid);
}

double healpix_distance_to_radec(int64_t hp, int Nside, double ra, double dec,
                                 double* closestradec) {
    double xyz[3];
    double closestxyz[3];
    double dist;
    radecdeg2xyzarr(ra, dec, xyz);
    dist = healpix_distance_to_xyz(hp, Nside, xyz, closestxyz);
    if (closestradec)
        xyzarr2radecdegarr(closestxyz, closestradec);
    return dist;
}

int healpix_within_range_of_radec(int64_t hp, int Nside, double ra, double dec,
                                  double radius) {
    // This is the dumb trivial implementation...
    return (healpix_distance_to_radec(hp, Nside, ra, dec, NULL) <= radius);
}

int healpix_within_range_of_xyz(int64_t hp, int Nside, const double* xyz,
                                double radius) {
    return (healpix_distance_to_xyz(hp, Nside, xyz, NULL) <= radius);
}

int healpixl_within_range_of_xyz(int64_t hp, int Nside, const double* xyz,
                                 double radius) {
    return (healpix_distance_to_xyz(hp, Nside, xyz, NULL) <= radius);
}

void healpix_radec_bounds(int64_t hp, int nside,
                          double* p_ralo, double* p_rahi,
                          double* p_declo, double* p_dechi) {
    // corners!
    double ralo,rahi,declo,dechi;
    double ra,dec;
    double dx, dy;
    ralo = declo =  HUGE_VAL;
    rahi = dechi = -HUGE_VAL;
    for (dy=0; dy<2; dy+=1.0) {
        for (dx=0; dx<2; dx+=1.0) {
            healpixl_to_radecdeg(hp, nside, dx, dy, &ra, &dec);
            // FIXME -- wrap-around.
            ralo = MIN(ra, ralo);
            rahi = MAX(ra, rahi);
            declo = MIN(dec, declo);
            dechi = MAX(dec, dechi);
        }
    }
    if (p_ralo) *p_ralo = ralo;
    if (p_rahi) *p_rahi = rahi;
    if (p_declo) *p_declo = declo;
    if (p_dechi) *p_dechi = dechi;
}

/* === astropy-healpix-0.5/cextern/astrometry.net/healpix.h === */

/*
 # This file is part of the Astrometry.net suite.
 # Licensed under a 3-clause BSD style license - see LICENSE
 */

#ifndef HEALPIX_H
#define HEALPIX_H

#include <sys/types.h>

#ifdef _MSC_VER
#if _MSC_VER >= 1600
#include <stdint.h>
#else
#include "stdint.h"  /* the original header names were lost in extraction */
#endif
#else
#include <stdint.h>
#endif

#include "keywords.h"

//#undef Const
//#define Const

/**
 The HEALPix paper is here:
 http://iopscience.iop.org/0004-637X/622/2/759/pdf/0004-637X_622_2_759.pdf
 See:
 http://adsabs.harvard.edu/cgi-bin/nph-bib_query?bibcode=2005ApJ...622..759G&db_key=AST&high=41069202cf02947
 */

/**
 In this documentation we talk about "base healpixes": these are the big,
 top-level healpixes.  There are 12 of these, with indices [0, 11].

 We say "fine healpixes" or "healpixes" or "pixels" when we mean the
 fine-scale healpixes; there are Nside^2 of these in each base healpix,
 for a total of 12*Nside^2, indexed from zero.
 */

/**
 Some notes about the different indexing schemes:

 The healpix paper discusses two different ways to number healpixes, and
 there is a third way, which we prefer, which is (in my opinion) more
 sensible and easy.

 -RING indexing.  Healpixes are numbered first in order of decreasing DEC,
  then in order of increasing RA of the center of the pixel, ie:

 .       0   1   2   3
 .     4   5   6   7   8   9  10  11
 .    12  13  14  15  16  17  18  19
 .    20  21  22  23  24  25  26  27
 .    28  29  30  31  32  33  34  35
 .      36  37  38  39  40  41  42  43
 .        44  45  46  47

 Note that 12, 20 and 28 are part of base healpix 4, as is 27; it "wraps
 around".
 The RING index can be decomposed into the "ring number" and the index
 within the ring (called "longitude index").  Note that different rings
 contain different numbers of healpixes.  Also note that the ring number
 starts from 1, but the longitude index starts from zero.

 -NESTED indexing.  This only works for Nside parameters that are powers
  of two.  This scheme is hierarchical in the sense that each pair of bits
  of the index tells you where the pixel center is to finer and finer
  resolution.  It doesn't really show with Nside=2, but here it is anyway:

 .       3   7  11  15
 .     2   1   6   5  10   9  14  13
 .    19   0  23   4  27   8  31  12
 .    17  22  21  26  25  30  29  18
 .    16  35  20  39  24  43  28  47
 .      34  33  38  37  42  41  46  45
 .        32  36  40  44

 Note that all the base healpixes have the same pattern; they're just
 offset by factors of Nside^2.

 Here's a zoom-in of the first base healpix, turned 45 degrees to the
 right, for Nside=4:

 .   10  11  14  15
 .    8   9  12  13
 .    2   3   6   7
 .    0   1   4   5

 Note that the bottom-left block of 4 has the smallest values, and within
 that the bottom-left corner has the smallest value, followed by the
 bottom-right, top-left, then top-right.

 The NESTED index can't be decomposed into 'orthogonal' directions.

 -XY indexing.  This is arguably the most natural, at least for the
  internal usage of the healpix code.  Within each base healpix, the
  healpixes are numbered starting with 0 for the southmost pixel, then
  increasing first in the "y" (north-west), then in the "x" (north-east)
  direction.  In other words, within each base healpix there is a grid
  and we number the pixels "lexicographically" (mod a 135 degree turn).

 .       3   7  11  15
 .     1   2   5   6   9  10  13  14
 .    19   0  23   4  27   8  31  12
 .    18  21  22  25  26  29  30  17
 .    16  35  20  39  24  43  28  47
 .      33  34  37  38  41  42  45  46
 .        32  36  40  44

 Zooming in on the first base healpix, turning 45 degrees to the right,
 for Nside=4 we get:

 .    3   7  11  15
 .    2   6  10  14
 .    1   5   9  13
 .    0   4   8  12

 Notice that the numbers first increase from bottom to top (y), then left
 to right (x).
 The XY indexing can be decomposed into 'x' and 'y' coordinates
 (in case that wasn't obvious), where the above figure becomes (x,y):

 .    (0,3)  (1,3)  (2,3)  (3,3)
 .    (0,2)  (1,2)  (2,2)  (3,2)
 .    (0,1)  (1,1)  (2,1)  (3,1)
 .    (0,0)  (1,0)  (2,0)  (3,0)

 Note that "x" increases in the north-east direction, and "y" increases in
 the north-west direction.

 The major advantage to this indexing scheme is that it extends to
 fractional coordinates in a natural way: it is meaningful to talk about
 the point (x,y) = (0.25, 0.6) and you can compute its position.

 In this code, all healpix indexing uses the XY scheme.  If you want to
 use the other schemes you will have to use the conversion routines:
 .   healpixl_xy_to_ring
 .   healpixl_ring_to_xy
 .   healpixl_xy_to_nested
 .   healpixl_nested_to_xy
 */

// The maximum healpix Nside that leads to int-sized healpix indices.
// 12 * (13377+1)^2 > 2^31 (since we use signed ints)
// This corresponds to about 16 arcsec side length.
#define HP_MAX_INT_NSIDE 13377

/**
 Converts a healpix index from the XY scheme to the RING scheme.
*/
Const int64_t healpixl_xy_to_ring(int64_t hp, int Nside);

/**
 Converts a healpix index from the RING scheme to the XY scheme.
*/
Const int64_t healpixl_ring_to_xy(int64_t ring_index, int Nside);

/**
 Converts a healpix index from the XY scheme to the NESTED scheme.
*/
Const int64_t healpixl_xy_to_nested(int64_t hp, int Nside);

/**
 Converts a healpix index from the NESTED scheme to the XY scheme.
*/
Const int64_t healpixl_nested_to_xy(int64_t nested_index, int Nside);

/**
 Decomposes a RING index into the "ring number" (each ring contains
 healpixels of equal latitude) and "longitude index".  Pixels within a
 ring have longitude index starting at zero for the first pixel with
 RA >= 0.  Different rings contain different numbers of healpixels.
*/
void healpixl_decompose_ring(int64_t ring_index, int Nside,
                             int* p_ring_number, int* p_longitude_index);

/**
 Composes a RING index given the "ring number" and "longitude index".
 Does NOT check that the values are legal!  Garbage in, garbage out.
*/
Const int64_t healpixl_compose_ring(int ring, int longind, int Nside);

// Const int64_t healpixl_compose_ringl(int64_t ring, int64_t longind, int64_t Nside);

/**
 Decomposes an XY index into the "base healpix" and "x" and "y" coordinates
 within that healpix.
*/
void healpix_decompose_xy(int finehp, int* bighp, int* x, int* y, int Nside);
void healpixl_decompose_xy(int64_t finehp, int* bighp, int* x, int* y, int Nside);

/**
 Composes an XY index given the "base healpix" and "x" and "y" coordinates
 within that healpix.
*/
Const int64_t healpixl_compose_xy(int bighp, int x, int y, int Nside);

/**
 Given (x,y) coordinates of resolution "nside" within a base-level
 healpixel, and an output resolution "outnside", returns the output
 (x,y) coordinates at the output resolution.
*/
void healpixl_convert_xy_nside(int x, int y, int nside, int outnside,
                               int* outx, int* outy);

/**
 Given a healpix index (in the XY scheme) of resolution "nside", and an
 output resolution "outnside", returns the healpix index at the output
 resolution.
*/
void healpixl_convert_nside(int64_t hp, int nside, int outnside, int64_t* outhp);

/**
 Converts (RA, DEC) coordinates (in radians) to healpix index.
*/
Const int64_t radec_to_healpixl(double ra, double dec, int Nside);
int64_t radec_to_healpixlf(double ra, double dec, int Nside,
                           double* dx, double* dy);

/**
 Converts (RA, DEC) coordinates (in degrees) to healpix index.
*/
Const int radecdegtohealpix(double ra, double dec, int Nside);
int radecdegtohealpixf(double ra, double dec, int Nside, double* dx, double* dy);

Const int64_t radecdegtohealpixl(double ra, double dec, int Nside);
int64_t radecdegtohealpixlf(double ra, double dec, int Nside,
                            double* dx, double* dy);

/**
 Converts (x,y,z) coordinates on the unit sphere into a healpix index.
*/ Const int64_t xyztohealpixl(double x, double y, double z, int Nside); int64_t xyztohealpixlf(double x, double y, double z, int Nside, double* p_dx, double* p_dy); /** Converts (x,y,z) coordinates (stored in an array) on the unit sphere into a healpix index. */ int xyzarrtohealpix(const double* xyz, int Nside); int64_t xyzarrtohealpixl(const double* xyz, int Nside); int xyzarrtohealpixf(const double* xyz,int Nside, double* p_dx, double* p_dy); /** Converts a healpix index, plus fractional offsets (dx,dy), into (x,y,z) coordinates on the unit sphere. (dx,dy) must be in [0, 1]. (0.5, 0.5) is the center of the healpix. (0,0) is the southernmost corner, (1,1) is the northernmost corner, (1,0) is the easternmost, and (0,1) the westernmost. */ void healpixl_to_xyz(int64_t hp, int Nside, double dx, double dy, double* p_x, double *p_y, double *p_z); /** Same as healpix_to_xyz, but (x,y,z) are stored in an array. */ void healpixl_to_xyzarr(int64_t hp, int Nside, double dx, double dy, double* xyz); /** Same as healpix_to_xyz, but returns (RA,DEC) in radians. */ void healpixl_to_radec(int64_t hp, int Nside, double dx, double dy, double* ra, double* dec); void healpixl_to_radecdeg(int64_t hp, int Nside, double dx, double dy, double* ra, double* dec); /** Same as healpix_to_radec, but (RA,DEC) are stored in an array. */ void healpix_to_radecarr(int64_t hp, int Nside, double dx, double dy, double* radec); void healpix_to_radecdegarr(int64_t hp, int Nside, double dx, double dy, double* radec); /** Computes the approximate side length of a healpix, in arcminutes. */ Const double healpix_side_length_arcmin(int Nside); /** Computes the approximate Nside you need to get healpixes with side length about "arcmin" arcminutes. (inverse of healpix_side_length_arcmin) */ double healpix_nside_for_side_length_arcmin(double arcmin); /** Finds the healpixes neighbouring the given healpix, placing them in the array "neighbour". Returns the number of neighbours. 
You must ensure that "neighbour" has 8 elements.

 Healpixes in the interior of a large healpix will have eight
 neighbours; pixels near the edges can have fewer.
 */
void healpixl_get_neighbours(int64_t pix, int64_t* neighbour, int Nside);

/**
 Finds the healpixes containing and neighbouring the given xyz position
 which are within distance 'range' (in units of distance of the unit
 sphere).  Places the results in 'healpixes', which must have at least
 9 elements.  Returns the number of 'healpixes' set.

 Returns -1 if "Nside" < 0.
 */
int healpix_get_neighbours_within_range(double* xyz, double range,
                                        int64_t* healpixes, int Nside);

/** Same as above, but RA,Dec,radius in degrees. */
int healpix_get_neighbours_within_range_radec(double ra, double dec,
                                              double radius,
                                              int64_t* healpixes, int Nside);

/** Returns the minimum distance (in degrees) between the given healpix
 and the given RA,Dec (in degrees). */
double healpix_distance_to_radec(int64_t hp, int Nside, double ra, double dec,
                                 double* closestradec);

/** Returns the minimum distance (in degrees) between the given healpix
 and the given xyz (point on unit sphere). */
double healpix_distance_to_xyz(int64_t hp, int Nside, const double* xyz,
                               double* closestxyz);

/** Returns true if the closest distance between the given healpix and
 the given RA,Dec (in degrees) is less than the given radius (in degrees). */
int healpix_within_range_of_radec(int64_t hp, int Nside, double ra, double dec,
                                  double radius);

int healpix_within_range_of_xyz(int64_t hp, int Nside, const double* xyz,
                                double radius);

int healpixl_within_range_of_xyz(int64_t hp, int Nside, const double* xyz,
                                 double radius);

/** Computes the RA,Dec bounding-box of the given healpix.  Results are
 in degrees.  RA may be wacky for healpixes spanning RA=0.
*/ void healpix_radec_bounds(int64_t hp, int nside, double* ralo, double* rahi, double* declo, double* dechi); #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2669158 astropy-healpix-0.5/cextern/astrometry.net/keywords.h0000644000077000000240000000551100000000000023116 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ // borrowed from http://rlove.org/log/2005102601. #ifndef ASTROMETRY_KEYWORDS_H #define ASTROMETRY_KEYWORDS_H #ifdef __GNUC__ #define ATTRIB_FORMAT(style,fmt,start) __attribute__ ((format(style,fmt,start))) #endif // this snippet borrowed from GNU libc features.h: #if defined __GNUC__ # define GNUC_PREREQ(maj, min) \ ((__GNUC__ << 16) + __GNUC_MINOR__ >= ((maj) << 16) + (min)) #else # define GNUC_PREREQ(maj, min) 0 #endif #if GNUC_PREREQ (3, 0) // Clang masquerades as gcc but isn't compatible. Someone should file a // lawsuit. Clang treats inlining differently; see // http://clang.llvm.org/compatibility.html#inline #if defined __clang__ || GNUC_PREREQ (5, 0) // After gcc 5.0, -std=gnu11 is the default (vs -std=gnu89 in previous // versions). This affects inlining semantics, among other things. 
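As an aside on the `GNUC_PREREQ` macro defined earlier in this header: it packs major and minor version numbers into one integer so a single comparison does the "at least version X.Y" test. A standalone illustration (the `PREREQ_DEMO` macro below is a hypothetical stand-in that takes the compiler version as arguments instead of reading `__GNUC__`):

```c
#include <assert.h>

/* Hypothetical demo macro mirroring GNUC_PREREQ's packed-integer
   comparison, parameterized on the "current" compiler version. */
#define PREREQ_DEMO(cur_maj, cur_min, maj, min) \
    ((((cur_maj) << 16) + (cur_min)) >= (((maj) << 16) + (min)))
```

For example, a pretend gcc 5.2 satisfies "at least 3.4" but not "at least 6.0".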
#define InlineDeclare #define InlineDefineH #define InlineDefineC #else // plain old gcc #define INCLUDE_INLINE_SOURCE 1 #define InlineDeclare extern inline #define InlineDefineH extern inline #define InlineDefineC #endif // See: // http://gcc.gnu.org/onlinedocs/gcc/Function-Attributes.html # define Inline inline # define Pure __attribute__ ((pure)) # define Const __attribute__ ((const)) # define Noreturn __attribute__ ((noreturn)) # define Malloc __attribute__ ((malloc)) # define Used __attribute__ ((used)) # define Unused __attribute__ ((unused)) # define VarUnused __attribute__ ((unused)) # define Packed __attribute__ ((packed)) # define likely(x) __builtin_expect (!!(x), 1) # define unlikely(x) __builtin_expect (!!(x), 0) # define Noinline __attribute__ ((noinline)) // alloc_size // new in gcc-3.1: #if GNUC_PREREQ (3, 1) # define Deprecated __attribute__ ((deprecated)) #else # define Deprecated #endif // new in gcc-3.4: #if GNUC_PREREQ (3, 4) # define Must_check __attribute__ ((warn_unused_result)) # define WarnUnusedResult __attribute__ ((warn_unused_result)) #else # define Must_check # define WarnUnusedResult #endif // new in gcc-4.1: #if GNUC_PREREQ (4, 1) #if defined __clang__ // clang complains very loudly about this being ignored... 
# define Flatten #else # define Flatten __attribute__ (( flatten)) #endif #else # define Flatten #endif #else // not gnuc >= 3.0 # define InlineDeclare # define InlineDefineH # define InlineDefineC # define Inline # define Pure # define Const # define Noreturn # define Malloc # define Must_check # define Deprecated # define Used # define Unused # define VarUnused # define Packed # define likely(x) (x) # define unlikely(x) (x) # define Noinline # define WarnUnusedResult # define Flatten #endif #endif // ASTROMETRY_KEYWORDS_H ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8937328 astropy-healpix-0.5/cextern/astrometry.net/mathutil.c0000644000077000000240000002746500000000000023105 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #include #include #include #include #include #include "mathutil.h" #include "keywords.h" #define InlineDefine InlineDefineC #include "mathutil.inc" #undef InlineDefine #include "bl.h" /** Returns 1 if the given point is inside the given polygon (listed as x0,y0, x1,y1, etc). */ int point_in_polygon(double x, double y, const dl* polygon) { size_t i; size_t N = dl_size(polygon) / 2; int inside = 0; for (i=0; i= 2"); return -1; } if (edgehandling == 0) { // truncate. 
outw = W / S;
        outh = H / S;
    } else if (edgehandling == 1) {
        // average
        outw = (W + S-1) / S;
        outh = (H + S-1) / S;
    } else {
        printf("Unknown edge handling code %i", edgehandling);
        return -1;
    }
    if (newW) *newW = outw;
    if (newH) *newH = outh;
    return 0;
}

float* average_image_f(const float* image, int W, int H, int S,
                       int edgehandling, int* newW, int* newH,
                       float* output) {
    return average_weighted_image_f(image, NULL, W, H, S, edgehandling,
                                    newW, newH, output, 0.0);
}

float* average_weighted_image_f(const float* image, const float* weight,
                                int W, int H, int S,
                                int edgehandling, int* newW, int* newH,
                                float* output, float nilval) {
    int outw, outh;
    int i,j;
    if (get_output_image_size(W, H, S, edgehandling, &outw, &outh))
        return NULL;
    if (output == NULL) {
        output = malloc(outw * outh * sizeof(float));
        if (!output) {
            printf("Failed to allocate %i x %i floats", outw, outh);
            return NULL;
        }
    }
    for (j=0; j<outh; j++) {
        for (i=0; i<outw; i++) {
            double sum = 0.0;
            double wsum = 0.0;
            int I, J, ii;
            for (J=0; J<S; J++) {
                if (j*S + J >= H)
                    break;
                for (I=0; I<S; I++) {
                    if (i*S + I >= W)
                        break;
                    ii = (j*S + J)*W + (i*S + I);
                    if (weight) {
                        wsum += weight[ii];
                        sum += image[ii] * weight[ii];
                    } else {
                        wsum += 1.0;
                        sum += image[ii];
                    }
                }
            }
            if (wsum == 0.0)
                output[j * outw + i] = nilval;
            else
                output[j * outw + i] = sum / wsum;
        }
    }
    if (newW) *newW = outw;
    if (newH) *newH = outh;
    return output;
}

// "borrowed" from linux-2.4
static unsigned int my_hweight32(unsigned int w) {
    unsigned int res = (w & 0x55555555) + ((w >> 1) & 0x55555555);
    res = (res & 0x33333333) + ((res >> 2) & 0x33333333);
    res = (res & 0x0F0F0F0F) + ((res >> 4) & 0x0F0F0F0F);
    res = (res & 0x00FF00FF) + ((res >> 8) & 0x00FF00FF);
    return (res & 0x0000FFFF) + ((res >> 16) & 0x0000FFFF);
}

int invert_2by2_arr(const double* A, double* Ainv) {
    double det;
    double inv_det;
    det = A[0] * A[3] - A[1] * A[2];
    if (det == 0.0)
        return -1;
    inv_det = 1.0 / det;
    Ainv[0] =  A[3] * inv_det;
    Ainv[1] = -A[1] * inv_det;
    Ainv[2] = -A[2] * inv_det;
    Ainv[3] =  A[0] * inv_det;
    return 0;
}

int invert_2by2(const double A[2][2], double Ainv[2][2]) {
    double det;
    double inv_det;
    det = A[0][0] * A[1][1] -
A[0][1] * A[1][0]; if (det == 0.0) return -1; inv_det = 1.0 / det; Ainv[0][0] = A[1][1] * inv_det; Ainv[0][1] = -A[0][1] * inv_det; Ainv[1][0] = -A[1][0] * inv_det; Ainv[1][1] = A[0][0] * inv_det; return 0; } void tan_vectors(const double* pt, double* vec1, double* vec2) { double etax, etay, etaz, xix, xiy, xiz, eta_norm; double inv_en; // eta is a vector perpendicular to pt etax = -pt[1]; etay = pt[0]; etaz = 0.0; eta_norm = hypot(etax, etay); //sqrt(etax * etax + etay * etay); inv_en = 1.0 / eta_norm; etax *= inv_en; etay *= inv_en; vec1[0] = etax; vec1[1] = etay; vec1[2] = etaz; // xi = pt cross eta xix = -pt[2] * etay; xiy = pt[2] * etax; xiz = pt[0] * etay - pt[1] * etax; // xi has unit length since pt and eta have unit length. vec2[0] = xix; vec2[1] = xiy; vec2[2] = xiz; } int is_power_of_two(unsigned int x) { return (my_hweight32(x) == 1); } double vector_length_3(double* v) { return sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]); } double vector_length_squared_3(double* v) { return v[0]*v[0] + v[1]*v[1] + v[2]*v[2]; } double dot_product_3(double* va, double* vb) { return va[0]*vb[0] + va[1]*vb[1] + va[2]*vb[2]; } void matrix_matrix_3(double* ma, double* mb, double* result) { result[0] = ma[0]*mb[0] + ma[1]*mb[3] + ma[2]*mb[6]; result[3] = ma[3]*mb[0] + ma[4]*mb[3] + ma[5]*mb[6]; result[6] = ma[6]*mb[0] + ma[7]*mb[3] + ma[8]*mb[6]; result[1] = ma[0]*mb[1] + ma[1]*mb[4] + ma[2]*mb[7]; result[4] = ma[3]*mb[1] + ma[4]*mb[4] + ma[5]*mb[7]; result[7] = ma[6]*mb[1] + ma[7]*mb[4] + ma[8]*mb[7]; result[2] = ma[0]*mb[2] + ma[1]*mb[5] + ma[2]*mb[8]; result[5] = ma[3]*mb[2] + ma[4]*mb[5] + ma[5]*mb[8]; result[8] = ma[6]*mb[2] + ma[7]*mb[5] + ma[8]*mb[8]; } void matrix_vector_3(double* m, double* v, double* result) { result[0] = m[0]*v[0] + m[1]*v[1] + m[2]*v[2]; result[1] = m[3]*v[0] + m[4]*v[1] + m[5]*v[2]; result[2] = m[6]*v[0] + m[7]*v[1] + m[8]*v[2]; } #define GAUSSIAN_SAMPLE_INVALID -1e300 double gaussian_sample(double mean, double stddev) { // from 
http://www.taygeta.com/random/gaussian.html // Algorithm by Dr. Everett (Skip) Carter, Jr. static double y2 = GAUSSIAN_SAMPLE_INVALID; double x1, x2, w, y1; // this algorithm generates random samples in pairs; the INVALID // jibba-jabba stores the second value until the next time the // function is called. if (y2 != GAUSSIAN_SAMPLE_INVALID) { y1 = y2; y2 = GAUSSIAN_SAMPLE_INVALID; return mean + y1 * stddev; } do { x1 = uniform_sample(-1, 1); x2 = uniform_sample(-1, 1); w = x1 * x1 + x2 * x2; } while ( w >= 1.0 ); w = sqrt( (-2.0 * log(w)) / w ); y1 = x1 * w; y2 = x2 * w; return mean + y1 * stddev; } double uniform_sample(double low, double high) { if (low == high) return low; return low + (high - low)*((double)rand() / (double)RAND_MAX); } /* computes IN PLACE the inverse of a 3x3 matrix stored as a 9-vector with the first ROW of the matrix in positions 0-2, the second ROW in positions 3-5, and the last ROW in positions 6-8. */ double inverse_3by3(double *matrix) { double det; double a11, a12, a13, a21, a22, a23, a31, a32, a33; double b11, b12, b13, b21, b22, b23, b31, b32, b33; a11 = matrix[0]; a12 = matrix[1]; a13 = matrix[2]; a21 = matrix[3]; a22 = matrix[4]; a23 = matrix[5]; a31 = matrix[6]; a32 = matrix[7]; a33 = matrix[8]; det = a11 * ( a22 * a33 - a23 * a32 ) + a12 * ( a23 * a31 - a21 * a33 ) + a13 * ( a21 * a32 - a22 * a31 ); if (det == 0.0) { return det; } b11 = + ( a22 * a33 - a23 * a32 ) / det; b12 = - ( a12 * a33 - a13 * a32 ) / det; b13 = + ( a12 * a23 - a13 * a22 ) / det; b21 = - ( a21 * a33 - a23 * a31 ) / det; b22 = + ( a11 * a33 - a13 * a31 ) / det; b23 = - ( a11 * a23 - a13 * a21 ) / det; b31 = + ( a21 * a32 - a22 * a31 ) / det; b32 = - ( a11 * a32 - a12 * a31 ) / det; b33 = + ( a11 * a22 - a12 * a21 ) / det; matrix[0] = b11; matrix[1] = b12; matrix[2] = b13; matrix[3] = b21; matrix[4] = b22; matrix[5] = b23; matrix[6] = b31; matrix[7] = b32; matrix[8] = b33; //fprintf(stderr,"matrix determinant = %g\n",det); return (det); } void 
image_to_xyz(double uu, double vv, double* s, double* transform) { double x, y, z; double length; assert(s); assert(transform); x = uu * transform[0] + vv * transform[1] + transform[2]; y = uu * transform[3] + vv * transform[4] + transform[5]; z = uu * transform[6] + vv * transform[7] + transform[8]; length = sqrt(x*x + y*y + z*z); x /= length; y /= length; z /= length; s[0] = x; s[1] = y; s[2] = z; } /* star = trans * (field; 1) star is 3 x N field is 2 x N and is supplemented by a row of ones. trans is 3 x 3 "star" and "field" are stored in lexicographical order, so the element at (r, c) is stored at array[r + c*H], where H is the "height" of the matrix: 3 for "star", 2 for "field". "trans" is stored in the transposed order (to be compatible with image_to_xyz and the previous version of this function). S = T F S F' = T F F' S F' (F F')^-1 = T */ void fit_transform(double* star, double* field, int N, double* trans) { int r, c, k; double FFt[9]; double det; double* R; double* F; // build F = (field; ones) F = malloc(3 * N * sizeof(double)); for (c=0; c 0) ? 1.0 : -1.0)) /** Average the image in "blocksize" x "blocksize" blocks, placing the output in the "output" image. The output image will have size "*newW" by "*newH". If you pass "output = NULL", memory will be allocated for the output image. It is valid to pass in "output" = "image". The output image is returned. */ float* average_image_f(const float* image, int W, int H, int blocksize, int edgehandling, int* newW, int* newH, float* output); float* average_weighted_image_f(const float* image, const float* weight, int W, int H, int blocksize, int edgehandling, int* newW, int* newH, float* output, float nilval); Const InlineDeclare int imax(int a, int b); Const InlineDeclare int imin(int a, int b); InlineDeclare double distsq_exceeds(double* d1, double* d2, int D, double limit); Const InlineDeclare double square(double d); // note, this function works on angles in degrees; it wraps around // at 360. 
Const InlineDeclare int inrange(double ra, double ralow, double rahigh); InlineDeclare double distsq(const double* d1, const double* d2, int D); InlineDeclare void cross_product(double* v1, double* v2, double* cross); InlineDeclare void normalize(double* x, double* y, double* z); InlineDeclare void normalize_3(double* xyz); #ifdef INCLUDE_INLINE_SOURCE #define InlineDefine InlineDefineH #include "mathutil.inc" #undef InlineDefine #endif #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2679524 astropy-healpix-0.5/cextern/astrometry.net/mathutil.inc0000644000077000000240000000313600000000000023421 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #include InlineDefine void normalize(double* x, double* y, double* z) { double invl = 1.0 / sqrt((*x)*(*x) + (*y)*(*y) + (*z)*(*z)); *x *= invl; *y *= invl; *z *= invl; } InlineDefine void normalize_3(double* xyz) { double invlen = 1.0 / sqrt(xyz[0]*xyz[0] + xyz[1]*xyz[1] + xyz[2]*xyz[2]); xyz[0] *= invlen; xyz[1] *= invlen; xyz[2] *= invlen; } InlineDefine void cross_product(double* a, double* b, double* cross) { cross[0] = a[1] * b[2] - a[2] * b[1]; cross[1] = a[2] * b[0] - a[0] * b[2]; cross[2] = a[0] * b[1] - a[1] * b[0]; } InlineDefine int imax(int a, int b) { return (a > b) ? a : b; } InlineDefine int imin(int a, int b) { return (a < b) ? 
a : b; } InlineDefine double distsq_exceeds(double* d1, double* d2, int D, double limit) { double dist2; int i; dist2 = 0.0; for (i=0; i limit) return 1; } return 0; } InlineDefine double distsq(const double* d1, const double* d2, int D) { double dist2; int i; dist2 = 0.0; for (i=0; i= ralow && ra <= rahigh) return 1; return 0; } /* handle wraparound properly */ //if (ra <= ralow && ra >= rahigh) if (ra >= ralow || ra <= rahigh) return 1; return 0; } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2682617 astropy-healpix-0.5/cextern/astrometry.net/os-features.h0000644000077000000240000000653500000000000023513 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #ifndef OS_FEATURES_H #define OS_FEATURES_H // This is actually in POSIX1b but may or may not be available. int fdatasync(int fd); // Not POSIX; doesn't exist in Solaris 10 //#include #ifndef MIN #define MIN(a,b) (((a)<(b))?(a):(b)) #endif #ifndef MAX #define MAX(a,b) (((a)>(b))?(a):(b)) #endif // isfinite() on Solaris; from // https://code.google.com/p/redis/issues/detail?id=20 #if defined(__sun) && defined(__GNUC__) #undef isnan #define isnan(x) \ __extension__({ __typeof (x) __x_a = (x); \ __builtin_expect(__x_a != __x_a, 0); }) #undef isfinite #define isfinite(x) \ __extension__ ({ __typeof (x) __x_f = (x); \ __builtin_expect(!isnan(__x_f - __x_f), 1); }) #undef isinf #define isinf(x) \ __extension__ ({ __typeof (x) __x_i = (x); \ __builtin_expect(!isnan(__x_i) && !isfinite(__x_i), 0); }) #undef isnormal #define isnormal(x) \ __extension__ ({ __typeof(x) __x_n = (x); \ if (__x_n < 0.0) __x_n = -__x_n; \ __builtin_expect(isfinite(__x_n) \ && (sizeof(__x_n) == sizeof(float) \ ? __x_n >= __FLT_MIN__ \ : sizeof(__x_n) == sizeof(long double) \ ? 
__x_n >= __LDBL_MIN__ \ : __x_n >= __DBL_MIN__), 1); }) #undef HUGE_VALF #define HUGE_VALF (1e50f) #endif /** The qsort_r story: -qsort_r appears in BSD (including Mac OSX) void qsort_r(void *, size_t, size_t, void *, int (*)(void *, const void *, const void *)); -qsort_r appears in glibc 2.8, but with a different argument order: void qsort_r(void*, size_t, size_t, int (*)(const void*, const void*, void*), void*); Notice that the "thunk" and "comparison function" arguments to qsort_r are swapped, and the "thunk" appears either at the beginning or end of the comparison function. We check a few things: -is qsort_r declared? -does qsort_r exist? -do we need to swap the arguments? Those using qsort_r in Astrometry.net should instead use the macro QSORT_R() to take advantage of these tests. Its signature is: void QSORT_R(void* base, size_t nmembers, size_t member_size, void* token, comparison_function); You should define the "comparison" function like this: static int QSORT_COMPARISON_FUNCTION(my_comparison, void* token, const void* v1, const void* v2) { ... } Distributions including glibc 2.8 include: -Mandriva 2009 -Ubuntu 8.10 */ ///// In this healpix distribution, we avoid all this mess by importing ///// a stand-alone qsort_reentrant implementation. 
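The "permuted sort" pattern that `QSORT_R` enables in this codebase can be sketched standalone: sort an index array by comparing the data it points into, leaving the data itself untouched. The sketch below (`permuted_sort_sketch` is a hypothetical name, not part of the library) uses a plain insertion sort so it needs no reentrant qsort, but the comparison-through-the-permutation step is the same idea as `compare_permuted` in `permutedsort.c`:

```c
#include <assert.h>
#include <stddef.h>

/* Fill perm[] with the indices that would sort data[] ascending,
   without moving data[] itself. */
static void permuted_sort_sketch(const double* data, int* perm, size_t n) {
    size_t i, j;
    for (i = 0; i < n; i++)
        perm[i] = (int)i;
    for (i = 1; i < n; i++) {
        int p = perm[i];
        j = i;
        /* compare through the permutation, not the raw array order */
        while (j > 0 && data[perm[j-1]] > data[p]) {
            perm[j] = perm[j-1];
            j--;
        }
        perm[j] = p;
    }
}
```

For data `{3.0, 1.0, 2.0}` this yields `perm = {1, 2, 0}`: the smallest element is at index 1, and so on.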
typedef int cmp_t(void *, const void *, const void *); void qsort_rex(void *a, size_t n, size_t es, void *thunk, cmp_t *cmp); #define QSORT_R qsort_rex #define QSORT_COMPARISON_FUNCTION(func, thunk, v1, v2) func(thunk, v1, v2) // As suggested in http://gcc.gnu.org/onlinedocs/gcc-4.3.0/gcc/Function-Names.html #if __STDC_VERSION__ < 199901L # if __GNUC__ >= 2 # define __func__ __FUNCTION__ # else # define __func__ "" # endif #endif #endif ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1537003272.894544 astropy-healpix-0.5/cextern/astrometry.net/permutedsort.c0000644000077000000240000001245500000000000024004 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #include #include #ifdef _MSC_VER #include #else #include #endif #include #include #include #include "permutedsort.h" #include "os-features.h" // for qsort_r #ifdef _MSC_VER #if _MSC_VER >= 1600 #include #else #include #endif #else #include #endif #ifdef _MSC_VER typedef long off_t; #endif #ifdef _MSC_VER #include #endif int* permutation_init(int* perm, int N) { int i; if (!N) return perm; if (!perm) perm = malloc(sizeof(int) * N); for (i=0; idata_array; val1 = darray + i1 * ps->data_array_stride; val2 = darray + i2 * ps->data_array_stride; return ps->compare(val1, val2); } int* permuted_sort(const void* realarray, int array_stride, int (*compare)(const void*, const void*), int* perm, int N) { permsort_t ps; if (!perm) perm = permutation_init(perm, N); ps.compare = compare; ps.data_array = realarray; ps.data_array_stride = array_stride; QSORT_R(perm, N, sizeof(int), &ps, compare_permuted); return perm; } #ifdef _MSC_VER #define COMPARE(d1, d2, op1, op2) \ if (d1 op1 d2) return -1; \ if (d1 op2 d2) return 1; \ /* explicitly test for equality, to catch NaNs*/ \ if (d1 == d2) return 0; \ if (_isnan(d1) && _isnan(d2)) return 0; \ if (_isnan(d1)) return 1; \ if (_isnan(d2)) 
return -1; \ assert(0); return 0; #else #define COMPARE(d1, d2, op1, op2) \ if (d1 op1 d2) return -1; \ if (d1 op2 d2) return 1; \ /* explicitly test for equality, to catch NaNs*/ \ if (d1 == d2) return 0; \ if (isnan(d1) && isnan(d2)) return 0; \ if (isnan(d1)) return 1; \ if (isnan(d2)) return -1; \ assert(0); return 0; #endif //printf("d1=%g, d2=%g\n", d1, d2); #define INTCOMPARE(i1, i2, op1, op2) \ if (i1 op1 i2) return -1; \ if (i1 op2 i2) return 1; \ return 0; int compare_doubles_asc(const void* v1, const void* v2) { const double d1 = *(double*)v1; const double d2 = *(double*)v2; COMPARE(d1, d2, <, >); } int compare_doubles_desc(const void* v1, const void* v2) { // (note that v1,v2 are flipped) const double d1 = *(double*)v1; const double d2 = *(double*)v2; COMPARE(d1, d2, >, <); } int compare_floats_asc(const void* v1, const void* v2) { float f1 = *(float*)v1; float f2 = *(float*)v2; COMPARE(f1, f2, <, >); } int compare_floats_desc(const void* v1, const void* v2) { float f1 = *(float*)v1; float f2 = *(float*)v2; COMPARE(f1, f2, >, <); } int compare_int64_asc(const void* v1, const void* v2) { int64_t i1 = *(int64_t*)v1; int64_t i2 = *(int64_t*)v2; INTCOMPARE(i1, i2, <, >); } int compare_int64_desc(const void* v1, const void* v2) { int64_t i1 = *(int64_t*)v1; int64_t i2 = *(int64_t*)v2; INTCOMPARE(i1, i2, >, <); } // Versions for use with QSORT_R int QSORT_COMPARISON_FUNCTION(compare_floats_asc_r, void* thunk, const void* v1, const void* v2) { return compare_floats_asc(v1, v2); } #undef COMPARE #undef INTCOMPARE int compare_ints_asc(const void* v1, const void* v2) { const int d1 = *(int*)v1; const int d2 = *(int*)v2; if (d1 < d2) return -1; if (d1 > d2) return 1; return 0; } int compare_ints_desc(const void* v1, const void* v2) { return compare_ints_asc(v2, v1); } int compare_uchars_asc(const void* v1, const void* v2) { const unsigned char d1 = *(unsigned char*)v1; const unsigned char d2 = *(unsigned char*)v2; if (d1 < d2) return -1; if (d1 > d2) return 1; 
return 0; } int compare_uchars_desc(const void* v1, const void* v2) { return compare_ints_desc(v2, v1); } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2689428 astropy-healpix-0.5/cextern/astrometry.net/permutedsort.h0000644000077000000240000000422200000000000024002 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #ifndef PERMUTED_SORT_H #define PERMUTED_SORT_H // for QSORT_COMPARISON_FUNCTION #include "os-features.h" /* Computes the permutation array that will cause the "realarray" to be sorted according to the "compare" function. Ie, the first element in the sorted array will be at (char*)realarray + perm[0] * array_stride The "stride" parameter gives the number of bytes between successive entries in "realarray". If "perm" is NULL, a new permutation array will be allocated and returned. Otherwise, the permutation array will be placed in "perm". Note that if you pass in a non-NULL "perm" array, its existing values will be used! You probably want to initialize it with "permutation_init()" to set it to the identity permutation. */ int* permuted_sort(const void* realarray, int array_stride, int (*compare)(const void*, const void*), int* perm, int Nperm); int* permutation_init(int* perm, int Nperm); /** Applies a permutation array to a data vector. Copies "inarray" into "outarray" according to the given "perm"utation. This also works when "inarray" == "outarray". 
*/ void permutation_apply(const int* perm, int Nperm, const void* inarray, void* outarray, int elemsize); /* Some sort functions that might come in handy: */ int compare_doubles_asc(const void* v1, const void* v2); int compare_doubles_desc(const void* v1, const void* v2); int compare_floats_asc(const void* v1, const void* v2); int compare_floats_desc(const void* v1, const void* v2); int compare_int64_asc(const void* v1, const void* v2); int compare_int64_desc(const void* v1, const void* v2); int compare_ints_asc(const void* v1, const void* v2); int compare_ints_desc(const void* v1, const void* v2); int compare_uchars_asc(const void* v1, const void* v2); int compare_uchars_desc(const void* v1, const void* v2); /* Versions for use with QSORT_R directly (not with permuted_sort). */ int QSORT_COMPARISON_FUNCTION(compare_floats_asc_r, void* thunk, const void* v1, const void* v2); #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2692962 astropy-healpix-0.5/cextern/astrometry.net/qsort_reentrant.c0000644000077000000240000001570100000000000024476 0ustar00tomstaff00000000000000/* * Copyright (c) 1992, 1993 * The Regents of the University of California. All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions * are met: * 1. Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * 2. Redistributions in binary form must reproduce the above copyright * notice, this list of conditions and the following disclaimer in the * documentation and/or other materials provided with the distribution. * 4. Neither the name of the University nor the names of its contributors * may be used to endorse or promote products derived from this software * without specific prior written permission. 
* * THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE * ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE * FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS * OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY * OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF * SUCH DAMAGE. */ #ifdef _MSC_VER #include #else #include #endif // from https://groups.google.com/forum/#!topic/astrometry/quGEbY1CgR8 #ifdef _MSC_VER # include #else #if __sun # include #else # include #endif #endif //__FBSDID("$FreeBSD: src/sys/libkern/qsort.c,v 1.15 2004/07/15 23:58:23 glebius Exp $"); // Astrometry: We want reentrant! #define I_AM_QSORT_R #ifdef I_AM_QSORT_R typedef int cmp_t(void *, const void *, const void *); #else typedef int cmp_t(const void *, const void *); #endif static __inline char *med3(char *, char *, char *, cmp_t *, void *); static __inline void swapfunc(char *, char *, int, int); #define min(a, b) (a) < (b) ? (a) : (b) /* * Qsort routine from Bentley & McIlroy's "Engineering a Sort Function". */ #define swapcode(TYPE, parmi, parmj, n) { \ long i = (n) / sizeof (TYPE); \ register TYPE *pi = (TYPE *) (parmi); \ register TYPE *pj = (TYPE *) (parmj); \ do { \ register TYPE t = *pi; \ *pi++ = *pj; \ *pj++ = t; \ } while (--i > 0); \ } #define SWAPINIT(a, es) swaptype = ((char *)a - (char *)0) % sizeof(long) || \ es % sizeof(long) ? 2 : es == sizeof(long)? 
0 : 1; static __inline void swapfunc(char *a, char *b, int n, int swaptype) { if(swaptype <= 1) swapcode(long, a, b, n) else swapcode(char, a, b, n) } #define swap(a, b) \ if (swaptype == 0) { \ long t = *(long *)(a); \ *(long *)(a) = *(long *)(b); \ *(long *)(b) = t; \ } else \ swapfunc(a, b, es, swaptype) #define vecswap(a, b, n) if ((n) > 0) swapfunc(a, b, n, swaptype) #ifdef I_AM_QSORT_R #define CMP(t, x, y) (cmp((t), (x), (y))) #else #define CMP(t, x, y) (cmp((x), (y))) #endif static __inline char * med3(char *a, char *b, char *c, cmp_t *cmp, void *thunk #ifndef I_AM_QSORT_R __unused #endif ) { return CMP(thunk, a, b) < 0 ? (CMP(thunk, b, c) < 0 ? b : (CMP(thunk, a, c) < 0 ? c : a )) :(CMP(thunk, b, c) > 0 ? b : (CMP(thunk, a, c) < 0 ? a : c )); } #ifdef I_AM_QSORT_R void qsort_rex(void *a, size_t n, size_t es, void *thunk, cmp_t *cmp) #else #define thunk NULL void qsort(void *a, size_t n, size_t es, cmp_t *cmp) #endif { char *pa, *pb, *pc, *pd, *pl, *pm, *pn; int d, r, swaptype, swap_cnt; loop: SWAPINIT(a, es); swap_cnt = 0; if (n < 7) { for (pm = (char *)a + es; pm < (char *)a + n * es; pm += es) for (pl = pm; pl > (char *)a && CMP(thunk, pl - es, pl) > 0; pl -= es) swap(pl, pl - es); return; } pm = (char *)a + (n / 2) * es; if (n > 7) { pl = a; pn = (char *)a + (n - 1) * es; if (n > 40) { d = (n / 8) * es; pl = med3(pl, pl + d, pl + 2 * d, cmp, thunk); pm = med3(pm - d, pm, pm + d, cmp, thunk); pn = med3(pn - 2 * d, pn - d, pn, cmp, thunk); } pm = med3(pl, pm, pn, cmp, thunk); } swap(a, pm); pa = pb = (char *)a + es; pc = pd = (char *)a + (n - 1) * es; for (;;) { while (pb <= pc && (r = CMP(thunk, pb, a)) <= 0) { if (r == 0) { swap_cnt = 1; swap(pa, pb); pa += es; } pb += es; } while (pb <= pc && (r = CMP(thunk, pc, a)) >= 0) { if (r == 0) { swap_cnt = 1; swap(pc, pd); pd -= es; } pc -= es; } if (pb > pc) break; swap(pb, pc); swap_cnt = 1; pb += es; pc -= es; } if (swap_cnt == 0) { /* Switch to insertion sort */ for (pm = (char *)a + es; pm < (char *)a + n 
* es; pm += es) for (pl = pm; pl > (char *)a && CMP(thunk, pl - es, pl) > 0; pl -= es) swap(pl, pl - es); return; } pn = (char *)a + n * es; r = min(pa - (char *)a, pb - pa); vecswap(a, pb - r, r); r = min(pd - pc, pn - pd - es); vecswap(pb, pn - r, r); if ((r = pb - pa) > es) #ifdef I_AM_QSORT_R qsort_rex(a, r / es, es, thunk, cmp); #else qsort(a, r / es, es, cmp); #endif if ((r = pd - pc) > es) { /* Iterate rather than recurse to save stack space */ a = pn - r; n = r / es; goto loop; } } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2696157 astropy-healpix-0.5/cextern/astrometry.net/starutil.c0000644000077000000240000000070500000000000023111 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #include #include #include #include #include #include "os-features.h" #include "keywords.h" #include "mathutil.h" #include "starutil.h" #define POGSON 2.51188643150958 #define LOGP 0.92103403719762 #define InlineDefine InlineDefineC #include "starutil.inc" #undef InlineDefine ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2700274 astropy-healpix-0.5/cextern/astrometry.net/starutil.h0000644000077000000240000001175200000000000023122 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #ifndef STARUTIL_H #define STARUTIL_H #include #include "an-bool.h" #include "keywords.h" #define DIM_STARS 3 #define DIM_XY 2 // upper bound of dimquads value #define DQMAX 5 // upper bound of dimcodes value #define DCMAX 6 InlineDeclare int dimquad2dimcode(int dimquad); typedef unsigned char uchar; #define ONE_OVER_SIXTY 0.016666666666666666 // pi / 180. #define RAD_PER_DEG 0.017453292519943295 // pi / (180. * 60.) #define RAD_PER_ARCMIN 0.00029088820866572158 // pi / (180. * 60. * 60.) 
#define RAD_PER_ARCSEC 4.8481368110953598e-06 // 180. / pi #define DEG_PER_RAD 57.295779513082323 #define DEG_PER_ARCMIN ONE_OVER_SIXTY // 1./3600. #define DEG_PER_ARCSEC 0.00027777777777777778 // 60. * 180. / pi #define ARCMIN_PER_RAD 3437.7467707849396 #define ARCMIN_PER_DEG 60.0 #define ARCMIN_PER_ARCSEC ONE_OVER_SIXTY // 60. * 60. * 180. / pi #define ARCSEC_PER_RAD 206264.80624709636 #define ARCSEC_PER_DEG 3600.0 #define ARCSEC_PER_ARCMIN 60.0 InlineDeclare Const double rad2deg(double x); InlineDeclare Const double rad2arcmin(double x); InlineDeclare Const double rad2arcsec(double x); InlineDeclare Const double deg2rad(double x); InlineDeclare Const double deg2arcmin(double x); InlineDeclare Const double deg2arcsec(double x); InlineDeclare Const double arcmin2rad(double x); InlineDeclare Const double arcmin2deg(double x); InlineDeclare Const double arcmin2arcsec(double x); InlineDeclare Const double arcsec2rad(double x); InlineDeclare Const double arcsec2deg(double x); InlineDeclare Const double arcsec2arcmin(double x); #define MJD_JD_OFFSET 2400000.5 InlineDeclare Const double mjdtojd(double mjd); InlineDeclare Const double jdtomjd(double jd); // RA,Dec in radians: #define radec2x(r,d) (cos(d)*cos(r)) #define radec2y(r,d) (cos(d)*sin(r)) #define radec2z(r,d) (sin(d)) InlineDeclare Const double xy2ra(double x, double y); InlineDeclare Const double z2dec(double z); double atora(const char* str); double atodec(const char* str); double mag2flux(double mag); // RA,Dec in radians: InlineDeclare void radec2xyz(double ra, double dec, double* x, double* y, double* z); InlineDeclare Flatten void xyz2radec(double x, double y, double z, double *ra, double *dec); InlineDeclare Flatten void xyzarr2radec(const double* xyz, double *ra, double *dec); InlineDeclare void xyzarr2radecarr(const double* xyz, double *radec); InlineDeclare void radec2xyzarr(double ra, double dec, double* p_xyz); InlineDeclare void radec2xyzarrmany(double *ra, double *dec, double* xyz, int n); // 
RA,Dec in degrees: InlineDeclare void radecdeg2xyz(double ra, double dec, double* x, double* y, double* z); InlineDeclare Flatten void xyzarr2radecdeg(const double* xyz, double *ra, double *dec); InlineDeclare Flatten void xyzarr2radecdegarr(double* xyz, double *radec); InlineDeclare void radecdeg2xyzarr(double ra, double dec, double* p_xyz); InlineDeclare void radecdegarr2xyzarr(double* radec, double* xyz); InlineDeclare void radecdeg2xyzarrmany(double *ra, double *dec, double* xyz, int n); // Converts a distance-squared between two points on the // surface of the unit sphere into the angle between the // rays from the center of the sphere to the points, in // radians. InlineDeclare Flatten Const double distsq2arc(double dist2); // Distance^2 on the unit sphere to radians. // (alias of distsq2arc) InlineDeclare Flatten Const double distsq2rad(double dist2); InlineDeclare Flatten Const double distsq2deg(double dist2); // Distance on the unit sphere to radians. InlineDeclare Flatten Const double dist2rad(double dist); // Distance^2 on the unit sphere to arcseconds. InlineDeclare Flatten Const double distsq2arcsec(double dist2); // Distance on the unit sphere to arcseconds InlineDeclare Flatten Const double dist2arcsec(double dist); // Radians to distance^2 on the unit sphere. // (alias of arc2distsq) InlineDeclare Const double rad2distsq(double arcInRadians); // Radians to distance on the unit sphere. InlineDeclare Flatten Const double rad2dist(double arcInRadians); // Converts an angle (in arcseconds) into the distance-squared // between two points on the unit sphere separated by that angle. InlineDeclare Flatten Const double arcsec2distsq(double arcInArcSec); // Arcseconds to distance on the unit sphere. InlineDeclare Flatten Const double arcsec2dist(double arcInArcSec); // Degrees to distance on the unit sphere. 
InlineDeclare Flatten Const double deg2dist(double arcInDegrees); InlineDeclare Flatten Const double deg2distsq(double d); InlineDeclare Flatten Const double arcmin2dist(double arcmin); InlineDeclare Flatten Const double arcmin2distsq(double arcmin); // Distance on the unit sphere to degrees. InlineDeclare Flatten Const double dist2deg(double dist); #define HELP_ERR -101 #define OPT_ERR -201 InlineDeclare void star_midpoint(double* mid, const double* A, const double* B); #ifdef INCLUDE_INLINE_SOURCE #define InlineDefine InlineDefineH #include "starutil.inc" #undef InlineDefine #endif #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574433289.5126603 astropy-healpix-0.5/cextern/astrometry.net/starutil.inc0000644000077000000240000001646600000000000023453 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. # Licensed under a 3-clause BSD style license - see LICENSE */ #include #ifndef M_PI # define M_PI 3.14159265358979323846 #endif InlineDefine void star_midpoint(double* mid, const double* A, const double* B) { double len; double invlen; // we don't divide by 2 because we immediately renormalize it... 
mid[0] = A[0] + B[0]; mid[1] = A[1] + B[1]; mid[2] = A[2] + B[2]; //len = sqrt(square(mid[0]) + square(mid[1]) + square(mid[2])); len = sqrt(mid[0] * mid[0] + mid[1] * mid[1] + mid[2] * mid[2]); invlen = 1.0 / len; mid[0] *= invlen; mid[1] *= invlen; mid[2] *= invlen; } InlineDefine Const double mjdtojd(double mjd) { return mjd + MJD_JD_OFFSET; } InlineDefine Const double jdtomjd(double jd) { return jd - MJD_JD_OFFSET; } InlineDefine Const int dimquad2dimcode(int dimquad) { return 2 * (dimquad - 2); } InlineDefine Const double rad2deg(double x) { return x * DEG_PER_RAD; } InlineDefine Const double deg2rad(double x) { return x * RAD_PER_DEG; } InlineDefine Const double deg2arcmin(double x) { return x * ARCMIN_PER_DEG; } InlineDefine Const double arcmin2deg(double x) { return x * DEG_PER_ARCMIN; } InlineDefine Const double arcmin2arcsec(double x) { return x * ARCSEC_PER_ARCMIN; } InlineDefine Const double arcsec2arcmin(double x) { return x * ARCMIN_PER_ARCSEC; } InlineDefine Const double rad2arcmin(double x) { return x * ARCMIN_PER_RAD; } InlineDefine Const double rad2arcsec(double x) { return x * ARCSEC_PER_RAD; } InlineDefine Const double deg2arcsec(double x) { return x * ARCSEC_PER_DEG; } InlineDefine Const double arcmin2rad(double x) { return x * RAD_PER_ARCMIN; } InlineDefine Const double arcsec2rad(double x) { return x * RAD_PER_ARCSEC; } InlineDefine Const double arcsec2deg(double x) { return x * DEG_PER_ARCSEC; } InlineDefine Const double rad2distsq(double x) { // inverse of distsq2arc; cosine law. 
return 2.0 * (1.0 - cos(x)); } InlineDefine Flatten Const double rad2dist(double x) { return sqrt(rad2distsq(x)); } InlineDefine Flatten Const double arcsec2distsq(double x) { return rad2distsq(arcsec2rad(x)); } InlineDefine Flatten Const double arcmin2dist(double x) { return rad2dist(arcmin2rad(x)); } InlineDefine Flatten Const double arcmin2distsq(double arcmin) { return rad2distsq(arcmin2rad(arcmin)); } InlineDefine Const double z2dec(double z) { return asin(z); } InlineDefine Const double xy2ra(double x, double y) { double a = atan2(y, x); if (a < 0) a += 2.0 * M_PI; return a; } InlineDefine Flatten void xyz2radec(double x, double y, double z, double *ra, double *dec) { if (ra) *ra = xy2ra(x, y); if (dec) { if (fabs(z) > 0.9) *dec = M_PI / 2.0 - atan2(hypot(x, y), z); else *dec = z2dec(z); } } InlineDefine Flatten void xyzarr2radec(const double* xyz, double *ra, double *dec) { xyz2radec(xyz[0], xyz[1], xyz[2], ra, dec); } InlineDefine Flatten void xyzarr2radecdeg(const double* xyz, double *ra, double *dec) { xyzarr2radec(xyz, ra, dec); if (ra) *ra = rad2deg(*ra); if (dec) *dec = rad2deg(*dec); } InlineDefine Flatten void xyzarr2radecdegarr(double* xyz, double *radec) { xyzarr2radecdeg(xyz, radec, radec+1); } InlineDefine void radec2xyzarr(double ra, double dec, double* xyz) { double cosdec = cos(dec); xyz[0] = cosdec * cos(ra); xyz[1] = cosdec * sin(ra); xyz[2] = sin(dec); } InlineDefine void radec2xyz(double ra, double dec, double* x, double* y, double* z) { double cosdec = cos(dec); *x = cosdec * cos(ra); *y = cosdec * sin(ra); *z = sin(dec); } InlineDefine void radecdeg2xyz(double ra, double dec, double* x, double* y, double* z) { radec2xyz(deg2rad(ra), deg2rad(dec), x, y, z); } InlineDefine void radecdeg2xyzarr(double ra, double dec, double* xyz) { radec2xyzarr(deg2rad(ra),deg2rad(dec), xyz); } InlineDefine void radecdegarr2xyzarr(double* radec, double* xyz) { radecdeg2xyzarr(radec[0], radec[1], xyz); } // xyz stored as xyzxyzxyz. 
InlineDefine void radec2xyzarrmany(double *ra, double *dec, double* xyz, int n) { int i; for (i=0; i * Based on ISO/IEC SC22/WG14 9899 Committee draft (SC22 N2794) * * THIS SOFTWARE IS NOT COPYRIGHTED * * Contributor: Danny Smith * * This source code is offered for use in the public domain. You may * use, modify or distribute it freely. * * This code is distributed in the hope that it will be useful but * WITHOUT ANY WARRANTY. ALL WARRANTIES, EXPRESS OR IMPLIED ARE HEREBY * DISCLAIMED. This includes but is not limited to warranties of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. * * Date: 2000-12-02 * * mwb: This was modified in the following ways: * * - make it compatible with Visual C++ 6 (which uses * non-standard keywords and suffixes for 64-bit types) * - some environments need stddef.h included (for wchar stuff?) * - handle the fact that Microsoft's limits.h header defines * SIZE_MAX * - make corrections for SIZE_MAX, INTPTR_MIN, INTPTR_MAX, UINTPTR_MAX, * PTRDIFF_MIN, PTRDIFF_MAX, SIG_ATOMIC_MIN, and SIG_ATOMIC_MAX * to be 64-bit aware. 
*/ #ifndef _STDINT_H #define _STDINT_H #define __need_wint_t #define __need_wchar_t #include #include #if _MSC_VER && (_MSC_VER < 1300) /* using MSVC 6 or earlier - no "long long" type, but might have _int64 type */ #define __STDINT_LONGLONG __int64 #define __STDINT_LONGLONG_SUFFIX i64 #else #define __STDINT_LONGLONG long long #define __STDINT_LONGLONG_SUFFIX LL #endif #if !defined( PASTE) #define PASTE2( x, y) x##y #define PASTE( x, y) PASTE2( x, y) #endif /* PASTE */ /* 7.18.1.1 Exact-width integer types */ typedef signed char int8_t; typedef unsigned char uint8_t; typedef short int16_t; typedef unsigned short uint16_t; typedef int int32_t; typedef unsigned uint32_t; typedef __STDINT_LONGLONG int64_t; typedef unsigned __STDINT_LONGLONG uint64_t; /* 7.18.1.2 Minimum-width integer types */ typedef signed char int_least8_t; typedef unsigned char uint_least8_t; typedef short int_least16_t; typedef unsigned short uint_least16_t; typedef int int_least32_t; typedef unsigned uint_least32_t; typedef __STDINT_LONGLONG int_least64_t; typedef unsigned __STDINT_LONGLONG uint_least64_t; /* 7.18.1.3 Fastest minimum-width integer types * Not actually guaranteed to be fastest for all purposes * Here we use the exact-width types for 8 and 16-bit ints. 
*/ typedef char int_fast8_t; typedef unsigned char uint_fast8_t; typedef short int_fast16_t; typedef unsigned short uint_fast16_t; typedef int int_fast32_t; typedef unsigned int uint_fast32_t; typedef __STDINT_LONGLONG int_fast64_t; typedef unsigned __STDINT_LONGLONG uint_fast64_t; /* 7.18.1.4 Integer types capable of holding object pointers */ #ifndef _INTPTR_T_DEFINED #define _INTPTR_T_DEFINED #ifdef _WIN64 typedef __STDINT_LONGLONG intptr_t; #else typedef int intptr_t; #endif /* _WIN64 */ #endif /* _INTPTR_T_DEFINED */ #ifndef _UINTPTR_T_DEFINED #define _UINTPTR_T_DEFINED #ifdef _WIN64 typedef unsigned __STDINT_LONGLONG uintptr_t; #else typedef unsigned int uintptr_t; #endif /* _WIN64 */ #endif /* _UINTPTR_T_DEFINED */ /* 7.18.1.5 Greatest-width integer types */ typedef __STDINT_LONGLONG intmax_t; typedef unsigned __STDINT_LONGLONG uintmax_t; /* 7.18.2 Limits of specified-width integer types */ #if !defined ( __cplusplus) || defined (__STDC_LIMIT_MACROS) /* 7.18.2.1 Limits of exact-width integer types */ #define INT8_MIN (-128) #define INT16_MIN (-32768) #define INT32_MIN (-2147483647 - 1) #define INT64_MIN (PASTE( -9223372036854775807, __STDINT_LONGLONG_SUFFIX) - 1) #define INT8_MAX 127 #define INT16_MAX 32767 #define INT32_MAX 2147483647 #define INT64_MAX (PASTE( 9223372036854775807, __STDINT_LONGLONG_SUFFIX)) #define UINT8_MAX 0xff /* 255U */ #define UINT16_MAX 0xffff /* 65535U */ #define UINT32_MAX 0xffffffff /* 4294967295U */ #define UINT64_MAX (PASTE( 0xffffffffffffffffU, __STDINT_LONGLONG_SUFFIX)) /* 18446744073709551615ULL */ /* 7.18.2.2 Limits of minimum-width integer types */ #define INT_LEAST8_MIN INT8_MIN #define INT_LEAST16_MIN INT16_MIN #define INT_LEAST32_MIN INT32_MIN #define INT_LEAST64_MIN INT64_MIN #define INT_LEAST8_MAX INT8_MAX #define INT_LEAST16_MAX INT16_MAX #define INT_LEAST32_MAX INT32_MAX #define INT_LEAST64_MAX INT64_MAX #define UINT_LEAST8_MAX UINT8_MAX #define UINT_LEAST16_MAX UINT16_MAX #define UINT_LEAST32_MAX UINT32_MAX #define
UINT_LEAST64_MAX UINT64_MAX /* 7.18.2.3 Limits of fastest minimum-width integer types */ #define INT_FAST8_MIN INT8_MIN #define INT_FAST16_MIN INT16_MIN #define INT_FAST32_MIN INT32_MIN #define INT_FAST64_MIN INT64_MIN #define INT_FAST8_MAX INT8_MAX #define INT_FAST16_MAX INT16_MAX #define INT_FAST32_MAX INT32_MAX #define INT_FAST64_MAX INT64_MAX #define UINT_FAST8_MAX UINT8_MAX #define UINT_FAST16_MAX UINT16_MAX #define UINT_FAST32_MAX UINT32_MAX #define UINT_FAST64_MAX UINT64_MAX /* 7.18.2.4 Limits of integer types capable of holding object pointers */ #ifdef _WIN64 #define INTPTR_MIN INT64_MIN #define INTPTR_MAX INT64_MAX #define UINTPTR_MAX UINT64_MAX #else #define INTPTR_MIN INT32_MIN #define INTPTR_MAX INT32_MAX #define UINTPTR_MAX UINT32_MAX #endif /* _WIN64 */ /* 7.18.2.5 Limits of greatest-width integer types */ #define INTMAX_MIN INT64_MIN #define INTMAX_MAX INT64_MAX #define UINTMAX_MAX UINT64_MAX /* 7.18.3 Limits of other integer types */ #define PTRDIFF_MIN INTPTR_MIN #define PTRDIFF_MAX INTPTR_MAX #define SIG_ATOMIC_MIN INTPTR_MIN #define SIG_ATOMIC_MAX INTPTR_MAX /* we need to check for SIZE_MAX already defined because MS defines it in limits.h */ #ifndef SIZE_MAX #define SIZE_MAX UINTPTR_MAX #endif #ifndef WCHAR_MIN /* also in wchar.h */ #define WCHAR_MIN 0 #define WCHAR_MAX ((wchar_t)-1) /* UINT16_MAX */ #endif /* * wint_t is unsigned short for compatibility with MS runtime */ #define WINT_MIN 0 #define WINT_MAX ((wint_t)-1) /* UINT16_MAX */ #endif /* !defined ( __cplusplus) || defined __STDC_LIMIT_MACROS */ /* 7.18.4 Macros for integer constants */ #if !defined ( __cplusplus) || defined (__STDC_CONSTANT_MACROS) /* 7.18.4.1 Macros for minimum-width integer constants According to Douglas Gwyn : "This spec was changed in ISO/IEC 9899:1999 TC1; in ISO/IEC 9899:1999 as initially published, the expansion was required to be an integer constant of precisely matching type, which is impossible to accomplish for the shorter types on most platforms, because
C99 provides no standard way to designate an integer constant with width less than that of type int. TC1 changed this to require just an integer constant *expression* with *promoted* type." */ #define INT8_C(val) ((int8_t) + (val)) #define UINT8_C(val) ((uint8_t) + (val##U)) #define INT16_C(val) ((int16_t) + (val)) #define UINT16_C(val) ((uint16_t) + (val##U)) #define INT32_C(val) val##L #define UINT32_C(val) val##UL #define INT64_C(val) (PASTE( val, __STDINT_LONGLONG_SUFFIX)) #define UINT64_C(val)(PASTE( PASTE( val, U), __STDINT_LONGLONG_SUFFIX)) /* 7.18.4.2 Macros for greatest-width integer constants */ #define INTMAX_C(val) INT64_C(val) #define UINTMAX_C(val) UINT64_C(val) #endif /* !defined ( __cplusplus) || defined __STDC_CONSTANT_MACROS */ #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8959322 astropy-healpix-0.5/cextern/astrometry.net/test_healpix-main.c0000644000077000000240000000173200000000000024656 0ustar00tomstaff00000000000000 /* This is auto-generated code. Edit at your own peril. 
*/ #include #include #include #include "CuTest.h" extern void test_side_length(CuTest*); extern void test_make_map(CuTest*); extern void test_healpix_distance_to_radec(CuTest*); extern void test_healpix_neighbours(CuTest*); extern void test_big_nside(CuTest*); extern void test_distortion_at_pole(CuTest*); void RunAllTests(void) { CuString *output = CuStringNew(); CuSuite* suite = CuSuiteNew(); SUITE_ADD_TEST(suite, test_side_length); SUITE_ADD_TEST(suite, test_make_map); SUITE_ADD_TEST(suite, test_healpix_distance_to_radec); SUITE_ADD_TEST(suite, test_healpix_neighbours); SUITE_ADD_TEST(suite, test_big_nside); SUITE_ADD_TEST(suite, test_distortion_at_pole); CuSuiteRun(suite); CuSuiteSummary(suite, output); CuSuiteDetails(suite, output); printf("%s\n", output->buffer); } int main(int argc, char** args) { RunAllTests(); } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8966668 astropy-healpix-0.5/cextern/astrometry.net/test_healpix.c0000644000077000000240000005105500000000000023737 0ustar00tomstaff00000000000000/* # This file is part of the Astrometry.net suite. 
# Licensed under a 3-clause BSD style license - see LICENSE */ #include #include #include #include "os-features.h" #include "CuTest.h" #include "starutil.h" #include "healpix.h" #include "bl.h" static double square(double x) { return x*x; } void test_side_length(CuTest* ct) { double hp; double len = healpix_side_length_arcmin(1); CuAssertDblEquals(ct, 3517.9, len, 0.1); hp = healpix_nside_for_side_length_arcmin(len); CuAssertDblEquals(ct, 1.0, hp, 0.001); len = healpix_side_length_arcmin(2); CuAssertDblEquals(ct, 1758.969, len, 0.001); hp = healpix_nside_for_side_length_arcmin(len); CuAssertDblEquals(ct, 2.0, hp, 0.001); } static void add_plot_xyz_point(double* xyz) { double ra,dec; xyzarr2radecdeg(xyz, &ra, &dec); fprintf(stderr, "xp.append(%g)\n", ra); fprintf(stderr, "yp.append(%g)\n", dec); } static void add_plot_point(int hp, int nside, double dx, double dy) { double xyz[3]; healpixl_to_xyzarr(hp, nside, dx, dy, xyz); add_plot_xyz_point(xyz); } void plot_point(int hp, int nside, double dx, double dy, char* style) { fprintf(stderr, "xp=[]\n"); fprintf(stderr, "yp=[]\n"); add_plot_point(hp, nside, dx, dy); fprintf(stderr, "plot(xp, yp, '%s')\n", style); } void plot_xyz_point(double* xyz, char* style) { fprintf(stderr, "xp=[]\n"); fprintf(stderr, "yp=[]\n"); add_plot_xyz_point(xyz); fprintf(stderr, "plot(xp, yp, '%s')\n", style); } static void plot_hp_boundary(int hp, int nside, double start, double step, char* style) { double dx, dy; fprintf(stderr, "xp=[]\n"); fprintf(stderr, "yp=[]\n"); dy = 0.0; for (dx=start; dx<=1.0; dx+=step) add_plot_point(hp, nside, dx, dy); dx = 1.0; for (dy=start; dy<=1.0; dy+=step) add_plot_point(hp, nside, dx, dy); dy = 1.0; for (dx=1.0-start; dx>=0.0; dx-=step) add_plot_point(hp, nside, dx, dy); dx = 0.0; for (dy=1.0-start; dy>=0.0; dy-=step) add_plot_point(hp, nside, dx, dy); dy = 0.0; add_plot_point(hp, nside, dx, dy); fprintf(stderr, "xp,yp = wrapxy(xp,yp)\nplot(xp, yp, '%s')\n", style); } static void hpmap(int nside, const char* 
fn) { #if 0 int nhp; #endif double xyz[3]; double range; int64_t hps[9]; int i; int64_t hp; double dx, dy; // pick a point on the edge. //hp = 8; hp = 9; dx = 0.95; dy = 0.0; range = 0.1; /* hp = 6; dx = 0.05; dy = 0.95; range = 0.1; */ healpixl_to_xyzarr(hp, nside, dx, dy, xyz); for (i=0; i<12*nside*nside; i++) { plot_hp_boundary(i, nside, 0.005, 0.01, "b-"); } #if 0 nhp = healpix_get_neighbours_within_range(xyz, range, hps, nside); assert(nhp >= 1); assert(nhp <= 9); #else (void)healpix_get_neighbours_within_range(xyz, range, hps, nside); #endif /* for (i=0; i 1.:\n" " if xx < 180:\n" " xx += 360\n" " else:\n" " xx -= 360\n" " outx.append(xx)\n" " outy.append(yy)\n" " lastx = xx\n" " lasty = yy\n" " return (array(outx),array(outy))\n" ); hpmap(1, "hp.png"); hpmap(2, "hp2.png"); } int tst_xyztohpf(CuTest* ct, int hp, int nside, double dx, double dy) { double x,y,z; double outdx, outdy; int outhp; double outx,outy,outz; double dist; healpixl_to_xyz(hp, nside, dx, dy, &x, &y, &z); outhp = xyztohealpixlf(x, y, z, nside, &outdx, &outdy); healpixl_to_xyz(outhp, nside, outdx, outdy, &outx, &outy, &outz); dist = sqrt(MAX(0, square(x-outx) + square(y-outy) + square(z-outz))); printf("true/computed:\n" "hp: %d / %d\n" "dx: %.20g / %.20g\n" "dy: %.20g / %.20g\n" "x: %g / %g\n" "y: %g / %g\n" "z: %g / %g\n" "dist: %g\n\n", (int)hp, (int)outhp, dx, outdx, dy, outdy, x, outx, y, outy, z, outz, dist); if (dist > 1e-6) { double a, b; double outa, outb; a = xy2ra(x,y) / (2.0 * M_PI); b = z2dec(z) / (M_PI); outa = xy2ra(outx, outy) / (2.0 * M_PI); outb = z2dec(outz) / (M_PI); fprintf(stderr, "plot([%g, %g],[%g, %g],'r.-')\n", a, outa, b, outb); fprintf(stderr, "text(%g, %g, \"(%g,%g)\")\n", a, b, dx, dy); } CuAssertIntEquals(ct, 1, (dist < 1e-6)?1:0); return (dist > 1e-6); } void tEst_xyztohpf(CuTest* ct) { double dx, dy; int hp; int nside; double step = 0.1; double a, b; nside = 1; fprintf(stderr, "%s", "from pylab import plot,text,savefig,clf\n"); fprintf(stderr, "clf()\n"); /* 
Plot the grid of healpixes with dx,dy=step steps. */ step = 0.25; //for (hp=0; hp<12*nside*nside; hp++) { for (hp=0; hp<1*nside*nside; hp++) { double x,y,z; for (dx=0.0; dx<=1.05; dx+=step) { fprintf(stderr, "xp=[]\n"); fprintf(stderr, "yp=[]\n"); for (dy=0.0; dy<=1.05; dy+=step) { healpixl_to_xyz(hp, nside, dx, dy, &x, &y, &z); a = xy2ra(x,y) / (2.0 * M_PI); b = z2dec(z) / (M_PI); fprintf(stderr, "xp.append(%g)\n", a); fprintf(stderr, "yp.append(%g)\n", b); } fprintf(stderr, "plot(xp, yp, 'k-')\n"); } for (dy=0.0; dy<=1.05; dy+=step) { fprintf(stderr, "xp=[]\n"); fprintf(stderr, "yp=[]\n"); for (dx=0.0; dx<=1.0; dx+=step) { healpixl_to_xyz(hp, nside, dx, dy, &x, &y, &z); a = xy2ra(x,y) / (2.0 * M_PI); b = z2dec(z) / (M_PI); fprintf(stderr, "xp.append(%g)\n", a); fprintf(stderr, "yp.append(%g)\n", b); } fprintf(stderr, "plot(xp, yp, 'k-')\n"); } } step = 0.5; /* Plot places where the conversion screws up. */ for (hp=0; hp<12*nside*nside; hp++) { for (dx=0.0; dx<=1.01; dx+=step) { for (dy=0.0; dy<=1.01; dy+=step) { tst_xyztohpf(ct, hp, nside, dx, dy); } } } fprintf(stderr, "savefig('plot.png')\n"); } static void tst_neighbours(CuTest* ct, int pix, int* true_neigh, int true_nn, int Nside) { int neigh[8]; int nn; int i; for (i=0; i<8; i++) neigh[i] = -1; healpixl_get_neighbours(pix, neigh, Nside); /* printf("true(%i) : [ ", pix); for (i=0; i 2.0 * M_PI) ra -= 2.0 * M_PI; // find its healpix. hp = radec_to_healpixl(ra, dec, Nside); // find its neighbourhood. 
healpixl_get_neighbours(hp, neigh, Nside); fprintf(stderr, " N%i [ label=\"%i\", pos=\"%g,%g!\" ];\n", hp, hp, scale * ra/M_PI, scale * z); for (k=0; k<8; k++) { fprintf(stderr, " N%i -- N%i\n", hp, neigh[k]); } } void test_healpix_distance_to_radec(CuTest *ct) { double d; double rd[2]; d = healpix_distance_to_radec(4, 1, 0, 0, NULL); CuAssertDblEquals(ct, 0, d, 0); d = healpix_distance_to_radec(4, 1, 45, 0, NULL); CuAssertDblEquals(ct, 0, d, 0); d = healpix_distance_to_radec(4, 1, 45+1, 0, NULL); CuAssertDblEquals(ct, 1, d, 1e-9); d = healpix_distance_to_radec(4, 1, 45+1, 0+1, NULL); CuAssertDblEquals(ct, 1.414, d, 1e-3); d = healpix_distance_to_radec(4, 1, 45+10, 0, NULL); CuAssertDblEquals(ct, 10, d, 1e-9); // top corner d = healpix_distance_to_radec(4, 1, 0, rad2deg(asin(2.0/3.0)), NULL); CuAssertDblEquals(ct, 0, d, 1e-9); d = healpix_distance_to_radec(4, 1, 0, 1 + rad2deg(asin(2.0/3.0)), NULL); CuAssertDblEquals(ct, 1, d, 1e-9); d = healpix_distance_to_radec(4, 1, -45-10, -10, NULL); CuAssertDblEquals(ct, 14.106044, d, 1e-6); d = healpix_distance_to_radec(10, 1, 225, 5, NULL); CuAssertDblEquals(ct, 5, d, 1e-6); d = healpix_distance_to_radec(44, 2, 300, -50, NULL); CuAssertDblEquals(ct, 3.007643, d, 1e-6); d = healpix_distance_to_radec(45, 2, 310, -50, NULL); CuAssertDblEquals(ct, 1.873942, d, 1e-6); // south-polar hp, north pole. d = healpix_distance_to_radec(36, 2, 180, 90, NULL); // The hp corner is -41.8 deg; add 90. CuAssertDblEquals(ct, 131.810, d, 1e-3); // just south of equator to nearly across the sphere d = healpix_distance_to_radec(35, 2, 225, 20, NULL); // this one actually has the midpoint further than A and B. 
CuAssertDblEquals(ct, 158.189685, d, 1e-6); /* xyz[0] = -100.0; ra = dec = -1.0; d = healpix_distance_to_xyz(4, 1, 0, 0, xyz); xyzarr2radecdeg(xyz, &ra, &dec); CuAssertDblEquals(ct, 0, ra, 0); CuAssertDblEquals(ct, 0, dec, 0); */ rd[0] = rd[1] = -1.0; d = healpix_distance_to_radec(4, 1, 0, 0, rd); CuAssertDblEquals(ct, 0, rd[0], 0); CuAssertDblEquals(ct, 0, rd[1], 0); /* xyz[0] = -100.0; ra = dec = -1.0; d = healpix_distance_to_xyz(4, 1, 45, 0, xyz); xyzarr2radecdeg(xyz, &ra, &dec); CuAssertDblEquals(ct, 45, ra, 0); CuAssertDblEquals(ct, 0, dec, 0); */ rd[0] = rd[1] = -1.0; d = healpix_distance_to_radec(4, 1, 45+1, 0, rd); //CuAssertDblEquals(ct, 45, rd[0], 0); CuAssertDblEquals(ct, 45, rd[0], 1e-8); CuAssertDblEquals(ct, 0, rd[1], 1e-8); d = healpix_distance_to_radec(4, 1, 45+1, 0+1, rd); //CuAssertDblEquals(ct, 45, rd[0], 0); CuAssertDblEquals(ct, 45, rd[0], 1e-8); CuAssertDblEquals(ct, 0, rd[1], 0); // really?? d = healpix_distance_to_radec(4, 1, 20, 25, rd); CuAssertDblEquals(ct, d, 2.297298, 1e-6); CuAssertDblEquals(ct, 18.200995, rd[0], 1e-6); CuAssertDblEquals(ct, 23.392159, rd[1], 1e-6); } void test_healpix_neighbours(CuTest *ct) { int n0[] = { 1,3,2,71,69,143,90,91 }; int n5[] = { 26,27,7,6,4,94,95 }; int n13[] = { 30,31,15,14,12,6,7,27 }; int n15[] = { 31,47,63,61,14,12,13,30 }; int n30[] = { 31,15,13,7,27,25,28,29 }; int n101[] = { 32,34,103,102,100,174,175,122 }; int n127[] = { 58,37,36,126,124,125,56 }; int n64[] = { 65,67,66,183,181,138,139 }; int n133[] = { 80,82,135,134,132,152,154 }; int n148[] = { 149,151,150,147,145,162,168,170 }; int n160[] = { 161,163,162,145,144,128,176,178 }; int n24[] = { 25,27,26,95,93,87,18,19 }; int n42[] = { 43,23,21,111,109,40,41 }; int n59[] = { 62,45,39,37,58,56,57,60 }; int n191[] = { 74,48,117,116,190,188,189,72 }; int n190[] = { 191,117,116,113,187,185,188,189 }; int n186[] = { 187,113,112,165,164,184,185 }; int n184[] = { 185,187,186,165,164,161,178,179 }; // These were taken (IIRC) from the Healpix paper, so the 
healpix // numbers are all in the NESTED scheme. tst_nested(ct, 0, n0, sizeof(n0) /sizeof(int), 4); tst_nested(ct, 5, n5, sizeof(n5) /sizeof(int), 4); tst_nested(ct, 13, n13, sizeof(n13) /sizeof(int), 4); tst_nested(ct, 15, n15, sizeof(n15) /sizeof(int), 4); tst_nested(ct, 30, n30, sizeof(n30) /sizeof(int), 4); tst_nested(ct, 101, n101, sizeof(n101)/sizeof(int), 4); tst_nested(ct, 127, n127, sizeof(n127)/sizeof(int), 4); tst_nested(ct, 64, n64, sizeof(n64) /sizeof(int), 4); tst_nested(ct, 133, n133, sizeof(n133)/sizeof(int), 4); tst_nested(ct, 148, n148, sizeof(n148)/sizeof(int), 4); tst_nested(ct, 160, n160, sizeof(n160)/sizeof(int), 4); tst_nested(ct, 24, n24, sizeof(n24) /sizeof(int), 4); tst_nested(ct, 42, n42, sizeof(n42) /sizeof(int), 4); tst_nested(ct, 59, n59, sizeof(n59) /sizeof(int), 4); tst_nested(ct, 191, n191, sizeof(n191)/sizeof(int), 4); tst_nested(ct, 190, n190, sizeof(n190)/sizeof(int), 4); tst_nested(ct, 186, n186, sizeof(n186)/sizeof(int), 4); tst_nested(ct, 184, n184, sizeof(n184)/sizeof(int), 4); } /* void pnprime_to_xy(int, int*, int*, int); int xy_to_pnprime(int, int, int); void tst_healpix_pnprime_to_xy(CuTest *ct) { int px,py; pnprime_to_xy(6, &px, &py, 3); CuAssertIntEquals(ct, px, 2); CuAssertIntEquals(ct, py, 0); pnprime_to_xy(8, &px, &py, 3); CuAssertIntEquals(ct, px, 2); CuAssertIntEquals(ct, py, 2); pnprime_to_xy(0, &px, &py, 3); CuAssertIntEquals(ct, px, 0); CuAssertIntEquals(ct, py, 0); pnprime_to_xy(2, &px, &py, 3); CuAssertIntEquals(ct, px, 0); CuAssertIntEquals(ct, py, 2); pnprime_to_xy(4, &px, &py, 3); CuAssertIntEquals(ct, px, 1); CuAssertIntEquals(ct, py, 1); } void tst_healpix_xy_to_pnprime(CuTest *ct) { CuAssertIntEquals(ct, xy_to_pnprime(0,0,3), 0); CuAssertIntEquals(ct, xy_to_pnprime(1,0,3), 3); CuAssertIntEquals(ct, xy_to_pnprime(2,0,3), 6); CuAssertIntEquals(ct, xy_to_pnprime(0,1,3), 1); CuAssertIntEquals(ct, xy_to_pnprime(1,1,3), 4); CuAssertIntEquals(ct, xy_to_pnprime(2,1,3), 7); CuAssertIntEquals(ct, 
xy_to_pnprime(0,2,3), 2); CuAssertIntEquals(ct, xy_to_pnprime(1,2,3), 5); CuAssertIntEquals(ct, xy_to_pnprime(2,2,3), 8); } */ void print_test_healpix_output(int Nside) { int i, j; double z; double phi; fprintf(stderr, "graph Nside4 {\n"); // north polar for (i=1; i<=Nside; i++) { for (j=1; j<=(4*i); j++) { // find the center of the pixel in ring i // and longitude j. z = 1.0 - square((double)i / (double)Nside)/3.0; phi = M_PI / (2.0 * i) * ((double)j - 0.5); fprintf(stderr, " // North polar, i=%i, j=%i. z=%g, phi=%g\n", i, j, z, phi); print_node(z, phi, Nside); } } // south polar for (i=1; i<=Nside; i++) { for (j=1; j<=(4*i); j++) { z = 1.0 - square((double)i / (double)Nside)/3.0; z *= -1.0; phi = M_PI / (2.0 * i) * ((double)j - 0.5); fprintf(stderr, " // South polar, i=%i, j=%i. z=%g, phi=%g\n", i, j, z, phi); print_node(z, phi, Nside); } } // north equatorial for (i=Nside+1; i<=2*Nside; i++) { for (j=1; j<=(4*Nside); j++) { int s; z = 4.0/3.0 - 2.0 * i / (3.0 * Nside); s = (i - Nside + 1) % 2; s = (s + 2) % 2; phi = M_PI / (2.0 * Nside) * ((double)j - (double)s / 2.0); fprintf(stderr, " // North equatorial, i=%i, j=%i. z=%g, phi=%g, s=%i\n", i, j, z, phi, s); print_node(z, phi, Nside); } } // south equatorial for (i=Nside+1; i<2*Nside; i++) { for (j=1; j<=(4*Nside); j++) { int s; z = 4.0/3.0 - 2.0 * i / (3.0 * Nside); z *= -1.0; s = (i - Nside + 1) % 2; s = (s + 2) % 2; phi = M_PI / (2.0 * Nside) * ((double)j - s / 2.0); fprintf(stderr, " // South equatorial, i=%i, j=%i. 
z=%g, phi=%g, s=%i\n", i, j, z, phi, s); print_node(z, phi, Nside); } } fprintf(stderr, " node [ shape=point ]\n"); fprintf(stderr, " C0 [ pos=\"0,-10!\" ];\n"); fprintf(stderr, " C1 [ pos=\"20,-10!\" ];\n"); fprintf(stderr, " C2 [ pos=\"20,10!\" ];\n"); fprintf(stderr, " C3 [ pos=\"0,10!\" ];\n"); fprintf(stderr, " C0 -- C1 -- C2 -- C3 -- C0\n"); fprintf(stderr, "}\n"); } void print_healpix_grid(int Nside) { int i; int j; int N = 500; fprintf(stderr, "x%i=[", Nside); for (i=0; i= 0.0); CuAssert(ct, "dx", dx <= 1.0); CuAssert(ct, "dy", dy >= 0.0); CuAssert(ct, "dy", dy <= 1.0); healpixl_to_radecdeg(hp, Nside, dx, dy, &ra2, &dec2); CuAssertDblEquals(ct, ra1, ra2, arcsec2deg(1e-10)); CuAssertDblEquals(ct, dec1, dec2, arcsec2deg(1e-10)); printf("RA,Dec difference: %g, %g arcsec\n", deg2arcsec(ra2-ra1), deg2arcsec(dec2-dec1)); } void test_distortion_at_pole(CuTest* ct) { // not really a test of the code, more of healpix itself... double ra1, dec1, ra2, dec2, ra3, dec3, ra4, dec4; int Nside = 2097152; int64_t hp; double d1, d2, d3, d4, d5, d6; double testras [] = { 0.0, 45.0, 0.0, 0.0 }; double testdecs[] = { 90.0, 50.0, 40.0, 0.0 }; char* testnames[] = { "north pole", "mid-polar", "mid-equatorial", "equator" }; double ra, dec; int i; for (i=0; ibuffer); /* print_healpix_grid(1); print_healpix_grid(2); print_healpix_grid(3); print_healpix_grid(4); print_healpix_grid(5); */ //print_test_healpix_output(); /* int rastep, decstep; int Nra = 100; int Ndec = 100; double ra, dec; int healpix; printf("radechealpix=zeros(%i,3);\n", Nra*Ndec); for (rastep=0; rastep #endif /* Solaris --------------------------------------------------------*/ /* --------ignoring SunOS ieee_flags approach, someone else can ** deal with that! 
*/ #if defined(sun) || defined(__BSD__) || defined(__OpenBSD__) || \ (defined(__FreeBSD__) && (__FreeBSD_version < 502114)) || \ defined(__NetBSD__) #include static void _npy_set_floatstatus_invalid(void) { fpsetsticky(FP_X_INV); } #elif defined(_AIX) #include #include static void _npy_set_floatstatus_invalid(void) { fp_raise_xcp(FP_INVALID); } #elif defined(_MSC_VER) || (defined(__osf__) && defined(__alpha)) /* * By using a volatile floating point value, * the compiler is forced to actually do the requested * operations because of potential concurrency. * * We shouldn't write multiple values to a single * global here, because that would cause * a race condition. */ static volatile double _npy_floatstatus_x, _npy_floatstatus_inf; static void _npy_set_floatstatus_invalid(void) { _npy_floatstatus_inf = NPY_INFINITY; _npy_floatstatus_x = _npy_floatstatus_inf - NPY_INFINITY; } #else /* General GCC code, should work on most platforms */ # include static void _npy_set_floatstatus_invalid(void) { feraiseexcept(FE_INVALID); } #endif ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574696080.6457813 astropy-healpix-0.5/docs/0000755000077000000240000000000000000000000015356 5ustar00tomstaff00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1526664953.0 astropy-healpix-0.5/docs/Makefile0000644000077000000240000001074500000000000017025 0ustar00tomstaff00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest #This is needed with git because git doesn't create a dir if it's empty $(shell [ -d "_static" ] || mkdir -p _static) help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" clean: -rm -rf $(BUILDDIR) -rm -rf api -rm -rf generated html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." 
htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Astropy.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Astropy.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/Astropy" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Astropy" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." make -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." 
linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	@echo "Run 'python setup.py test' in the root directory to run doctests " \
	      "in the documentation."

astropy-healpix-0.5/docs/_templates/autosummary/base.rst

{% extends "autosummary_core/base.rst" %}
{# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}

astropy-healpix-0.5/docs/_templates/autosummary/class.rst

{% extends "autosummary_core/class.rst" %}
{# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template.
#}././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1526664953.0 astropy-healpix-0.5/docs/_templates/autosummary/module.rst0000644000077000000240000000037400000000000024124 0ustar00tomstaff00000000000000{% extends "autosummary_core/module.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1526665132.274801 astropy-healpix-0.5/docs/about.rst0000644000077000000240000000260500000000000017225 0ustar00tomstaff00000000000000:orphan: .. include:: references.txt .. _about: ****************** About this package ****************** This is a BSD-licensed Python package for HEALPix, which is based on the C HEALPix code written by Dustin Lang originally in `astrometry.net `_, and was added here with a Cython wrapper and expanded with a Python interface. The `healpy `__ package that is a wrapper around the `HEALPix `__ C++ library has existed for a long time. So why this re-write? The main motivation is that the original HEALPIX/healpy packages are GPL-licensed, which is incompatible with the BSD license used by Astropy and most Astropy-affiliated and scientific Python (Numpy, Scipy, ...) package, and HEALPix/healpy will not be relicensed (see `here `__). In addition, the present package doesn't have a big C++ package as a dependency, just a little C and Cython code, which makes it easy to install everywhere -- we support Linux, MacOS X, and Windows. However, this package is intended to be lightweight and we do not plan to fully implement everything that healpy and the original HEALPix library support (such as spherical harmonics). Note that code contributions to this package can't be derived from the HEALPix package or healpy due to licensing reasons (see above). 
astropy-healpix-0.5/docs/api.rst

Reference/API
=============

.. automodapi:: astropy_healpix
    :no-inheritance-diagram:
    :inherited-members:
    :no-main-docstr:

.. automodapi:: astropy_healpix.healpy
    :no-inheritance-diagram:

astropy-healpix-0.5/docs/boundaries.rst

Pixel corners and edges
=======================

In some cases, you may need to find out the longitude/latitude or celestial coordinates of the corners or edges of HEALPix pixels. The :meth:`~astropy_healpix.HEALPix.boundaries_lonlat` method can be used to sample points along the edge of one or more HEALPix pixels::

    >>> from astropy_healpix import HEALPix
    >>> hp = HEALPix(nside=16, order='nested')
    >>> hp.boundaries_lonlat([120], step=1)  # doctest: +FLOAT_CMP
    (, )

This method takes a ``step`` argument which specifies how many points to sample along each edge. Setting ``step`` to 1 returns the corner positions, while setting e.g. 2 returns the corners and points along the middle of each edge, and larger values can be used to get the precise curved edges of the pixels. The following example shows the difference between the boundary constructed from just the corners (in red) and a much higher-resolution boundary computed with 100 steps on each side (in black):

.. plot::
    :include-source:

    import numpy as np
    from astropy import units as u
    import matplotlib.pyplot as plt
    from matplotlib.patches import Polygon
    from astropy_healpix.core import boundaries_lonlat

    ax = plt.subplot(1, 1, 1)

    for step, color in [(1, 'red'), (100, 'black')]:
        lon, lat = boundaries_lonlat([7], nside=1, step=step)
        lon = lon.to(u.deg).value
        lat = lat.to(u.deg).value
        vertices = np.vstack([lon.ravel(), lat.ravel()]).transpose()
        p = Polygon(vertices, closed=True, edgecolor=color, facecolor='none')
        ax.add_patch(p)

    plt.xlim(210, 330)
    plt.ylim(-50, 50)

As for other methods, the :class:`~astropy_healpix.HEALPix` class has an equivalent :meth:`~astropy_healpix.HEALPix.boundaries_skycoord` method that can return the celestial coordinates of the boundaries as a :class:`~astropy.coordinates.SkyCoord` object if the ``frame`` is set::

    >>> from astropy.coordinates import Galactic
    >>> hp = HEALPix(nside=16, order='nested', frame=Galactic())
    >>> hp.boundaries_skycoord([120], step=1)  # doctest: +FLOAT_CMP

astropy-healpix-0.5/docs/cone_search.rst

Searching for pixels around a position (cone search)
====================================================

A common operation when using HEALPix maps is to try and find all pixels that lie within a certain radius of a given longitude/latitude. One way to do this would be to simply find the longitude/latitude of all pixels in the HEALPix map, then find the spherical distance to the requested longitude and latitude, but in practice this would be very inefficient for high resolution HEALPix maps where the number of pixels may become arbitrarily large.
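The brute-force approach described above can be sketched with plain Numpy. The separation formula is standard spherical trigonometry, and the randomly placed points merely stand in for pixel centres of a real map; this is an illustration of why the naive method scales with the number of pixels, not how the library implements its search:

```python
import numpy as np

def angular_separation(lon1, lat1, lon2, lat2):
    """Angular separation between two points on the sphere (all in radians)."""
    sdlon = np.sin(lon2 - lon1)
    cdlon = np.cos(lon2 - lon1)
    num1 = np.cos(lat2) * sdlon
    num2 = np.cos(lat1) * np.sin(lat2) - np.sin(lat1) * np.cos(lat2) * cdlon
    denom = np.sin(lat1) * np.sin(lat2) + np.cos(lat1) * np.cos(lat2) * cdlon
    return np.arctan2(np.hypot(num1, num2), denom)

# Brute-force "cone search": test every pixel centre against the radius.
# lon_pix/lat_pix are stand-ins for the centres of all pixels in a map.
rng = np.random.default_rng(0)
lon_pix = rng.uniform(0, 2 * np.pi, 3072)
lat_pix = np.arcsin(rng.uniform(-1, 1, 3072))
sep = angular_separation(lon_pix, lat_pix, np.radians(10), np.radians(30))
inside = np.nonzero(sep < np.radians(10))[0]
```

Every pixel centre is visited once, so the cost grows linearly with the map size, which is exactly what the dedicated cone-search method avoids.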
Instead, the :meth:`~astropy_healpix.HEALPix.cone_search_lonlat` method can be used to efficiently find all HEALPix pixels within a certain radius of a longitude/latitude::

    >>> from astropy import units as u
    >>> from astropy_healpix import HEALPix
    >>> hp = HEALPix(nside=16, order='nested')
    >>> print(hp.cone_search_lonlat(10 * u.deg, 30 * u.deg, radius=10 * u.deg))
    [1269 160 162 1271 1270 1268 1246 1247 138 139 161 1245 136 137 140 142 130
     131 1239 1244 1238 1241 1243 1265 1267 1276 1273 1277 168 169 163 166 164]

Likewise, if a celestial frame was specified using the ``frame`` keyword argument to :class:`~astropy_healpix.HEALPix`, you can use the :meth:`~astropy_healpix.HEALPix.cone_search_skycoord` method to query around specific celestial coordinates::

    >>> from astropy.coordinates import Galactic
    >>> hp = HEALPix(nside=16, order='nested', frame=Galactic())
    >>> from astropy.coordinates import SkyCoord
    >>> coord = SkyCoord('00h42m44.3503s +41d16m08.634s')
    >>> print(hp.cone_search_skycoord(coord, radius=5 * u.arcmin))
    [2537]

astropy-healpix-0.5/docs/conf.py

# -*- coding: utf-8 -*-
# Licensed under a 3-clause BSD style license - see LICENSE.rst
#
# Astropy documentation build configuration file.
#
# This file is execfile()d with the current directory set to its containing dir.
#
# Note that not all possible configuration values are present in this file.
#
# All configuration values have a default. Some values are defined in
# the global Astropy configuration which is loaded here before anything else.
# See astropy.sphinx.conf for which values are set there.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here.
If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. # sys.path.insert(0, os.path.abspath('..')) # IMPORTANT: the above commented section was generated by sphinx-quickstart, but # is *NOT* appropriate for astropy or Astropy affiliated packages. It is left # commented out with this explanation to make it clear why this should not be # done. If the sys.path entry above is added, when the astropy.sphinx.conf # import occurs, it will import the *source* version of astropy instead of the # version installed (if invoked as "make html" or directly with sphinx), or the # version in the build directory (if "python setup.py build_sphinx" is used). # Thus, any C-extensions that are needed to build the documentation will *not* # be accessible, and the documentation will not build correctly. import os import sys import datetime from importlib import import_module try: from sphinx_astropy.conf.v1 import * # noqa except ImportError: print('ERROR: the documentation requires the sphinx-astropy package to be installed') sys.exit(1) # Get configuration information from setup.cfg from configparser import ConfigParser conf = ConfigParser() conf.read([os.path.join(os.path.dirname(__file__), '..', 'setup.cfg')]) setup_cfg = dict(conf.items('metadata')) # -- General configuration ---------------------------------------------------- # By default, highlight as Python 3. highlight_language = 'python3' # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.2' # To perform a Sphinx version check that needs to be more specific than # major.minor, call `check_sphinx_version("x.y.z")` here. # check_sphinx_version("1.2.1") # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns.append('_templates') # This is added to the end of RST files - a good place to put substitutions to # be used globally. 
rst_epilog += """ """ # -- Project information ------------------------------------------------------ # This does not *have* to match the package name, but typically does project = setup_cfg['name'] author = setup_cfg['author'] copyright = '{0}, {1}'.format( datetime.datetime.now().year, setup_cfg['author']) # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. import_module(setup_cfg['name']) package = sys.modules[setup_cfg['name']] # The short X.Y version. version = package.__version__.split('-', 1)[0] # The full version, including alpha/beta/rc tags. release = package.__version__ # -- Options for HTML output -------------------------------------------------- # A NOTE ON HTML THEMES # The global astropy configuration uses a custom theme, 'bootstrap-astropy', # which is installed along with astropy. A different theme can be used or # the options for this theme can be modified by overriding some of the # variables set in the global configuration. The variables set in the # global configuration are listed below, commented out. # Add any paths that contain custom themes here, relative to this directory. # To use a different custom theme, add the directory containing the theme. #html_theme_path = [] # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. To override the custom theme, set this to the # name of a builtin theme or the name of a custom theme in html_theme_path. #html_theme = None html_theme_options = { 'logotext1': 'astropy', # white, semi-bold 'logotext2': '-healpix', # orange, light 'logotext3': ':docs' # white, light } # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # The name of an image file (relative to this directory) to place at the top # of the sidebar. 
#html_logo = '' # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = '' # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '' # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". html_title = '{0} v{1}'.format(project, release) # Output file base name for HTML help builder. htmlhelp_basename = project + 'doc' # -- Options for LaTeX output ------------------------------------------------- # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [('index', project + '.tex', project + u' Documentation', author, 'manual')] # -- Options for manual page output ------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). 
man_pages = [('index', project.lower(), project + u' Documentation', [author], 1)] # -- Options for the edit_on_github extension --------------------------------- if eval(setup_cfg.get('edit_on_github')): extensions += ['sphinx_astropy.ext.edit_on_github'] versionmod = import_module(setup_cfg['name'] + '.version') edit_on_github_project = setup_cfg['github_project'] if versionmod.release: edit_on_github_branch = "v" + versionmod.version else: edit_on_github_branch = "master" edit_on_github_source_root = "" edit_on_github_doc_root = "docs" # -- Resolving issue number to links in changelog ----------------------------- github_issues_url = 'https://github.com/{0}/issues/'.format(setup_cfg['github_project']) # -- Turn on nitpicky mode for sphinx (to warn about references not found) ---- # # nitpicky = True # nitpick_ignore = [] # # Some warnings are impossible to suppress, and you can list specific references # that should be ignored in a nitpick-exceptions file which should be inside # the docs/ directory. 
The format of the file should be: # # # # for example: # # py:class astropy.io.votable.tree.Element # py:class astropy.io.votable.tree.SimpleElement # py:class astropy.io.votable.tree.SimpleElementWithContent # # Uncomment the following lines to enable the exceptions: # # for line in open('nitpick-exceptions'): # if line.strip() == "" or line.startswith("#"): # continue # dtype, target = line.split(None, 1) # target = target.strip() # nitpick_ignore.append((dtype, six.u(target))) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1537003272.8993323 astropy-healpix-0.5/docs/coordinates.rst0000644000077000000240000001064100000000000020424 0ustar00tomstaff00000000000000Coordinate conversions ====================== Converting between pixel indices and spherical coordinates ---------------------------------------------------------- As described in :doc:`getting_started`, coordinates in a HEALPix pixellization can follow either the 'ring' or 'nested' convention. 
Let's start by setting up an example pixellization:: >>> from astropy_healpix import HEALPix >>> hp = HEALPix(nside=16, order='nested') The :meth:`~astropy_healpix.HEALPix.healpix_to_lonlat` method can be used to convert HEALPix indices to :class:`~astropy.coordinates.Longitude` and :class:`~astropy.coordinates.Latitude` objects:: >>> lon, lat = hp.healpix_to_lonlat([1, 442, 2200]) >>> lon # doctest: +FLOAT_CMP >>> lat # doctest: +FLOAT_CMP The :class:`~astropy.coordinates.Longitude` and :class:`~astropy.coordinates.Latitude` objects are fully-fledged :class:`~astropy.units.Quantity` objects and also include shortcuts to get the values in various units:: >>> lon.hourangle # doctest: +FLOAT_CMP array([ 3.1875, 6.25 , 1.8 ]) >>> lat.degree # doctest: +FLOAT_CMP array([ 4.78019185, 54.3409123 , -44.99388015]) Conversely, given longitudes and latitudes as :class:`~astropy.units.Quantity` objects, it is possible to recover HEALPix pixel indices:: >>> from astropy import units as u >>> print(hp.lonlat_to_healpix([1, 3, 4] * u.deg, [5, 6, 9] * u.deg)) [1217 1217 1222] In these examples, what is being converted is the position of the center of each pixel. In fact, the :meth:`~astropy_healpix.HEALPix.lonlat_to_healpix` method can also take or give the fractional position inside each HEALPix pixel, e.g.:: >>> index, dx, dy = hp.lonlat_to_healpix([1, 3, 4] * u.deg, [5, 6, 9] * u.deg, ... return_offsets=True) >>> print(index) [1217 1217 1222] >>> dx # doctest: +FLOAT_CMP array([ 0.22364669, 0.78767489, 0.58832469]) >>> dy # doctest: +FLOAT_CMP array([ 0.86809114, 0.72100823, 0.16610247]) and the :meth:`~astropy_healpix.HEALPix.healpix_to_lonlat` method can take offset positions - for example we can use this to find the position of the corners of a given pixel:: >>> dx = [0., 1., 1., 0.] >>> dy = [0., 0., 1., 1.] >>> lon, lat = hp.healpix_to_lonlat([133, 133, 133, 133], dx=dx, dy=dy) >>> lon # doctest: +FLOAT_CMP >>> lat # doctest: +FLOAT_CMP .. 
_celestial: Celestial coordinates --------------------- For cases where the HEALPix pixellization is of the celestial sphere, a ``frame`` argument can be passed to :class:`~astropy_healpix.HEALPix`. This argument should specify the celestial frame (using an `astropy.coordinates `_ frame) in which the HEALPix pixellization is defined:: >>> from astropy_healpix import HEALPix >>> from astropy.coordinates import Galactic >>> hp = HEALPix(nside=16, order='nested', frame=Galactic()) Each method defined in :class:`~astropy_healpix.HEALPix` and ending in ``lonlat`` has an equivalent method ending in ``skycoord`` which can be used if the frame is set. For example, to convert from HEALPix indices to celestial coordinates, you can use the :meth:`~astropy_healpix.HEALPix.healpix_to_skycoord` method:: >>> hp.healpix_to_skycoord([144, 231]) # doctest: +FLOAT_CMP and to convert from celestial coordinates to HEALPix indices you can use the :meth:`~astropy_healpix.HEALPix.skycoord_to_healpix` method, e.g:: >>> from astropy.coordinates import SkyCoord >>> coord = SkyCoord('00h42m44.3503s +41d16m08.634s') >>> hp.skycoord_to_healpix(coord) 2537 Converting between ring and nested conventions ---------------------------------------------- The :class:`~astropy_healpix.HEALPix` class has methods that can be used to convert HEALPix pixel indices between the ring and nested convention. These are :meth:`~astropy_healpix.HEALPix.nested_to_ring`:: >>> print(hp.nested_to_ring([30])) [873] and :meth:`~astropy_healpix.HEALPix.ring_to_nested`:: >>> print(hp.ring_to_nested([1, 2, 3])) [ 511 767 1023] ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1526665132.2785413 astropy-healpix-0.5/docs/getting_started.rst0000644000077000000240000000466600000000000021313 0ustar00tomstaff00000000000000.. include:: references.txt .. _using: Getting started =============== .. 
_using-intro: The cleanest way to use the functionality in **healpix** is to make use of the high-level :class:`~astropy_healpix.HEALPix` class. The :class:`~astropy_healpix.HEALPix` class should be initialized with the ``nside`` parameter which controls the resolution of the pixellization - it is the number of pixels on the side of each of the 12 top-level HEALPix pixels:: >>> from astropy_healpix import HEALPix >>> hp = HEALPix(nside=16) As described in the references above, HEALPix pixel indices can follow two different ordering conventions - the *nested* convention and the *ring* convention. By default, the :class:`~astropy_healpix.HEALPix` class assumes the ring ordering convention, but it is possible to explicitly specify the convention to use using the ``order`` argument, for example:: >>> hp = HEALPix(nside=16, order='ring') or:: >>> hp = HEALPix(nside=16, order='nested') Once this class has been set up, you can access various properties and methods related to the HEALPix pixellization. For example, you can calculate the number of pixels as well as the pixel area or resolution:: >>> hp.npix 3072 >>> hp.pixel_area # doctest: +FLOAT_CMP >>> hp.pixel_resolution # doctest: +FLOAT_CMP As you can see, when appropriate the properties and the methods on the :class:`~astropy_healpix.HEALPix` class return Astropy high-level classes such as :class:`~astropy.units.Quantity`, :class:`~astropy.coordinates.Longitude`, and so on. For example, the :meth:`~astropy_healpix.HEALPix.healpix_to_lonlat` method can be used to convert HEALPix indices to :class:`~astropy.coordinates.Longitude` and :class:`~astropy.coordinates.Latitude` objects:: >>> lon, lat = hp.healpix_to_lonlat([1, 442, 2200]) >>> lon # doctest: +FLOAT_CMP >>> lat # doctest: +FLOAT_CMP The :class:`~astropy_healpix.HEALPix` class includes methods that take or return :class:`~astropy.coordinates.SkyCoord` objects (we will take a look at this in the :ref:`celestial` section). 
In the subsequent sections of the documentation, we will take a closer look at converting between coordinate systems, as well as more advanced features such as interpolation and cone searches.

astropy-healpix-0.5/docs/healpy_compat.rst

Healpy-compatible interface
===========================

In addition to the above high- and low-level interfaces, we have provided a `healpy `_-compatible interface in :mod:`astropy_healpix.healpy`. Note that this only includes a subset of the healpy functions. This is not the recommended interface, and is only provided as a convenience for packages that want to support both healpy and this package.

Example
-------

As an example, the :func:`~astropy_healpix.healpy.pix2ang` function can be used to get the angular coordinates (colatitude and longitude, in radians) of a given HEALPix pixel (by default using the 'ring' convention)::

    >>> from astropy_healpix.healpy import pix2ang
    >>> pix2ang(16, [100, 120])
    (array([ 0.35914432, 0.41113786]), array([ 3.70259134, 1.6689711 ]))

which agrees exactly with the healpy function::

.. doctest-requires:: healpy

    >>> from healpy import pix2ang
    >>> pix2ang(16, [100, 120])
    (array([ 0.35914432, 0.41113786]), array([ 3.70259134, 1.6689711 ]))

Migrate
-------

To migrate a script or package from ``healpy`` to this package, a quick way to check whether the required functionality is available is to change all::

    import healpy as hp

to::

    from astropy_healpix import healpy as hp

and see what's missing or breaks. Please file issues or feature requests!

As mentioned above, we recommend that when you actually make the change, you use the main API of this package instead of the ``healpy``-compatible interface.
././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1574695438.1369023 astropy-healpix-0.5/docs/index.rst0000644000077000000240000000330700000000000017222 0ustar00tomstaff00000000000000.. include:: references.txt .. warning:: This **astropy-healpix** package is in an early stage of development. It should not be considered feature complete or API stable. Feedback and contributions welcome! What is HEALPix? ================ `HEALPix `_ (Hierarchical Equal Area isoLatitude Pixelisation) is an algorithm for pixellizing a sphere that is sometimes used in Astronomy to store data from all-sky surveys, but the general algorithm can apply to any field that has to deal with representing data on a sphere. More information about the HEALPix algorithm can be found here: * http://healpix.jpl.nasa.gov/ * http://adsabs.harvard.edu/abs/2005ApJ...622..759G * http://adsabs.harvard.edu/abs/2007MNRAS.381..865C About this package ================== **astropy-healpix** is a new BSD-licensed implementation that is separate from the original GPL-licensed `HEALPix library `_ and associated `healpy `__ Python wrapper. See :ref:`about` for further information about the difference between this new implementation and the original libraries. The code can be found on `GitHub `__, along with the list of `Contributors `__. User documentation ================== .. toctree:: :maxdepth: 1 installation getting_started coordinates boundaries cone_search interpolation performance healpy_compat api Version history =============== For a list of changes in each version, see the `CHANGES.rst `_ file. ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1574433289.514666 astropy-healpix-0.5/docs/installation.rst0000644000077000000240000000676600000000000020630 0ustar00tomstaff00000000000000.. include:: references.txt .. doctest-skip-all .. 
_install: ************ Installation ************ Dependencies ============ Required dependencies --------------------- The **astropy-healpix** package works with Python 3.6 and later (on Linux, MacOS and Windows), and requires the following dependencies: * `Numpy `__ 1.11 or later * `Astropy `__ 2.0 or later If you use :ref:`pip` or :ref:`conda`, these will be installed automatically. Optional dependencies --------------------- The following packages are optional dependencies, which can be installed if needed: * `pytest `__ for testing * `healpy `__ for testing (but this is not required and the tests that require healpy will be skipped if healpy is not installed) * `hypothesis `__ for the healpy-related tests. Stable version ============== Installing the latest stable version is possible either using pip or conda. .. _pip: Using pip --------- To install **astropy-healpix** with `pip `__ from `PyPI `__ simply run:: pip install --no-deps astropy-healpix .. note:: The ``--no-deps`` flag is optional, but highly recommended if you already have Numpy installed, since otherwise pip will sometimes try to "help" you by upgrading your Numpy installation, which may not always be desired. .. _conda: Using conda ----------- To install healpix with `Anaconda `_ from the `conda-forge channel on anaconda.org `__ simply run:: conda install -c conda-forge astropy-healpix Testing installation -------------------- To check that you have this package installed and which version you're using, start Python and execute the following code: .. code-block:: bash $ python Python 3.6.2 |Continuum Analytics, Inc.| (default, Jul 20 2017, 13:14:59) [GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import astropy_healpix >>> astropy_healpix.__version__ 0.1 To make sure that all functionality is working OK on your system, you can run the automated tests of this package by executing the ``test`` function: .. 
code-block:: bash

    python -c 'import astropy_healpix; astropy_healpix.test()'

Development version
===================

Install the latest development version from https://github.com/astropy/astropy-healpix :

.. code-block:: bash

    git clone https://github.com/astropy/astropy-healpix
    cd astropy-healpix
    pip install .

Contributing
============

This section contains some tips on how to hack on **astropy-healpix**.

One quick way to get a Python environment with everything needed to work on ``astropy-healpix`` (code, run tests, build docs) is like this:

.. code-block:: bash

    git clone https://github.com/astropy/astropy-healpix
    cd astropy-healpix
    conda env create -f environment-dev.yml
    conda activate astropy-healpix

Run this command to do an in-place build and put this local version on your Python ``sys.path``::

    python setup.py develop

To run the tests, use ``pytest`` directly::

    python -m pytest -v astropy_healpix

To build the docs, use this command::

    python setup.py build_docs
    open docs/_build/html/index.html

If you have any questions, just open an issue on GitHub and we'll help.

astropy-healpix-0.5/docs/interpolation.rst

Interpolating values from a HEALPix map
=======================================

Main methods
------------

While all the functionality we have seen so far in this documentation is concerned with the geometry of the HEALPix pixellization, the main purpose of HEALPix is to actually tabulate values in each pixel to represent a physical quantity over a sphere (e.g. flux over the celestial sphere). We will refer to this as a HEALPix map.

These maps are stored using a 1-dimensional vector with as many elements as pixels in the HEALPix pixellization, and either in the 'ring' or 'nested' order.
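Concretely, a HEALPix map at a given ``nside`` is nothing more than a flat array with ``12 * nside**2`` elements, one per pixel. The following toy, all-zeros map is purely for illustration (real maps are usually read from FITS files):

```python
import numpy as np

nside = 16
npix = 12 * nside ** 2   # every HEALPix grid has exactly 12 * nside**2 pixels
healpix_map = np.zeros(npix)

# 'ring' and 'nested' maps hold the same kind of flat array;
# they only differ in how pixel indices map onto the sphere.
print(npix)  # 3072
```

For ``nside=16`` this matches the ``hp.npix`` value of 3072 seen earlier in the documentation.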
If you are interested in finding the value in a HEALPix map at a given longitude/latitude on the sphere, there are two main options:

* Convert the longitude/latitude to the HEALPix pixel that the position falls inside (e.g. ``index``) using :meth:`~astropy_healpix.HEALPix.lonlat_to_healpix` or :meth:`~astropy_healpix.HEALPix.skycoord_to_healpix`, and extract the value of the array of map values at that index (e.g. ``values[index]``). This is essentially equivalent to a nearest-neighbour interpolation.

* Convert the longitude/latitude to the HEALPix pixel that the position falls inside, then find the other neighbouring pixels and carry out a bilinear interpolation. This is trickier to do by hand, so we provide the :meth:`~astropy_healpix.HEALPix.interpolate_bilinear_lonlat` and :meth:`~astropy_healpix.HEALPix.interpolate_bilinear_skycoord` methods to facilitate this.

If you are not already familiar with how to access HEALPix data from FITS files, we have provided a `Full example`_ in the following section.

Full example
------------

To illustrate this, we use an example map from the `WMAP mission `__, specifically the map **K Band Map for the Full Five Years**. We start off by downloading and opening this map with Astropy::

    >>> from astropy.io import fits
    >>> hdulist = fits.open('https://lambda.gsfc.nasa.gov/data/map/dr3/skymaps/5yr//wmap_band_imap_r9_5yr_K_v3.fits')  # doctest: +REMOTE_DATA
    Downloading https://lambda.gsfc.nasa.gov/data/map/dr3/skymaps/5yr//wmap_band_imap_r9_5yr_K_v3.fits [Done]
    >>> hdulist.info()  # doctest: +REMOTE_DATA
    Filename: ...
    No. Name Ver Type Cards Dimensions Format
    0 PRIMARY 1 PrimaryHDU 19 ()
    1 Archive Map Table 1 BinTableHDU 20 3145728R x 2C [E, E]

Since HEALPix maps are stored in tabular form, the data is contained in HDU 1 (primary HDUs cannot contain tabular data).
Let's now take a look at the header::

    >>> hdulist[1].header  # doctest: +REMOTE_DATA
    XTENSION= 'BINTABLE'           /binary table extension
    BITPIX  =                    8 /8-bit bytes
    NAXIS   =                    2 /2-dimensional binary table
    NAXIS1  =                    8 /width of table in bytes
    NAXIS2  =              3145728 /number of rows in table
    PCOUNT  =                    0 /size of special data area
    GCOUNT  =                    1 /one data group (required keyword)
    TFIELDS =                    2 /number of fields in each row
    TTYPE1  = 'TEMPERATURE '       /label for field 1
    TFORM1  = 'E       '           /data format of field: 4-byte REAL
    TUNIT1  = 'mK      '           /physical unit of field 1
    TTYPE2  = 'N_OBS   '           /label for field 2
    TFORM2  = 'E       '           /data format of field: 4-byte REAL
    TUNIT2  = 'counts  '           /physical unit of field 2
    EXTNAME = 'Archive Map Table'  /name of this binary table extension
    PIXTYPE = 'HEALPIX '           /Pixel algorigthm
    ORDERING= 'NESTED  '           /Ordering scheme
    NSIDE   =                  512 /Resolution parameter
    FIRSTPIX=                    0 /First pixel (0 based)
    LASTPIX =              3145727 /Last pixel (0 based)

Of particular interest to us are the ``NSIDE`` and ``ORDERING`` keywords::

    >>> hdulist[1].header['NSIDE']  # doctest: +REMOTE_DATA
    512
    >>> hdulist[1].header['ORDERING']  # doctest: +REMOTE_DATA
    'NESTED'

The data itself can be accessed using::

    >>> hdulist[1].data['TEMPERATURE']  # doctest: +REMOTE_DATA
    array([ 16.28499985,  16.8025322 ,  15.32036781, ...,  15.0780201 ,
            15.36229229,  15.23281574], dtype=float32)

The last piece of information we need is that the map is in Galactic
coordinates, which is unfortunately not encoded in the header but is stated
in the map's documentation on the LAMBDA website.
We can now instantiate a :class:`~astropy_healpix.HEALPix` object::

    >>> from astropy_healpix import HEALPix
    >>> from astropy.coordinates import Galactic
    >>> nside = hdulist[1].header['NSIDE']  # doctest: +REMOTE_DATA
    >>> order = hdulist[1].header['ORDERING']  # doctest: +REMOTE_DATA
    >>> hp = HEALPix(nside=nside, order=order, frame=Galactic())  # doctest: +REMOTE_DATA

and we can now use
:meth:`~astropy_healpix.HEALPix.interpolate_bilinear_skycoord` to interpolate
the temperature at a given position on the sky::

    >>> from astropy.coordinates import SkyCoord
    >>> coord = SkyCoord('00h42m44.3503s +41d16m08.634s', frame='icrs')
    >>> temperature = hdulist[1].data['temperature']  # doctest: +REMOTE_DATA
    >>> hp.interpolate_bilinear_skycoord(coord, temperature)  # doctest: +FLOAT_CMP +REMOTE_DATA
    array([ 0.41296058])

Here is a full example that uses this to make a map of a section of the sky:

.. plot::
   :include-source:

    # Get the data
    from astropy.io import fits
    hdulist = fits.open('https://lambda.gsfc.nasa.gov/data/map/dr3/skymaps/5yr//wmap_band_imap_r9_5yr_K_v3.fits')

    # Set up the HEALPix projection
    from astropy_healpix import HEALPix
    from astropy.coordinates import Galactic
    nside = hdulist[1].header['NSIDE']
    order = hdulist[1].header['ORDERING']
    hp = HEALPix(nside=nside, order=order, frame=Galactic())

    # Sample a 300x200 grid in RA/Dec
    import numpy as np
    from astropy import units as u
    ra = np.linspace(-15., 15., 300) * u.deg
    dec = np.linspace(-10., 10., 200) * u.deg
    ra_grid, dec_grid = np.meshgrid(ra, dec)

    # Set up Astropy coordinate objects
    from astropy.coordinates import SkyCoord
    coords = SkyCoord(ra_grid.ravel(), dec_grid.ravel(), frame='icrs')

    # Interpolate values
    temperature = hdulist[1].data['temperature']
    tmap = hp.interpolate_bilinear_skycoord(coords, temperature)
    tmap = tmap.reshape((200, 300))

    # Make a plot of the interpolated temperatures
    import matplotlib.pyplot as plt
    plt.figure(figsize=(9, 5))
    im = plt.imshow(tmap, extent=[-1, 1, -10, 10],
                    cmap=plt.cm.RdYlBu, aspect='auto')
    plt.colorbar(im)
    plt.xlabel('Right ascension (ICRS)')
    plt.ylabel('Declination (ICRS)')
    plt.show()

In practice, for the common case of reprojecting a HEALPix map to a regular
gridded image, you can use the **reproject** package, which provides
high-level reprojection functions that use **astropy-healpix** behind the
scenes.

---- astropy-healpix-0.5/docs/make.bat ----

@ECHO OFF

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
    set SPHINXBUILD=sphinx-build
)
set BUILDDIR=_build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
if NOT "%PAPER%" == "" (
    set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
)

if "%1" == "" goto help

if "%1" == "help" (
    :help
    echo.Please use `make ^<target^>` where ^<target^> is one of
    echo.  html       to make standalone HTML files
    echo.  dirhtml    to make HTML files named index.html in directories
    echo.  singlehtml to make a single large HTML file
    echo.  pickle     to make pickle files
    echo.  json       to make JSON files
    echo.  htmlhelp   to make HTML files and a HTML help project
    echo.  qthelp     to make HTML files and a qthelp project
    echo.  devhelp    to make HTML files and a Devhelp project
    echo.  epub       to make an epub
    echo.  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter
    echo.  text       to make text files
    echo.  man        to make manual pages
    echo.  changes    to make an overview over all changed/added/deprecated items
    echo.  linkcheck  to check all external links for integrity
    echo.  doctest    to run all doctests embedded in the documentation if enabled
    goto end
)

if "%1" == "clean" (
    for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
    del /q /s %BUILDDIR%\*
    goto end
)

if "%1" == "html" (
    %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The HTML pages are in %BUILDDIR%/html.
    goto end
)

if "%1" == "dirhtml" (
    %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
    goto end
)

if "%1" == "singlehtml" (
    %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
    goto end
)

if "%1" == "pickle" (
    %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can process the pickle files.
    goto end
)

if "%1" == "json" (
    %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can process the JSON files.
    goto end
)

if "%1" == "htmlhelp" (
    %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
    goto end
)

if "%1" == "qthelp" (
    %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
    echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Astropy.qhcp
    echo.To view the help file:
    echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Astropy.ghc
    goto end
)

if "%1" == "devhelp" (
    %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished.
    goto end
)

if "%1" == "epub" (
    %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The epub file is in %BUILDDIR%/epub.
    goto end
)

if "%1" == "latex" (
    %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
    goto end
)

if "%1" == "text" (
    %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The text files are in %BUILDDIR%/text.
    goto end
)

if "%1" == "man" (
    %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The manual pages are in %BUILDDIR%/man.
    goto end
)

if "%1" == "changes" (
    %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
    if errorlevel 1 exit /b 1
    echo.
    echo.The overview file is in %BUILDDIR%/changes.
    goto end
)

if "%1" == "linkcheck" (
    %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
    if errorlevel 1 exit /b 1
    echo.
    echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
    goto end
)

if "%1" == "doctest" (
    %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
    if errorlevel 1 exit /b 1
    echo.
    echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
    goto end
)

:end

---- astropy-healpix-0.5/docs/performance.rst ----

Performance
===========

At this time, we have focused mostly on implementing functionality in the
**astropy-healpix** package, and performance is not yet as good in most cases
as that of the healpy library. Once the API is stable, we will focus on
improving performance.

Benchmark
---------

To get an idea of how the performance of the two packages compares, we have
included some simple benchmarks that compare the healpy-compatible interface
of **astropy-healpix** with healpy itself. These benchmarks are run with:

.. code-block:: bash

    $ python -m astropy_healpix.bench
    Running benchmarks...
        fct  nest  nside     size  time_healpy  time_self  ratio
    -------  ----  -----  -------  -----------  ---------  -----
    pix2ang  True      1       10    0.0000081  0.0003575  43.91
    pix2ang  True    128       10    0.0000082  0.0003471  42.52
    pix2ang  True      1     1000    0.0000399  0.0004751  11.92
    pix2ang  True    128     1000    0.0000345  0.0004575  13.28
    pix2ang  True      1  1000000    0.0434032  0.1589150   3.66
    pix2ang  True    128  1000000    0.0364285  0.1383810   3.80
    pix2ang False      1       10    0.0000080  0.0004040  50.30
    pix2ang False    128       10    0.0000082  0.0003322  40.63
    pix2ang False      1     1000    0.0000400  0.0005005  12.50
    pix2ang False    128     1000    0.0000548  0.0005045   9.21
    pix2ang False      1  1000000    0.0342841  0.1429310   4.17
    pix2ang False    128  1000000    0.0478645  0.1405270   2.94

For small arrays, ``pix2ang`` in **astropy-healpix** performs worse, but in
both cases the times are less than a millisecond, and such differences may
therefore not matter. For larger arrays, the difference is a factor of a few
at most. We will add more benchmarks over time to provide a more complete
picture.

---- astropy-healpix-0.5/docs/references.txt (empty file) ----

---- astropy-healpix-0.5/setup.cfg ----

[metadata]
name = astropy_healpix
description = BSD-licensed HEALPix for Astropy
long_description = file: README.rst
author = Christoph Deil, Thomas Robitaille, and Dustin Lang
author_email = astropy.team@gmail.com
license = BSD 3-Clause
url = https://github.com/astropy/astropy-healpix
edit_on_github = False
github_project = astropy/astropy-healpix
version = 0.5

[options]
zip_safe = False
packages = find:
python_requires = >=3.6
setup_requires =
    numpy
install_requires =
    numpy
    astropy

[options.extras_require]
test =
    pytest-astropy
    hypothesis<4.42
docs =
    sphinx-astropy
    matplotlib

[build_sphinx]
source-dir = docs
build-dir = docs/_build
all_files = 1

[build_docs]
source-dir = docs
build-dir = docs/_build
all_files = 1

[upload_docs]
upload-dir = docs/_build/html
show-response = 1

[tool:pytest]
minversion = 3.0
norecursedirs = build docs/_build
doctest_plus = enabled
addopts = -p no:warnings

[ah_bootstrap]
auto_use = True

[pycodestyle]
# E101 - mix of tabs and spaces
# W191 - use of tabs
# W291 - trailing whitespace
# W292 - no newline at end of file
# W293 - trailing whitespace
# W391 - blank line at end of file
# E111 - 4 spaces per indentation level
# E112 - 4 spaces per indentation level
# E113 - 4 spaces per indentation level
# E901 - SyntaxError or IndentationError
# E902 - IOError
select = E101,W191,W291,W292,W293,W391,E111,E112,E113,E901,E902
exclude = extern,sphinx,*parsetab.py

---- astropy-healpix-0.5/setup.py ----

#!/usr/bin/env python
# Licensed under a 3-clause BSD style license - see LICENSE.rst

import builtins

# Ensure that astropy-helpers is available
import ah_bootstrap  # noqa

from setuptools import setup
from setuptools.config import read_configuration

from astropy_helpers.setup_helpers import register_commands, get_package_info
from astropy_helpers.version_helpers import generate_version_py

# Store the package name in a built-in variable so it's easy
# to get from other parts of the setup infrastructure
builtins._ASTROPY_PACKAGE_NAME_ = read_configuration('setup.cfg')['metadata']['name']

# Create a dictionary with setup command overrides. Note that this gets
# information about the package (name and version) from the setup.cfg file.
cmdclass = register_commands()

# Freeze build information in version.py. Note that this gets information
# about the package (name and version) from the setup.cfg file.
version = generate_version_py()

# Get configuration information from all of the various subpackages.
# See the docstring for setup_helpers.update_package_files for more
# details.
package_info = get_package_info()

setup(name='astropy-healpix',
      version=version,
      cmdclass=cmdclass,
      **package_info)