astropy-healpix-0.5/CHANGES.rst

.. _changes:
*******
Changes
*******

0.5 (2019-11-25)
================

- Update package infrastructure to use ``setup.cfg``. [#134]
- Make sure that Numpy is declared as a build-time dependency. [#134]
- Update astropy-helpers to v3.2.2. [#134]
- Update minimum required Python version to 3.6. [#125]
- Add ``HEALPix.from_header``. [#127]
- Clean up C code to avoid compilation warnings. [#118, #119, #120, #121, #122, #123]
- Fix unit tests on 32-bit architectures. [#117]
- Fix compatibility with Numpy 1.16 and later. [#116]
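For illustration, ``HEALPix.from_header`` reads the standard HEALPix keywords that FITS headers carry. A minimal pure-Python sketch of the idea, using a plain dict in place of a real FITS header and a hypothetical helper name:

```python
def healpix_params_from_header(header):
    """Hypothetical sketch: extract the standard HEALPix keywords
    (NSIDE, ORDERING) from a FITS-like mapping."""
    nside = int(header['NSIDE'])
    order = header['ORDERING'].strip().lower()  # 'RING' or 'NESTED'
    if order not in ('ring', 'nested'):
        raise ValueError('Unknown ORDERING: {0!r}'.format(order))
    return nside, order

# Example FITS-style header as a plain dict:
params = healpix_params_from_header({'NSIDE': 64, 'ORDERING': 'NESTED'})
```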

0.4 (2018-12-18)
================

- Clean up the HEALPix range-search code. [#113]
- Update astropy-helpers to v2.0.8. [#112]
- Rewrite core module in C to make ``healpix_to_lonlat`` and
``lonlat_to_healpix`` broadcastable over both pixel index and nside. [#110]
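The broadcasting added here follows NumPy's standard rules, so pixel indices and nside values can be combined elementwise. A sketch of the shape arithmetic only (assuming NumPy is available; no coordinate conversion is performed):

```python
import numpy as np

# healpix_to_lonlat(healpix_index, nside) now broadcasts its arguments
# like a NumPy ufunc: a column of pixel indices against a row of nside
# values yields one (lon, lat) pair per combination.
healpix_index = np.arange(3).reshape(3, 1)   # shape (3, 1)
nside = np.array([1, 2, 4, 8])               # shape (4,)
out_shape = np.broadcast(healpix_index, nside).shape
```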

0.3.1 (2018-10-24)
==================

- Ensure that .c files are included in the tar file.

0.3 (2018-10-24)
================

- Remove OpenMP from astropy-healpix. [#108]
- Fix bilinear interpolation of invalid values. [#106]
- Add uniq to (level, ipix) conversion and its inverse. [#105]
- Compute z more stably; improve on z2dec. [#101]
- Use a more stable cos(Dec) term. [#94]
- Fix ``get_interp_weights`` for the ``phi=None`` case. [#89]
- Add ``pix2vec``, ``vec2pix``, and ``ang2vec``. [#73]
- Add ``pixel_resolution_to_nside`` function. [#31]
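``pixel_resolution_to_nside`` rests on the fact that HEALPix divides the sphere into ``12 * nside**2`` equal-area pixels. A rough stdlib-only re-derivation, rounding up to a power of two (the library's actual rounding options may differ):

```python
import math

def resolution_to_nside(resolution_rad):
    """Hypothetical sketch: smallest power-of-two nside whose pixels are
    at least as fine as the requested angular resolution (in radians)."""
    # Each pixel covers 4*pi / (12 * nside**2) steradians, so the linear
    # resolution is roughly the square root of that area; solve for nside.
    exact_nside = math.sqrt(4 * math.pi / 12) / resolution_rad
    return 2 ** max(0, math.ceil(math.log2(exact_nside)))

nside = resolution_to_nside(math.radians(1.0))  # ~1 degree pixels
```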

0.2 (2017-10-15)
================

- Expand benchmarks to include ang2pix, nest2ring and ring2nest. [#62]
- Use OpenMP to parallelize the Cython wrappers. [#59]
- Renamed the ``healpix_neighbours`` function to ``neighbours`` and added
a wrapper to the high-level class. [#61]
- Fix bilinear interpolation which was being done incorrectly, and added
a new ``bilinear_interpolation_weights`` function to get the interpolation
weights. [#63]
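For context, plain bilinear interpolation distributes weight over the four surrounding pixels according to the fractional offsets; this generic sketch shows only that weight structure and is not the exact spherical scheme used by ``bilinear_interpolation_weights``:

```python
def bilinear_weights(dx, dy):
    """Weights for the four corners of a unit cell given fractional
    offsets 0 <= dx, dy <= 1 of the target point (illustrative only)."""
    return ((1 - dx) * (1 - dy),  # lower-left neighbour
            dx * (1 - dy),        # lower-right
            (1 - dx) * dy,        # upper-left
            dx * dy)              # upper-right

weights = bilinear_weights(0.25, 0.5)  # always sums to 1
```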

0.1 (2017-10-01)
================

- Initial release
astropy-healpix-0.5/LICENSE.md

Copyright (c) 2016-2018, Astropy Developers
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice, this
  list of conditions and the following disclaimer in the documentation and/or
  other materials provided with the distribution.

* Neither the name of the Astropy Team nor the names of its contributors may be
  used to endorse or promote products derived from this software without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
astropy-healpix-0.5/PKG-INFO

Metadata-Version: 2.1
Name: astropy-healpix
Version: 0.5
Summary: BSD-licensed HEALPix for Astropy
Home-page: https://github.com/astropy/astropy-healpix
Author: Christoph Deil, Thomas Robitaille, and Dustin Lang
Author-email: astropy.team@gmail.com
License: BSD 3-Clause
Description: |RTD| |Travis Status| |AppVeyor status| |CircleCI| |Coverage Status|
About
-----
This is a BSD-licensed HEALPix package developed by the Astropy project
and based on C code written by Dustin Lang in
`astrometry.net <http://astrometry.net>`__. See the
`Documentation <http://astropy-healpix.readthedocs.io/en/latest/>`__ for
information about installing and using this package.

.. |Travis Status| image:: https://travis-ci.org/astropy/astropy-healpix.svg
:target: https://travis-ci.org/astropy/astropy-healpix?branch=master
.. |AppVeyor status| image:: https://ci.appveyor.com/api/projects/status/5kxwb47o2umy370m/branch/master?svg=true
:target: https://ci.appveyor.com/project/Astropy/astropy-healpix/branch/master
.. |CircleCI| image:: https://circleci.com/gh/astropy/astropy-healpix.svg?style=svg
:target: https://circleci.com/gh/astropy/astropy-healpix
.. |Coverage Status| image:: https://coveralls.io/repos/astropy/astropy-healpix/badge.svg
:target: https://coveralls.io/r/astropy/astropy-healpix
.. |RTD| image:: https://readthedocs.org/projects/astropy-healpix/badge/?version=latest
:target: http://astropy-healpix.readthedocs.io/en/latest/?badge=latest
Platform: UNKNOWN
Requires-Python: >=3.6
Provides-Extra: docs
Provides-Extra: test
././@PaxHeader 0000000 0000000 0000000 00000000033 00000000000 011451 x ustar 00 0000000 0000000 27 mtime=1526665132.250059
astropy-healpix-0.5/README.rst 0000644 0000770 0000024 00000002221 00000000000 016112 0 ustar 00tom staff 0000000 0000000 |RTD| |Travis Status| |AppVeyor status| |CircleCI| |Coverage Status|
About
-----
This is a BSD-licensed HEALPix package developed by the Astropy project
and based on C code written by Dustin Lang in
`astrometry.net <http://astrometry.net>`__. See the
`Documentation <http://astropy-healpix.readthedocs.io/en/latest/>`__ for
information about installing and using this package.

.. |Travis Status| image:: https://travis-ci.org/astropy/astropy-healpix.svg
:target: https://travis-ci.org/astropy/astropy-healpix?branch=master
.. |AppVeyor status| image:: https://ci.appveyor.com/api/projects/status/5kxwb47o2umy370m/branch/master?svg=true
:target: https://ci.appveyor.com/project/Astropy/astropy-healpix/branch/master
.. |CircleCI| image:: https://circleci.com/gh/astropy/astropy-healpix.svg?style=svg
:target: https://circleci.com/gh/astropy/astropy-healpix
.. |Coverage Status| image:: https://coveralls.io/repos/astropy/astropy-healpix/badge.svg
:target: https://coveralls.io/r/astropy/astropy-healpix
.. |RTD| image:: https://readthedocs.org/projects/astropy-healpix/badge/?version=latest
:target: http://astropy-healpix.readthedocs.io/en/latest/?badge=latest
astropy-healpix-0.5/ah_bootstrap.py

"""
This bootstrap module contains code for ensuring that the astropy_helpers
package will be importable by the time the setup.py script runs. It also
includes some workarounds to ensure that a recent-enough version of setuptools
is being used for the installation.

This module should be the first thing imported in the setup.py of distributions
that make use of the utilities in astropy_helpers. If the distribution ships
with its own copy of astropy_helpers, this module will first attempt to import
from the shipped copy. However, it will also check PyPI to see if there are
any bug-fix releases on top of the current version that may be useful to get
past platform-specific bugs that have been fixed. When running setup.py, use
the ``--offline`` command-line option to disable the auto-upgrade checks.

When this module is imported or otherwise executed, it automatically calls a
main function that attempts to read the project's setup.cfg file, checking it
for a configuration section called ``[ah_bootstrap]``. The presence of that
section, and the options therein, determine the next step taken: if it
contains an option called ``auto_use`` with a value of ``True``, it will
automatically call the main function of this module called
`use_astropy_helpers` (see that function's docstring for full details).
Otherwise no further action is taken and by default the system-installed version
of astropy-helpers will be used (however, ``ah_bootstrap.use_astropy_helpers``
may be called manually from within the setup.py script).

This behavior can also be controlled using the ``--auto-use`` and
``--no-auto-use`` command-line flags. For clarity, an alias for
``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using
the latter if needed.

Additional options in the ``[ah_bootstrap]`` section of setup.cfg have the same
names as the arguments to `use_astropy_helpers`, and can be used to configure
the bootstrap script when ``auto_use = True``.

See https://github.com/astropy/astropy-helpers for more details, and for the
latest version of this module.
"""
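The ``auto_use`` behaviour described in the docstring above is driven by a
section of the project's setup.cfg; an illustrative fragment (the section and
option names come from ``CFG_OPTIONS`` below, the values are examples only):

```ini
[ah_bootstrap]
auto_use = True
use_git = True
offline = False
```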
import contextlib
import errno
import io
import locale
import os
import re
import subprocess as sp
import sys
from distutils import log
from distutils.debug import DEBUG
from configparser import ConfigParser, RawConfigParser
import pkg_resources
from setuptools import Distribution
from setuptools.package_index import PackageIndex
# This is the minimum Python version required for astropy-helpers
__minimum_python_version__ = (3, 5)
# TODO: Maybe enable checking for a specific version of astropy_helpers?
DIST_NAME = 'astropy-helpers'
PACKAGE_NAME = 'astropy_helpers'
UPPER_VERSION_EXCLUSIVE = None
# Defaults for other options
DOWNLOAD_IF_NEEDED = True
INDEX_URL = 'https://pypi.python.org/simple'
USE_GIT = True
OFFLINE = False
AUTO_UPGRADE = True
# A list of all the configuration options and their required types
CFG_OPTIONS = [
('auto_use', bool), ('path', str), ('download_if_needed', bool),
('index_url', str), ('use_git', bool), ('offline', bool),
('auto_upgrade', bool)
]
# Start off by parsing the setup.cfg file
_err_help_msg = """
If the problem persists consider installing astropy_helpers manually using pip
(`pip install astropy_helpers`) or by manually downloading the source archive,
extracting it, and installing by running `python setup.py install` from the
root of the extracted source code.
"""
SETUP_CFG = ConfigParser()
if os.path.exists('setup.cfg'):
try:
SETUP_CFG.read('setup.cfg')
except Exception as e:
if DEBUG:
raise
log.error(
"Error reading setup.cfg: {0!r}\n{1} will not be "
"automatically bootstrapped and package installation may fail."
"\n{2}".format(e, PACKAGE_NAME, _err_help_msg))
# We used package_name in the package template for a while instead of name
if SETUP_CFG.has_option('metadata', 'name'):
parent_package = SETUP_CFG.get('metadata', 'name')
elif SETUP_CFG.has_option('metadata', 'package_name'):
parent_package = SETUP_CFG.get('metadata', 'package_name')
else:
parent_package = None
if SETUP_CFG.has_option('options', 'python_requires'):
python_requires = SETUP_CFG.get('options', 'python_requires')
# The python_requires key has a syntax that can be parsed by SpecifierSet
# in the packaging package. However, we don't want to have to depend on that
# package, so instead we can use setuptools (which bundles packaging). We
# have to add 'python' to parse it with Requirement.
from pkg_resources import Requirement
req = Requirement.parse('python' + python_requires)
# We want the Python version as a string, which we can get from the platform module
import platform
    # strip off trailing '+' in case this is a dev install of python
python_version = platform.python_version().strip('+')
# allow pre-releases to count as 'new enough'
if not req.specifier.contains(python_version, True):
if parent_package is None:
message = "ERROR: Python {} is required by this package\n".format(req.specifier)
else:
message = "ERROR: Python {} is required by {}\n".format(req.specifier, parent_package)
sys.stderr.write(message)
sys.exit(1)
if sys.version_info < __minimum_python_version__:
if parent_package is None:
message = "ERROR: Python {} or later is required by astropy-helpers\n".format(
__minimum_python_version__)
else:
message = "ERROR: Python {} or later is required by astropy-helpers for {}\n".format(
__minimum_python_version__, parent_package)
sys.stderr.write(message)
sys.exit(1)
_str_types = (str, bytes)
# What follows are several import statements meant to deal with install-time
# issues with either missing or misbehaving packages (including making sure
# setuptools itself is installed):
# Check that setuptools 30.3 or later is present
from distutils.version import LooseVersion
try:
import setuptools
assert LooseVersion(setuptools.__version__) >= LooseVersion('30.3')
except (ImportError, AssertionError):
sys.stderr.write("ERROR: setuptools 30.3 or later is required by astropy-helpers\n")
sys.exit(1)
# typing as a dependency for 1.6.1+ Sphinx causes issues when imported after
# initializing submodule with ah_bootstrap.py
# See discussion and references in
# https://github.com/astropy/astropy-helpers/issues/302
try:
import typing # noqa
except ImportError:
pass
# Note: The following import is required as a workaround to
# https://github.com/astropy/astropy-helpers/issues/89; if we don't import this
# module now, it will get cleaned up after `run_setup` is called, but that will
# later cause the TemporaryDirectory class defined in it to stop working when
# used later on by setuptools
try:
import setuptools.py31compat # noqa
except ImportError:
pass
# matplotlib can cause problems if it is imported from within a call of
# run_setup(), because in some circumstances it will try to write to the user's
# home directory, resulting in a SandboxViolation. See
# https://github.com/matplotlib/matplotlib/pull/4165
# Making sure matplotlib, if it is available, is imported early in the setup
# process can mitigate this (note importing matplotlib.pyplot has the same
# issue)
try:
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot
except:
    # Ignore if this fails for *any* reason
pass
# End compatibility imports...
class _Bootstrapper(object):
"""
Bootstrapper implementation. See ``use_astropy_helpers`` for parameter
documentation.
"""
def __init__(self, path=None, index_url=None, use_git=None, offline=None,
download_if_needed=None, auto_upgrade=None):
if path is None:
path = PACKAGE_NAME
if not (isinstance(path, _str_types) or path is False):
raise TypeError('path must be a string or False')
if not isinstance(path, str):
fs_encoding = sys.getfilesystemencoding()
path = path.decode(fs_encoding) # path to unicode
self.path = path
# Set other option attributes, using defaults where necessary
self.index_url = index_url if index_url is not None else INDEX_URL
self.offline = offline if offline is not None else OFFLINE
# If offline=True, override download and auto-upgrade
if self.offline:
download_if_needed = False
auto_upgrade = False
self.download = (download_if_needed
if download_if_needed is not None
else DOWNLOAD_IF_NEEDED)
self.auto_upgrade = (auto_upgrade
if auto_upgrade is not None else AUTO_UPGRADE)
# If this is a release then the .git directory will not exist so we
# should not use git.
git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git'))
if use_git is None and not git_dir_exists:
use_git = False
self.use_git = use_git if use_git is not None else USE_GIT
# Declared as False by default--later we check if astropy-helpers can be
# upgraded from PyPI, but only if not using a source distribution (as in
# the case of import from a git submodule)
self.is_submodule = False
@classmethod
def main(cls, argv=None):
if argv is None:
argv = sys.argv
config = cls.parse_config()
config.update(cls.parse_command_line(argv))
auto_use = config.pop('auto_use', False)
bootstrapper = cls(**config)
if auto_use:
# Run the bootstrapper, otherwise the setup.py is using the old
# use_astropy_helpers() interface, in which case it will run the
# bootstrapper manually after reconfiguring it.
bootstrapper.run()
return bootstrapper
@classmethod
def parse_config(cls):
if not SETUP_CFG.has_section('ah_bootstrap'):
return {}
config = {}
for option, type_ in CFG_OPTIONS:
if not SETUP_CFG.has_option('ah_bootstrap', option):
continue
if type_ is bool:
value = SETUP_CFG.getboolean('ah_bootstrap', option)
else:
value = SETUP_CFG.get('ah_bootstrap', option)
config[option] = value
return config
@classmethod
def parse_command_line(cls, argv=None):
if argv is None:
argv = sys.argv
config = {}
# For now we just pop recognized ah_bootstrap options out of the
# arg list. This is imperfect; in the unlikely case that a setup.py
# custom command or even custom Distribution class defines an argument
        # of the same name then we will break that. However, there's a catch-22
# here that we can't just do full argument parsing right here, because
# we don't yet know *how* to parse all possible command-line arguments.
if '--no-git' in argv:
config['use_git'] = False
argv.remove('--no-git')
if '--offline' in argv:
config['offline'] = True
argv.remove('--offline')
if '--auto-use' in argv:
config['auto_use'] = True
argv.remove('--auto-use')
if '--no-auto-use' in argv:
config['auto_use'] = False
argv.remove('--no-auto-use')
if '--use-system-astropy-helpers' in argv:
config['auto_use'] = False
argv.remove('--use-system-astropy-helpers')
return config
def run(self):
strategies = ['local_directory', 'local_file', 'index']
dist = None
# First, remove any previously imported versions of astropy_helpers;
# this is necessary for nested installs where one package's installer
# is installing another package via setuptools.sandbox.run_setup, as in
# the case of setup_requires
for key in list(sys.modules):
try:
if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'):
del sys.modules[key]
except AttributeError:
# Sometimes mysterious non-string things can turn up in
# sys.modules
continue
# Check to see if the path is a submodule
self.is_submodule = self._check_submodule()
for strategy in strategies:
method = getattr(self, 'get_{0}_dist'.format(strategy))
dist = method()
if dist is not None:
break
else:
raise _AHBootstrapSystemExit(
"No source found for the {0!r} package; {0} must be "
"available and importable as a prerequisite to building "
"or installing this package.".format(PACKAGE_NAME))
# This is a bit hacky, but if astropy_helpers was loaded from a
# directory/submodule its Distribution object gets a "precedence" of
# "DEVELOP_DIST". However, in other cases it gets a precedence of
        # "EGG_DIST". However, when activating the distribution it will only be
# placed early on sys.path if it is treated as an EGG_DIST, so always
# do that
dist = dist.clone(precedence=pkg_resources.EGG_DIST)
# Otherwise we found a version of astropy-helpers, so we're done
# Just active the found distribution on sys.path--if we did a
# download this usually happens automatically but it doesn't hurt to
# do it again
# Note: Adding the dist to the global working set also activates it
# (makes it importable on sys.path) by default.
try:
pkg_resources.working_set.add(dist, replace=True)
except TypeError:
# Some (much) older versions of setuptools do not have the
# replace=True option here. These versions are old enough that all
# bets may be off anyways, but it's easy enough to work around just
# in case...
if dist.key in pkg_resources.working_set.by_key:
del pkg_resources.working_set.by_key[dist.key]
pkg_resources.working_set.add(dist)
@property
def config(self):
"""
A `dict` containing the options this `_Bootstrapper` was configured
with.
"""
return dict((optname, getattr(self, optname))
for optname, _ in CFG_OPTIONS if hasattr(self, optname))
def get_local_directory_dist(self):
"""
Handle importing a vendored package from a subdirectory of the source
distribution.
"""
if not os.path.isdir(self.path):
return
log.info('Attempting to import astropy_helpers from {0} {1!r}'.format(
'submodule' if self.is_submodule else 'directory',
self.path))
dist = self._directory_import()
if dist is None:
log.warn(
'The requested path {0!r} for importing {1} does not '
'exist, or does not contain a copy of the {1} '
'package.'.format(self.path, PACKAGE_NAME))
elif self.auto_upgrade and not self.is_submodule:
# A version of astropy-helpers was found on the available path, but
# check to see if a bugfix release is available on PyPI
upgrade = self._do_upgrade(dist)
if upgrade is not None:
dist = upgrade
return dist
def get_local_file_dist(self):
"""
Handle importing from a source archive; this also uses setup_requires
but points easy_install directly to the source archive.
"""
if not os.path.isfile(self.path):
return
log.info('Attempting to unpack and import astropy_helpers from '
'{0!r}'.format(self.path))
try:
dist = self._do_download(find_links=[self.path])
except Exception as e:
if DEBUG:
raise
log.warn(
'Failed to import {0} from the specified archive {1!r}: '
'{2}'.format(PACKAGE_NAME, self.path, str(e)))
dist = None
if dist is not None and self.auto_upgrade:
# A version of astropy-helpers was found on the available path, but
# check to see if a bugfix release is available on PyPI
upgrade = self._do_upgrade(dist)
if upgrade is not None:
dist = upgrade
return dist
def get_index_dist(self):
if not self.download:
log.warn('Downloading {0!r} disabled.'.format(DIST_NAME))
return None
log.warn(
"Downloading {0!r}; run setup.py with the --offline option to "
"force offline installation.".format(DIST_NAME))
try:
dist = self._do_download()
except Exception as e:
if DEBUG:
raise
log.warn(
'Failed to download and/or install {0!r} from {1!r}:\n'
'{2}'.format(DIST_NAME, self.index_url, str(e)))
dist = None
# No need to run auto-upgrade here since we've already presumably
# gotten the most up-to-date version from the package index
return dist
def _directory_import(self):
"""
Import astropy_helpers from the given path, which will be added to
sys.path.
        Returns a Distribution object if the import succeeded, and None
        otherwise.
        """
        # A return value of None makes ``run`` fall back to the next
        # strategy (a local source archive, then the package index).
path = os.path.abspath(self.path)
        # Use an empty WorkingSet rather than the main
# pkg_resources.working_set, since on older versions of setuptools this
# will invoke a VersionConflict when trying to install an upgrade
ws = pkg_resources.WorkingSet([])
ws.add_entry(path)
dist = ws.by_key.get(DIST_NAME)
if dist is None:
# We didn't find an egg-info/dist-info in the given path, but if a
# setup.py exists we can generate it
setup_py = os.path.join(path, 'setup.py')
if os.path.isfile(setup_py):
# We use subprocess instead of run_setup from setuptools to
# avoid segmentation faults - see the following for more details:
# https://github.com/cython/cython/issues/2104
sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path)
for dist in pkg_resources.find_distributions(path, True):
# There should be only one...
return dist
return dist
def _do_download(self, version='', find_links=None):
if find_links:
allow_hosts = ''
index_url = None
else:
allow_hosts = None
index_url = self.index_url
# Annoyingly, setuptools will not handle other arguments to
# Distribution (such as options) before handling setup_requires, so it
# is not straightforward to programmatically augment the arguments which
# are passed to easy_install
class _Distribution(Distribution):
def get_option_dict(self, command_name):
opts = Distribution.get_option_dict(self, command_name)
if command_name == 'easy_install':
if find_links is not None:
opts['find_links'] = ('setup script', find_links)
if index_url is not None:
opts['index_url'] = ('setup script', index_url)
if allow_hosts is not None:
opts['allow_hosts'] = ('setup script', allow_hosts)
return opts
if version:
req = '{0}=={1}'.format(DIST_NAME, version)
else:
if UPPER_VERSION_EXCLUSIVE is None:
req = DIST_NAME
else:
req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE)
attrs = {'setup_requires': [req]}
# NOTE: we need to parse the config file (e.g. setup.cfg) to make sure
# it honours the options set in the [easy_install] section, and we need
# to explicitly fetch the requirement eggs as setup_requires does not
# get honored in recent versions of setuptools:
# https://github.com/pypa/setuptools/issues/1273
try:
context = _verbose if DEBUG else _silence
with context():
dist = _Distribution(attrs=attrs)
try:
dist.parse_config_files(ignore_option_errors=True)
dist.fetch_build_eggs(req)
except TypeError:
# On older versions of setuptools, ignore_option_errors
# doesn't exist, and the above two lines are not needed
# so we can just continue
pass
# If the setup_requires succeeded it will have added the new dist to
# the main working_set
return pkg_resources.working_set.by_key.get(DIST_NAME)
except Exception as e:
if DEBUG:
raise
msg = 'Error retrieving {0} from {1}:\n{2}'
if find_links:
source = find_links[0]
elif index_url != INDEX_URL:
source = index_url
else:
source = 'PyPI'
raise Exception(msg.format(DIST_NAME, source, repr(e)))
def _do_upgrade(self, dist):
# Build up a requirement for a higher bugfix release but a lower minor
# release (so API compatibility is guaranteed)
next_version = _next_version(dist.parsed_version)
req = pkg_resources.Requirement.parse(
'{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version))
package_index = PackageIndex(index_url=self.index_url)
upgrade = package_index.obtain(req)
if upgrade is not None:
return self._do_download(version=upgrade.version)
def _check_submodule(self):
"""
Check if the given path is a git submodule.
See the docstrings for ``_check_submodule_using_git`` and
``_check_submodule_no_git`` for further details.
"""
if (self.path is None or
(os.path.exists(self.path) and not os.path.isdir(self.path))):
return False
if self.use_git:
return self._check_submodule_using_git()
else:
return self._check_submodule_no_git()
def _check_submodule_using_git(self):
"""
Check if the given path is a git submodule. If so, attempt to initialize
and/or update the submodule if needed.
This function makes calls to the ``git`` command in subprocesses. The
``_check_submodule_no_git`` option uses pure Python to check if the given
path looks like a git submodule, but it cannot perform updates.
"""
cmd = ['git', 'submodule', 'status', '--', self.path]
try:
log.info('Running `{0}`; use the --no-git option to disable git '
'commands'.format(' '.join(cmd)))
returncode, stdout, stderr = run_cmd(cmd)
except _CommandNotFound:
# The git command simply wasn't found; this is most likely the
# case on user systems that don't have git and are simply
# trying to install the package from PyPI or a source
# distribution. Silently ignore this case and simply don't try
# to use submodules
return False
stderr = stderr.strip()
if returncode != 0 and stderr:
# Unfortunately the return code alone cannot be relied on, as
# earlier versions of git returned 0 even if the requested submodule
# does not exist
# This is a warning that occurs in perl (from running git submodule)
# which only occurs with a malformatted locale setting which can
# happen sometimes on OSX. See again
# https://github.com/astropy/astropy/issues/2749
perl_warning = ('perl: warning: Falling back to the standard locale '
'("C").')
if not stderr.strip().endswith(perl_warning):
# Some other unknown error condition occurred
log.warn('git submodule command failed '
'unexpectedly:\n{0}'.format(stderr))
return False
# Output of `git submodule status` is as follows:
#
# 1: Status indicator: '-' for submodule is uninitialized, '+' if
# submodule is initialized but is not at the commit currently indicated
# in .gitmodules (and thus needs to be updated), or 'U' if the
# submodule is in an unstable state (i.e. has merge conflicts)
#
# 2. SHA-1 hash of the current commit of the submodule (we don't really
# need this information but it's useful for checking that the output is
# correct)
#
# 3. The output of `git describe` for the submodule's current commit
# hash (this includes for example what branches the commit is on) but
# only if the submodule is initialized. We ignore this information for
# now
_git_submodule_status_re = re.compile(
            r'^(?P<status>[+-U ])(?P<sha1>[0-9a-f]{40}) '
            r'(?P<submodule>\S+)( .*)?$')
# The stdout should only contain one line--the status of the
# requested submodule
m = _git_submodule_status_re.match(stdout)
if m:
# Yes, the path *is* a git submodule
self._update_submodule(m.group('submodule'), m.group('status'))
return True
else:
log.warn(
'Unexpected output from `git submodule status`:\n{0}\n'
'Will attempt import from {1!r} regardless.'.format(
stdout, self.path))
return False
def _check_submodule_no_git(self):
"""
Like ``_check_submodule_using_git``, but simply parses the .gitmodules file
to determine if the supplied path is a git submodule, and does not exec any
subprocesses.
This can only determine if a path is a submodule--it does not perform
updates, etc. This function may need to be updated if the format of the
.gitmodules file is changed between git versions.
"""
gitmodules_path = os.path.abspath('.gitmodules')
if not os.path.isfile(gitmodules_path):
return False
# This is a minimal reader for gitconfig-style files. It handles a few of
# the quirks that make gitconfig files incompatible with ConfigParser-style
# files, but does not support the full gitconfig syntax (just enough
# needed to read a .gitmodules file).
gitmodules_fileobj = io.StringIO()
# Must use io.open for cross-Python-compatible behavior wrt unicode
with io.open(gitmodules_path) as f:
for line in f:
# gitconfig files are more flexible with leading whitespace; just
# go ahead and remove it
line = line.lstrip()
# comments can start with either # or ;
                if line and line[0] in ('#', ';'):
continue
gitmodules_fileobj.write(line)
gitmodules_fileobj.seek(0)
cfg = RawConfigParser()
try:
cfg.readfp(gitmodules_fileobj)
except Exception as exc:
log.warn('Malformatted .gitmodules file: {0}\n'
'{1} cannot be assumed to be a git submodule.'.format(
exc, self.path))
return False
for section in cfg.sections():
if not cfg.has_option(section, 'path'):
continue
submodule_path = cfg.get(section, 'path').rstrip(os.sep)
if submodule_path == self.path.rstrip(os.sep):
return True
return False
def _update_submodule(self, submodule, status):
if status == ' ':
# The submodule is up to date; no action necessary
return
elif status == '-':
if self.offline:
raise _AHBootstrapSystemExit(
"Cannot initialize the {0} submodule in --offline mode; "
"this requires being able to clone the submodule from an "
"online repository.".format(submodule))
cmd = ['update', '--init']
action = 'Initializing'
elif status == '+':
cmd = ['update']
action = 'Updating'
if self.offline:
cmd.append('--no-fetch')
elif status == 'U':
raise _AHBootstrapSystemExit(
'Error: Submodule {0} contains unresolved merge conflicts. '
'Please complete or abandon any changes in the submodule so that '
'it is in a usable state, then try again.'.format(submodule))
else:
log.warn('Unknown status {0!r} for git submodule {1!r}. Will '
'attempt to use the submodule as-is, but try to ensure '
'that the submodule is in a clean state and contains no '
'conflicts or errors.\n{2}'.format(status, submodule,
_err_help_msg))
return
err_msg = None
cmd = ['git', 'submodule'] + cmd + ['--', submodule]
log.warn('{0} {1} submodule with: `{2}`'.format(
action, submodule, ' '.join(cmd)))
try:
log.info('Running `{0}`; use the --no-git option to disable git '
'commands'.format(' '.join(cmd)))
returncode, stdout, stderr = run_cmd(cmd)
except OSError as e:
err_msg = str(e)
else:
if returncode != 0:
err_msg = stderr
if err_msg is not None:
log.warn('An unexpected error occurred updating the git submodule '
'{0!r}:\n{1}\n{2}'.format(submodule, err_msg,
_err_help_msg))
class _CommandNotFound(OSError):
"""
An exception raised when a command run with run_cmd is not found on the
system.
"""
def run_cmd(cmd):
"""
Run a command in a subprocess, given as a list of command-line
arguments.
Returns a ``(returncode, stdout, stderr)`` tuple.
"""
try:
p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
# XXX: May block if either stdout or stderr fill their buffers;
# however for the commands this is currently used for that is
# unlikely (they should have very brief output)
stdout, stderr = p.communicate()
except OSError as e:
if DEBUG:
raise
if e.errno == errno.ENOENT:
msg = 'Command not found: `{0}`'.format(' '.join(cmd))
raise _CommandNotFound(msg, cmd)
else:
raise _AHBootstrapSystemExit(
'An unexpected error occurred when running the '
'`{0}` command:\n{1}'.format(' '.join(cmd), str(e)))
# Can fail if the default locale is not configured properly. See
# https://github.com/astropy/astropy/issues/2749. For the purposes under
# consideration 'latin1' is an acceptable fallback.
try:
stdio_encoding = locale.getdefaultlocale()[1] or 'latin1'
except ValueError:
# Due to an OSX oddity locale.getdefaultlocale() can also crash
# depending on the user's locale/language settings. See:
# http://bugs.python.org/issue18378
stdio_encoding = 'latin1'
# Unlikely to fail at this point but even then let's be flexible
if not isinstance(stdout, str):
stdout = stdout.decode(stdio_encoding, 'replace')
if not isinstance(stderr, str):
stderr = stderr.decode(stdio_encoding, 'replace')
return (p.returncode, stdout, stderr)
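For reference, the subprocess-plus-locale-fallback pattern above can be written as a self-contained sketch. The name ``run_cmd_sketch`` is illustrative, not part of this module, and the dedicated ``_CommandNotFound`` exception is simplified to a plain ``OSError``:

```python
import errno
import locale
import subprocess as sp
import sys


def run_cmd_sketch(cmd):
    """Run a command; return a ``(returncode, stdout, stderr)`` text tuple.

    Decodes subprocess output with the default locale encoding, falling
    back to latin1 when the locale is misconfigured (see
    https://github.com/astropy/astropy/issues/2749).
    """
    try:
        p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
        stdout, stderr = p.communicate()
    except OSError as e:
        if e.errno == errno.ENOENT:
            # Simplified: the module raises a dedicated _CommandNotFound here
            raise OSError('Command not found: `{0}`'.format(' '.join(cmd)))
        raise
    try:
        encoding = locale.getdefaultlocale()[1] or 'latin1'
    except ValueError:
        # locale.getdefaultlocale() itself can crash on some macOS setups
        encoding = 'latin1'
    return (p.returncode,
            stdout.decode(encoding, 'replace'),
            stderr.decode(encoding, 'replace'))
```

For example, ``run_cmd_sketch([sys.executable, '-c', 'print("ok")'])`` returns a zero exit code and the decoded output as ``str``.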
def _next_version(version):
"""
Given a parsed version from pkg_resources.parse_version, returns a new
version string with the next minor version.
Examples
--------
>>> _next_version(pkg_resources.parse_version('1.2.3'))
'1.3.0'
"""
if hasattr(version, 'base_version'):
# New version parsing from setuptools >= 8.0
if version.base_version:
parts = version.base_version.split('.')
else:
parts = []
else:
parts = []
for part in version:
if part.startswith('*'):
break
parts.append(part)
parts = [int(p) for p in parts]
if len(parts) < 3:
parts += [0] * (3 - len(parts))
major, minor, micro = parts[:3]
return '{0}.{1}.{2}'.format(major, minor + 1, 0)
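The version-bumping logic can be sketched without ``pkg_resources`` at all; ``next_minor_version`` is a hypothetical helper, not part of this module:

```python
def next_minor_version(version_string):
    """Return the next minor version for a release-style version string.

    A simplified sketch of the logic above: keep only leading numeric
    components, pad to (major, minor, micro), then bump the minor part.
    """
    parts = []
    for part in version_string.split('.'):
        if not part.isdigit():
            break  # stop at pre-release/dev markers such as 'rc1'
        parts.append(int(part))
    parts += [0] * (3 - len(parts))
    major, minor, _ = parts[:3]
    return '{0}.{1}.0'.format(major, minor + 1)
```

For example, ``next_minor_version('1.2.3')`` gives ``'1.3.0'``.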
class _DummyFile(object):
"""A noop writeable object."""
errors = '' # Required for Python 3.x
encoding = 'utf-8'
def write(self, s):
pass
def flush(self):
pass
@contextlib.contextmanager
def _verbose():
yield
@contextlib.contextmanager
def _silence():
"""A context manager that silences sys.stdout and sys.stderr."""
old_stdout = sys.stdout
old_stderr = sys.stderr
sys.stdout = _DummyFile()
sys.stderr = _DummyFile()
exception_occurred = False
try:
yield
except:
exception_occurred = True
# Go ahead and clean up so that exception handling can work normally
sys.stdout = old_stdout
sys.stderr = old_stderr
raise
if not exception_occurred:
sys.stdout = old_stdout
sys.stderr = old_stderr
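The same silencing behaviour can be expressed more compactly with ``try``/``finally``, which restores the streams on both the normal and exceptional paths. This is a sketch; ``silence`` here is not the module's ``_silence``:

```python
import contextlib
import io
import sys


@contextlib.contextmanager
def silence():
    """Temporarily redirect sys.stdout and sys.stderr to a dummy sink."""
    old_stdout, old_stderr = sys.stdout, sys.stderr
    sys.stdout = sys.stderr = io.StringIO()
    try:
        yield
    finally:
        # Restore the real streams whether or not the body raised
        sys.stdout, sys.stderr = old_stdout, old_stderr
```

Anything printed inside ``with silence():`` is swallowed, and the original streams are back in place afterwards.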
class _AHBootstrapSystemExit(SystemExit):
def __init__(self, *args):
if not args:
msg = 'An unknown problem occurred bootstrapping astropy_helpers.'
else:
msg = args[0]
msg += '\n' + _err_help_msg
super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:])
BOOTSTRAPPER = _Bootstrapper.main()
def use_astropy_helpers(**kwargs):
"""
Ensure that the `astropy_helpers` module is available and is importable.
This supports automatic submodule initialization if astropy_helpers is
included in a project as a git submodule, or will download it from PyPI if
necessary.
Parameters
----------
path : str or None, optional
A filesystem path relative to the root of the project's source code
that should be added to `sys.path` so that `astropy_helpers` can be
imported from that path.
If the path is a git submodule it will automatically be initialized
and/or updated.
The path may also be to a ``.tar.gz`` archive of the astropy_helpers
source distribution. In this case the archive is automatically
unpacked and made temporarily available on `sys.path` as a ``.egg``
archive.
If `None` skip straight to downloading.
download_if_needed : bool, optional
If the provided filesystem path is not found an attempt will be made to
download astropy_helpers from PyPI. It will then be made temporarily
available on `sys.path` as a ``.egg`` archive (using the
``setup_requires`` feature of setuptools). If the ``--offline`` option
is given at the command line the value of this argument is overridden
to `False`.
index_url : str, optional
If provided, use a different URL for the Python package index than the
main PyPI server.
use_git : bool, optional
If `False` no git commands will be used--this effectively disables
support for git submodules. If the ``--no-git`` option is given at the
command line the value of this argument is overridden to `False`.
auto_upgrade : bool, optional
By default, when installing a package from a non-development source
distribution ah_bootstrap will try to automatically check for patch
releases to astropy-helpers on PyPI and use the patched version over
any bundled versions. Setting this to `False` will disable that
functionality. If the ``--offline`` option is given at the command line
the value of this argument is overridden to `False`.
offline : bool, optional
If `True`, disable all actions that require an internet connection,
including downloading packages from the package index and fetching
updates to any git submodule. Defaults to `False`.
"""
global BOOTSTRAPPER
config = BOOTSTRAPPER.config
config.update(**kwargs)
# Create a new bootstrapper with the updated configuration and run it
BOOTSTRAPPER = _Bootstrapper(**config)
BOOTSTRAPPER.run()
astropy-healpix-0.5/astropy_healpix/__init__.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
BSD-licensed HEALPix for Astropy.
"""
# Affiliated packages may add whatever they like to this file, but
# should keep this content at the top.
from ._astropy_init import * # noqa
# For egg_info test builds to pass, put package imports here.
if not _ASTROPY_SETUP_: # noqa
from .high_level import * # noqa
from .core import * # noqa
astropy-healpix-0.5/astropy_healpix/_astropy_init.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
__all__ = ['__version__', '__githash__']
# this indicates whether or not we are in the package's setup.py
try:
_ASTROPY_SETUP_
except NameError:
import builtins
builtins._ASTROPY_SETUP_ = False
try:
from .version import version as __version__
except ImportError:
__version__ = ''
try:
from .version import githash as __githash__
except ImportError:
__githash__ = ''
if not _ASTROPY_SETUP_: # noqa
import os
from warnings import warn
from astropy.config.configuration import (
update_default_config,
ConfigurationDefaultMissingError,
ConfigurationDefaultMissingWarning)
# Create the test function for self test
from astropy.tests.runner import TestRunner
test = TestRunner.make_test_runner_in(os.path.dirname(__file__))
test.__test__ = False
__all__ += ['test']
# add these here so we only need to cleanup the namespace at the end
config_dir = None
if not os.environ.get('ASTROPY_SKIP_CONFIG_UPDATE', False):
config_dir = os.path.dirname(__file__)
config_template = os.path.join(config_dir, __package__ + ".cfg")
if os.path.isfile(config_template):
try:
update_default_config(
__package__, config_dir, version=__version__)
except TypeError as orig_error:
try:
update_default_config(__package__, config_dir)
except ConfigurationDefaultMissingError as e:
wmsg = (e.args[0] +
" Cannot install default profile. If you are "
"importing from source, this is expected.")
warn(ConfigurationDefaultMissingWarning(wmsg))
del e
except Exception:
raise orig_error
astropy-healpix-0.5/astropy_healpix/_compiler.c
#include <Python.h>
/***************************************************************************
* Macros for determining the compiler version.
*
* These are borrowed from boost, and majorly abridged to include only
* the compilers we care about.
***************************************************************************/
#define STRINGIZE(X) DO_STRINGIZE(X)
#define DO_STRINGIZE(X) #X
#if defined __clang__
/* Clang C++ emulates GCC, so it has to appear early. */
# define COMPILER "Clang version " __clang_version__
#elif defined(__INTEL_COMPILER) || defined(__ICL) || defined(__ICC) || defined(__ECC)
/* Intel */
# if defined(__INTEL_COMPILER)
# define INTEL_VERSION __INTEL_COMPILER
# elif defined(__ICL)
# define INTEL_VERSION __ICL
# elif defined(__ICC)
# define INTEL_VERSION __ICC
# elif defined(__ECC)
# define INTEL_VERSION __ECC
# endif
# define COMPILER "Intel C compiler version " STRINGIZE(INTEL_VERSION)
#elif defined(__GNUC__)
/* gcc */
# define COMPILER "GCC version " __VERSION__
#elif defined(__SUNPRO_CC)
/* Sun Workshop Compiler */
# define COMPILER "Sun compiler version " STRINGIZE(__SUNPRO_CC)
#elif defined(_MSC_VER)
/* Microsoft Visual C/C++
Must be last since other compilers define _MSC_VER for compatibility as well */
# if _MSC_VER < 1200
# define COMPILER_VERSION 5.0
# elif _MSC_VER < 1300
# define COMPILER_VERSION 6.0
# elif _MSC_VER == 1300
# define COMPILER_VERSION 7.0
# elif _MSC_VER == 1310
# define COMPILER_VERSION 7.1
# elif _MSC_VER == 1400
# define COMPILER_VERSION 8.0
# elif _MSC_VER == 1500
# define COMPILER_VERSION 9.0
# elif _MSC_VER == 1600
# define COMPILER_VERSION 10.0
# else
# define COMPILER_VERSION _MSC_VER
# endif
# define COMPILER "Microsoft Visual C++ version " STRINGIZE(COMPILER_VERSION)
#else
/* Fallback */
# define COMPILER "Unknown compiler"
#endif
/***************************************************************************
* Module-level
***************************************************************************/
struct module_state {
/* The Sun compiler can't handle empty structs */
#if defined(__SUNPRO_C) || defined(_MSC_VER)
int _dummy;
#endif
};
static struct PyModuleDef moduledef = {
PyModuleDef_HEAD_INIT,
"compiler_version",
NULL,
sizeof(struct module_state),
NULL,
NULL,
NULL,
NULL,
NULL
};
#define INITERROR return NULL
PyMODINIT_FUNC
PyInit_compiler_version(void)
{
PyObject* m;
m = PyModule_Create(&moduledef);
if (m == NULL)
INITERROR;
PyModule_AddStringConstant(m, "compiler", COMPILER);
return m;
}
astropy-healpix-0.5/astropy_healpix/_core.c
#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
#include <Python.h>
#include <numpy/arrayobject.h>
#include <numpy/ufuncobject.h>
#include "healpix.h"
#include "healpix-utils.h"
#include "interpolation.h"
/* FIXME: We need npy_set_floatstatus_invalid(), but unlike most of the Numpy
* C API it is only available on some platforms if you explicitly link against
* Numpy, which is not typically done for building C extensions. This bundled
* header file contains a static definition of _npy_set_floatstatus_invalid().
*/
#include "ieee754.h"
#define INVALID_INDEX (-1)
/* Data structure for storing function pointers for routines that are specific
* to the HEALPix ordering scheme. When we create the ufuncs using
* PyUFunc_FromFuncAndData, we will set them up to pass a pointer to this
* data structure through as the void *data parameter for the loop functions
* defined below. */
typedef struct {
int64_t (*order_to_xy)(int64_t, int);
int64_t (*xy_to_order)(int64_t, int);
} order_funcs;
static order_funcs
nested_order_funcs = {healpixl_nested_to_xy, healpixl_xy_to_nested},
ring_order_funcs = {healpixl_ring_to_xy, healpixl_xy_to_ring};
static void *no_ufunc_data[] = {NULL},
*nested_ufunc_data[] = {&nested_order_funcs},
*ring_ufunc_data[] = {&ring_order_funcs};
static int64_t nside2npix(int nside)
{
return 12 * ((int64_t) nside) * ((int64_t) nside);
}
static int pixel_nside_valid(int64_t pixel, int nside)
{
return pixel >= 0 && pixel < nside2npix(nside);
}
static void healpix_to_lonlat_loop(
char **args, npy_intp *dimensions, npy_intp *steps, void *data)
{
order_funcs *funcs = data;
npy_intp i, n = dimensions[0];
for (i = 0; i < n; i ++)
{
int64_t pixel = *(int64_t *) &args[0][i * steps[0]];
int nside = *(int *) &args[1][i * steps[1]];
double dx = *(double *) &args[2][i * steps[2]];
double dy = *(double *) &args[3][i * steps[3]];
double *lon = (double *) &args[4][i * steps[4]];
double *lat = (double *) &args[5][i * steps[5]];
int64_t xy = INVALID_INDEX;
if (pixel_nside_valid(pixel, nside))
xy = funcs->order_to_xy(pixel, nside);
if (xy >= 0)
healpixl_to_radec(xy, nside, dx, dy, lon, lat);
else
{
*lon = *lat = NPY_NAN;
_npy_set_floatstatus_invalid();
}
}
}
static void lonlat_to_healpix_loop(
char **args, npy_intp *dimensions, npy_intp *steps, void *data)
{
order_funcs *funcs = data;
npy_intp i, n = dimensions[0];
for (i = 0; i < n; i ++)
{
double lon = *(double *) &args[0][i * steps[0]];
double lat = *(double *) &args[1][i * steps[1]];
int nside = *(int *) &args[2][i * steps[2]];
int64_t *pixel = (int64_t *) &args[3][i * steps[3]];
double *dx = (double *) &args[4][i * steps[4]];
double *dy = (double *) &args[5][i * steps[5]];
int64_t xy = INVALID_INDEX;
xy = radec_to_healpixlf(lon, lat, nside, dx, dy);
if (xy >= 0)
*pixel = funcs->xy_to_order(xy, nside);
else {
*pixel = INVALID_INDEX;
*dx = *dy = NPY_NAN;
_npy_set_floatstatus_invalid();
}
}
}
static void nested_to_ring_loop(
char **args, npy_intp *dimensions, npy_intp *steps, void *data)
{
npy_intp i, n = dimensions[0];
for (i = 0; i < n; i ++)
{
int64_t nested = *(int64_t *) &args[0][i * steps[0]];
int nside = *(int *) &args[1][i * steps[1]];
int64_t *ring = (int64_t *) &args[2][i * steps[2]];
int64_t xy = INVALID_INDEX;
if (pixel_nside_valid(nested, nside))
xy = healpixl_nested_to_xy(nested, nside);
if (xy >= 0)
*ring = healpixl_xy_to_ring(xy, nside);
else {
*ring = INVALID_INDEX;
_npy_set_floatstatus_invalid();
}
}
}
static void ring_to_nested_loop(
char **args, npy_intp *dimensions, npy_intp *steps, void *data)
{
npy_intp i, n = dimensions[0];
for (i = 0; i < n; i ++)
{
int64_t ring = *(int64_t *) &args[0][i * steps[0]];
int nside = *(int *) &args[1][i * steps[1]];
int64_t *nested = (int64_t *) &args[2][i * steps[2]];
int64_t xy = INVALID_INDEX;
if (pixel_nside_valid(ring, nside))
xy = healpixl_ring_to_xy(ring, nside);
if (xy >= 0)
*nested = healpixl_xy_to_nested(xy, nside);
else {
*nested = INVALID_INDEX;
_npy_set_floatstatus_invalid();
}
}
}
static void bilinear_interpolation_weights_loop(
char **args, npy_intp *dimensions, npy_intp *steps, void *data)
{
npy_intp i, n = dimensions[0];
for (i = 0; i < n; i ++)
{
double lon = *(double *) &args[0][i * steps[0]];
double lat = *(double *) &args[1][i * steps[1]];
int nside = *(int *) &args[2][i * steps[2]];
int64_t indices[4];
double weights[4];
int j;
interpolate_weights(lon, lat, indices, weights, nside);
for (j = 0; j < 4; j ++)
{
*(int64_t *) &args[3 + j][i * steps[3 + j]] = indices[j];
*(double *) &args[7 + j][i * steps[7 + j]] = weights[j];
}
}
}
static void neighbours_loop(
char **args, npy_intp *dimensions, npy_intp *steps, void *data)
{
order_funcs *funcs = data;
npy_intp i, n = dimensions[0];
for (i = 0; i < n; i ++)
{
int64_t pixel = *(int64_t *) &args[0][i * steps[0]];
int nside = *(int *) &args[1][i * steps[1]];
int64_t neighbours[] = {
INVALID_INDEX, INVALID_INDEX, INVALID_INDEX, INVALID_INDEX,
INVALID_INDEX, INVALID_INDEX, INVALID_INDEX, INVALID_INDEX};
int j;
int64_t xy = INVALID_INDEX;
if (pixel_nside_valid(pixel, nside))
xy = funcs->order_to_xy(pixel, nside);
if (xy >= 0)
healpixl_get_neighbours(xy, neighbours, nside);
for (j = 0; j < 8; j ++)
{
int k = 4 - j;
if (k < 0)
k += 8;
xy = neighbours[k];
if (xy >= 0)
pixel = funcs->xy_to_order(xy, nside);
else {
pixel = INVALID_INDEX;
_npy_set_floatstatus_invalid();
}
*(int64_t *) &args[2 + j][i * steps[2 + j]] = pixel;
}
}
}
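The neighbour reindexing in the loop above (``k = 4 - j``, wrapped back into ``[0, 8)``) is a fixed permutation of the eight slots returned by ``healpixl_get_neighbours``. Sketched in Python (the helper name is illustrative):

```python
def remap_neighbours(neighbours):
    """Reorder an 8-element neighbour list as the C loop above does.

    output[j] = input[(4 - j) % 8], i.e. start from slot 4 and walk the
    ring in the opposite direction.
    """
    return [neighbours[(4 - j) % 8] for j in range(8)]
```

Applying it to the identity list ``[0, 1, ..., 7]`` shows the permutation directly.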
static PyObject *healpix_cone_search(
PyObject *self, PyObject *args, PyObject *kwargs)
{
PyObject *result;
static char *kws[] = {"lon", "lat", "radius", "nside", "order", NULL};
double lon, lat, radius;
int nside;
char *order;
int64_t *indices, n_indices;
int64_t *result_data;
npy_intp dims[1];
if (!PyArg_ParseTupleAndKeywords(
args, kwargs, "dddis", kws, &lon, &lat, &radius, &nside, &order))
return NULL;
n_indices = healpix_rangesearch_radec_simple(
lon, lat, radius, nside, 0, &indices);
if (!indices)
{
PyErr_SetString(
PyExc_RuntimeError, "healpix_rangesearch_radec_simple failed");
return NULL;
}
dims[0] = n_indices;
result = PyArray_SimpleNew(1, dims, NPY_INT64);
if (result)
{
result_data = PyArray_DATA((PyArrayObject *) result);
if (strcmp(order, "nested") == 0)
{
int i;
for (i = 0; i < n_indices; i ++)
result_data[i] = healpixl_xy_to_nested(indices[i], nside);
} else {
int i;
for (i = 0; i < n_indices; i ++)
result_data[i] = healpixl_xy_to_ring(indices[i], nside);
}
}
free(indices);
return result;
}
static PyMethodDef methods[] = {
{"healpix_cone_search", (PyCFunction) healpix_cone_search, METH_VARARGS | METH_KEYWORDS, NULL},
{NULL, NULL, 0, NULL}
};
static PyModuleDef moduledef = {
PyModuleDef_HEAD_INIT,
"_core", NULL, -1, methods
};
static PyUFuncGenericFunction
healpix_to_lonlat_loops [] = {healpix_to_lonlat_loop},
lonlat_to_healpix_loops [] = {lonlat_to_healpix_loop},
nested_to_ring_loops [] = {nested_to_ring_loop},
ring_to_nested_loops [] = {ring_to_nested_loop},
bilinear_interpolation_weights_loops[] = {bilinear_interpolation_weights_loop},
neighbours_loops [] = {neighbours_loop};
static char
healpix_to_lonlat_types[] = {
NPY_INT64, NPY_INT, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE},
lonlat_to_healpix_types[] = {
NPY_DOUBLE, NPY_DOUBLE, NPY_INT, NPY_INT64, NPY_DOUBLE, NPY_DOUBLE},
healpix_to_healpix_types[] = {
NPY_INT64, NPY_INT, NPY_INT64},
bilinear_interpolation_weights_types[] = {
NPY_DOUBLE, NPY_DOUBLE, NPY_INT,
NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64,
NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE, NPY_DOUBLE},
neighbours_types[] = {
NPY_INT64, NPY_INT,
NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64,
NPY_INT64, NPY_INT64, NPY_INT64, NPY_INT64};
PyMODINIT_FUNC PyInit__core(void)
{
PyObject *module;
import_array();
import_umath();
module = PyModule_Create(&moduledef);
PyModule_AddObject(
module, "healpix_nested_to_lonlat", PyUFunc_FromFuncAndData(
healpix_to_lonlat_loops, nested_ufunc_data,
healpix_to_lonlat_types, 1, 4, 2, PyUFunc_None,
"healpix_nested_to_lonlat", NULL, 0));
PyModule_AddObject(
module, "healpix_ring_to_lonlat", PyUFunc_FromFuncAndData(
healpix_to_lonlat_loops, ring_ufunc_data,
healpix_to_lonlat_types, 1, 4, 2, PyUFunc_None,
"healpix_ring_to_lonlat", NULL, 0));
PyModule_AddObject(
module, "lonlat_to_healpix_nested", PyUFunc_FromFuncAndData(
lonlat_to_healpix_loops, nested_ufunc_data,
lonlat_to_healpix_types, 1, 3, 3, PyUFunc_None,
"lonlat_to_healpix_nested", NULL, 0));
PyModule_AddObject(
module, "lonlat_to_healpix_ring", PyUFunc_FromFuncAndData(
lonlat_to_healpix_loops, ring_ufunc_data,
lonlat_to_healpix_types, 1, 3, 3, PyUFunc_None,
"lonlat_to_healpix_ring", NULL, 0));
PyModule_AddObject(
module, "nested_to_ring", PyUFunc_FromFuncAndData(
nested_to_ring_loops, no_ufunc_data,
healpix_to_healpix_types, 1, 2, 1, PyUFunc_None,
"nested_to_ring", NULL, 0));
PyModule_AddObject(
module, "ring_to_nested", PyUFunc_FromFuncAndData(
ring_to_nested_loops, no_ufunc_data,
healpix_to_healpix_types, 1, 2, 1, PyUFunc_None,
"ring_to_nested", NULL, 0));
PyModule_AddObject(
module, "bilinear_interpolation_weights", PyUFunc_FromFuncAndData(
bilinear_interpolation_weights_loops, no_ufunc_data,
bilinear_interpolation_weights_types, 1, 3, 8, PyUFunc_None,
"bilinear_interpolation_weights", NULL, 0));
PyModule_AddObject(
module, "neighbours_nested", PyUFunc_FromFuncAndData(
neighbours_loops, nested_ufunc_data,
neighbours_types, 1, 2, 8, PyUFunc_None,
"neighbours_nested", NULL, 0));
PyModule_AddObject(
module, "neighbours_ring", PyUFunc_FromFuncAndData(
neighbours_loops, ring_ufunc_data,
neighbours_types, 1, 2, 8, PyUFunc_None,
"neighbours_ring", NULL, 0));
return module;
}
astropy-healpix-0.5/astropy_healpix/bench.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""Benchmarks for this package.
To run all benchmarks and print a report to the console::
python -m astropy_healpix.bench
You can also run the benchmarks first, save the results dict
to disk as a JSON file (or share it with others) and then
print the results later, or compare them with other results.
Note that this is not very comprehensive / flexible.
If your application depends on performance of HEALPix computations,
you should write benchmarks with cases relevant for that application
and check if HEALPix computations are really the bottleneck and if
this package is fast enough for you or not.
"""
import timeit
from astropy.table import Table
# NOTE: If healpy is installed, we use it in the benchmarks, but healpy is not
# a formal dependency of astropy-healpix.
try:
import healpy as hp # noqa
except ImportError:
HEALPY_INSTALLED = False
else:
HEALPY_INSTALLED = True
# Copied from https://github.com/kwgoodman/bottleneck/blob/master/bottleneck/benchmark/autotimeit.py
def autotimeit(stmt, setup='pass', repeat=3, mintime=0.2):
timer = timeit.Timer(stmt, setup)
number, time1 = autoscaler(timer, mintime)
time2 = timer.repeat(repeat=repeat - 1, number=number)
return min(time2 + [time1]) / number
# Copied from https://github.com/kwgoodman/bottleneck/blob/master/bottleneck/benchmark/autotimeit.py
def autoscaler(timer, mintime):
number = 1
for i in range(12):
time = timer.timeit(number)
if time > mintime:
return number, time
number *= 10
raise RuntimeError('function is too fast to test')
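Taken together, the two helpers above implement an auto-scaling timer: grow the per-run count until one run takes long enough to measure, then report the best per-call time. A self-contained sketch of the same idea (``autotime`` is illustrative; a small ``mintime`` keeps the example fast):

```python
import timeit


def autotime(stmt, setup='pass', repeat=3, mintime=0.001):
    """Best per-call time for ``stmt``, with an auto-scaled run count."""
    timer = timeit.Timer(stmt, setup)
    number = 1
    for _ in range(12):
        time1 = timer.timeit(number)
        if time1 > mintime:
            break  # one batch of `number` calls is now measurable
        number *= 10
    else:
        raise RuntimeError('function is too fast to test')
    # Take the best of `repeat` batches to reduce scheduling noise
    times = timer.repeat(repeat=repeat - 1, number=number)
    return min(times + [time1]) / number
```

``autotime('sum(range(100))')`` returns a small positive per-call time in seconds.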
def get_import(package, function):
if package == 'astropy_healpix':
return f'from astropy_healpix.healpy import {function}'
else:
return f'from healpy import {function}'
def bench_pix2ang(size=None, nside=None, nest=None, package=None, fast=False):
setup = '\n'.join([
get_import(package, 'pix2ang'),
'import numpy as np',
f'nside={nside}',
f'ipix=(np.random.random({size}) * 12 * nside ** 2).astype(np.int64)',
f'nest={nest}'])
stmt = 'pix2ang(nside, ipix, nest)'
return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1)
def bench_ang2pix(size=None, nside=None, nest=None, package=None, fast=False):
setup = '\n'.join([
get_import(package, 'ang2pix'),
'import numpy as np',
f'nside={nside}',
f'lon=360 * np.random.random({size})',
f'lat=180 * np.random.random({size}) - 90',
f'nest={nest}'])
stmt = 'ang2pix(nside, lon, lat, nest, lonlat=True)'
return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1)
def bench_nest2ring(size=None, nside=None, package=None, fast=False):
setup = '\n'.join([
get_import(package, 'nest2ring'),
'import numpy as np',
f'nside={nside}',
f'ipix=(np.random.random({size}) * 12 * nside ** 2).astype(np.int64)'])
stmt = 'nest2ring(nside, ipix)'
return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1)
def bench_ring2nest(size=None, nside=None, package=None, fast=False):
setup = '\n'.join([
get_import(package, 'ring2nest'),
'import numpy as np',
f'nside={nside}',
f'ipix=(np.random.random({size}) * 12 * nside ** 2).astype(np.int64)'])
stmt = 'ring2nest(nside, ipix)'
return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1)
def bench_get_interp_weights(size=None, nside=None, nest=None, package=None, fast=False):
setup = '\n'.join([
get_import(package, 'get_interp_weights'),
'import numpy as np',
f'nside={nside}',
f'lon=360 * np.random.random({size})',
f'lat=180 * np.random.random({size}) - 90',
f'nest={nest}'])
stmt = 'get_interp_weights(nside, lon, lat, nest=nest, lonlat=True)'
return autotimeit(stmt=stmt, setup=setup, repeat=1, mintime=0 if fast else 0.1)
def run_single(name, benchmark, fast=False, **kwargs):
time_self = benchmark(package='astropy_healpix', fast=fast, **kwargs)
results_single = dict(function=name, time_self=time_self, **kwargs)
if HEALPY_INSTALLED:
time_healpy = benchmark(package='healpy', fast=fast, **kwargs)
results_single['time_healpy'] = time_healpy
return results_single
def bench_run(fast=False):
"""Run all benchmarks. Return results as a dict."""
results = []
if fast:
SIZES = [10, 1_000, 100_000]
else:
SIZES = [10, 1_000, 1_000_000]
for nest in [True, False]:
for size in SIZES:
for nside in [1, 128]:
results.append(run_single('pix2ang', bench_pix2ang, fast=fast,
size=size, nside=nside, nest=nest))
for nest in [True, False]:
for size in SIZES:
for nside in [1, 128]:
results.append(run_single('ang2pix', bench_ang2pix, fast=fast,
size=size, nside=nside, nest=nest))
for size in SIZES:
for nside in [1, 128]:
results.append(run_single('nest2ring', bench_nest2ring, fast=fast,
size=size, nside=nside))
for size in SIZES:
for nside in [1, 128]:
results.append(run_single('ring2nest', bench_ring2nest, fast=fast,
size=size, nside=nside))
for nest in [True, False]:
for size in SIZES:
for nside in [1, 128]:
results.append(run_single('get_interp_weights', bench_get_interp_weights,
fast=fast, size=size,
nside=nside, nest=nest))
return results
def bench_report(results):
"""Print a report for given benchmark results to the console."""
table = Table(names=['function', 'nest', 'nside', 'size',
'time_healpy', 'time_self', 'ratio'],
dtype=['S20', bool, int, int, float, float, float], masked=True)
for row in results:
table.add_row(row)
table['time_self'].format = '10.7f'
if HEALPY_INSTALLED:
table['ratio'] = table['time_self'] / table['time_healpy']
table['time_healpy'].format = '10.7f'
table['ratio'].format = '7.2f'
table.pprint(max_lines=-1)
def main(fast=False):
"""Run all benchmarks and print report to the console."""
print('Running benchmarks...\n')
results = bench_run(fast=fast)
bench_report(results)
if __name__ == '__main__':
main()
astropy-healpix-0.5/astropy_healpix/conftest.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import os
import numpy as np
from astropy.version import version as astropy_version
if astropy_version < '3.0':
from astropy.tests.pytest_plugins import *
del pytest_report_header
else:
from pytest_astropy_header.display import PYTEST_HEADER_MODULES, TESTED_VERSIONS
from astropy.tests.helper import enable_deprecations_as_exceptions
enable_deprecations_as_exceptions()
def pytest_configure(config):
config.option.astropy_header = True
PYTEST_HEADER_MODULES.pop('h5py', None)
PYTEST_HEADER_MODULES.pop('Pandas', None)
PYTEST_HEADER_MODULES['Astropy'] = 'astropy'
PYTEST_HEADER_MODULES['healpy'] = 'healpy'
from .version import version, astropy_helpers_version
packagename = os.path.basename(os.path.dirname(__file__))
TESTED_VERSIONS[packagename] = version
TESTED_VERSIONS['astropy_helpers'] = astropy_helpers_version
# Set the Numpy print style to a fixed version to make doctest outputs
# reproducible.
try:
np.set_printoptions(legacy='1.13')
except TypeError:
# On older versions of Numpy, the unrecognized 'legacy' option will
# raise a TypeError.
pass
astropy-healpix-0.5/astropy_healpix/core.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import math
import numpy as np
from astropy import units as u
from astropy.coordinates import Longitude, Latitude
from . import _core
__all__ = [
'nside_to_pixel_area',
'nside_to_pixel_resolution',
'pixel_resolution_to_nside',
'nside_to_npix',
'npix_to_nside',
'level_to_nside',
'nside_to_level',
'level_ipix_to_uniq',
'uniq_to_level_ipix',
'lonlat_to_healpix',
'healpix_to_lonlat',
'bilinear_interpolation_weights',
'interpolate_bilinear_lonlat',
'neighbours',
]
def _restore_shape(*args, **kwargs):
shape = kwargs['shape']
if shape:
if len(args) > 1:
return [arg.reshape(shape) for arg in args]
else:
return args[0].reshape(shape)
else:
if len(args) > 1:
return [arg.item() for arg in args]
else:
return args[0].item()
def _validate_order(order):
# We also support upper-case, to support directly the values
# ORDERING = {'RING', 'NESTED'} in FITS headers
# This is currently undocumented in the docstrings.
if order == 'nested' or order == 'NESTED':
return 'nested'
elif order == 'ring' or order == 'RING':
return 'ring'
else:
raise ValueError("order must be 'nested' or 'ring'")
def _validate_healpix_index(label, healpix_index, nside):
npix = nside_to_npix(nside)
if np.any((healpix_index < 0) | (healpix_index > npix - 1)):
raise ValueError(f'{label} must be in the range [0:{npix}]')
def _validate_offset(label, offset):
offset = np.asarray(offset)
if np.any((offset < 0) | (offset > 1)):
raise ValueError(f'd{label} must be in the range [0:1]')
def _validate_level(level):
if np.any(level < 0):
raise ValueError('level must be positive')
def _validate_nside(nside):
log_2_nside = np.round(np.log2(nside))
if not np.all(2 ** log_2_nside == nside):
raise ValueError('nside must be a power of two')
def _validate_npix(level, ipix):
if not np.all(ipix < (3 << 2*(level + 1))):
raise ValueError('ipix for a specific level must be less than npix')
def level_to_nside(level):
"""
Find the pixel dimensions of the top-level HEALPix tiles.
This is given by ``nside = 2**level``.
Parameters
----------
level : int
The resolution level
Returns
-------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
"""
level = np.asarray(level, dtype=np.int64)
_validate_level(level)
return 2 ** level
def nside_to_level(nside):
"""
Find the HEALPix level for a given nside.
This is given by ``level = log2(nside)``.
This function is the inverse of `level_to_nside`.
Parameters
----------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
Must be a power of two.
Returns
-------
level : int
The level of the HEALPix cells
"""
nside = np.asarray(nside, dtype=np.int64)
_validate_nside(nside)
return np.log2(nside).astype(np.int64)
def uniq_to_level_ipix(uniq):
"""
Convert a HEALPix cell uniq number to its (level, ipix) equivalent.
A uniq number is a 64-bit integer equal to ``ipix + 4 * 4**level``. See the
IVOA MOC recommendation for more details about uniq numbers.
Parameters
----------
uniq : int
The uniq number of a HEALPix cell.
Returns
-------
level, ipix: int, int
The level and index of the HEALPix cell computed from ``uniq``.
"""
uniq = np.asarray(uniq, dtype=np.int64)
level = (np.log2(uniq//4)) // 2
level = level.astype(np.int64)
_validate_level(level)
ipix = uniq - (1 << 2*(level + 1))
_validate_npix(level, ipix)
return level, ipix
def level_ipix_to_uniq(level, ipix):
"""
Convert a level and HEALPix index into a uniq number representing the cell.
This function is the inverse of `uniq_to_level_ipix`.
Parameters
----------
level : int
The level of the HEALPix cell
ipix : int
The index of the HEALPix cell
Returns
-------
uniq : int
The uniq number representing the HEALPix cell.
"""
level = np.asarray(level, dtype=np.int64)
ipix = np.asarray(ipix, dtype=np.int64)
_validate_level(level)
_validate_npix(level, ipix)
return ipix + (1 << 2*(level + 1))
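A pure-Python round-trip sketch of this uniq encoding (the library versions above operate on numpy arrays; ``to_uniq`` and ``from_uniq`` here are illustrative names):

```python
def to_uniq(level, ipix):
    # 4 * 4**level == 2**(2 * (level + 1))
    return ipix + (1 << (2 * (level + 1)))

def from_uniq(uniq):
    # uniq lies in [4**(level + 1), 4 * 4**(level + 1)), so its bit
    # length determines the level directly.
    level = (uniq.bit_length() - 1) // 2 - 1
    ipix = uniq - (1 << (2 * (level + 1)))
    return level, ipix

# Round trip over the first and last pixel of a few levels.
for level in range(4):
    for ipix in (0, 12 * 4**level - 1):
        assert from_uniq(to_uniq(level, ipix)) == (level, ipix)
```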
def nside_to_pixel_area(nside):
"""
Find the area of HEALPix pixels given the pixel dimensions of one of
the 12 'top-level' HEALPix tiles.
Parameters
----------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
Returns
-------
pixel_area : :class:`~astropy.units.Quantity`
The area of the HEALPix pixels
"""
nside = np.asanyarray(nside, dtype=np.int64)
_validate_nside(nside)
npix = 12 * nside * nside
pixel_area = 4 * math.pi / npix * u.sr
return pixel_area
def nside_to_pixel_resolution(nside):
"""
Find the resolution of HEALPix pixels given the pixel dimensions of one of
the 12 'top-level' HEALPix tiles.
Parameters
----------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
Returns
-------
resolution : :class:`~astropy.units.Quantity`
The resolution of the HEALPix pixels
See also
--------
pixel_resolution_to_nside
"""
nside = np.asanyarray(nside, dtype=np.int64)
_validate_nside(nside)
return (nside_to_pixel_area(nside) ** 0.5).to(u.arcmin)
def pixel_resolution_to_nside(resolution, round='nearest'):
"""Find closest HEALPix nside for a given angular resolution.
This function is the inverse of `nside_to_pixel_resolution`,
for the default rounding scheme of ``round='nearest'``.
If you choose ``round='up'``, you'll get HEALPix pixels that
have at least the requested resolution (usually a bit better
due to rounding).
Pixel resolution is defined as square root of pixel area.
Parameters
----------
resolution : `~astropy.units.Quantity`
Angular resolution
round : {'up', 'nearest', 'down'}
Which way to round
Returns
-------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
Always a power of 2.
Examples
--------
>>> from astropy import units as u
>>> from astropy_healpix import pixel_resolution_to_nside
>>> pixel_resolution_to_nside(13 * u.arcmin)
256
>>> pixel_resolution_to_nside(13 * u.arcmin, round='up')
512
"""
resolution = resolution.to(u.rad).value
pixel_area = resolution * resolution
npix = 4 * math.pi / pixel_area
nside = np.sqrt(npix / 12)
# Now we have to round to the closest ``nside``
# Since ``nside`` must be a power of two,
# we first compute the corresponding ``level = log2(nside)``,
# round the level, and then convert back to nside.
level = np.log2(nside)
if round == 'up':
level = np.ceil(level)
elif round == 'nearest':
level = np.round(level)
elif round == 'down':
level = np.floor(level)
else:
raise ValueError(f'Invalid value for round: {round!r}')
# For very low requested resolution (i.e. large angle values), we
# return ``level=0``, i.e. ``nside=1``, i.e. the lowest resolution
# that exists with HEALPix
level = np.clip(level.astype(int), 0, None)
return level_to_nside(level)
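The rounding logic above can be sketched with plain scalar math (``round_nside`` is a hypothetical name; the real function also accepts Quantity and array input):

```python
import math

def round_nside(resolution_rad, how='nearest'):
    # Ideal (fractional) nside for the requested resolution...
    npix = 4 * math.pi / resolution_rad ** 2
    level = math.log2(math.sqrt(npix / 12))
    # ...then round in log2 space so the result stays a power of two,
    # clamped to level >= 0 (nside = 1) for very coarse resolutions.
    rounder = {'up': math.ceil, 'nearest': round, 'down': math.floor}[how]
    return 2 ** max(int(rounder(level)), 0)

# Matches the docstring example above for 13 arcmin.
thirteen_arcmin = (13 / 60) * math.pi / 180
assert round_nside(thirteen_arcmin) == 256
assert round_nside(thirteen_arcmin, how='up') == 512
```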
def nside_to_npix(nside):
"""
Find the number of pixels corresponding to a HEALPix resolution.
Parameters
----------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
Returns
-------
npix : int
The number of pixels in the HEALPix map.
"""
nside = np.asanyarray(nside, dtype=np.int64)
_validate_nside(nside)
return 12 * nside ** 2
def npix_to_nside(npix):
"""
Find the number of pixels on the side of one of the 12 'top-level' HEALPix
tiles given a total number of pixels.
Parameters
----------
npix : int
The number of pixels in the HEALPix map.
Returns
-------
nside : int
The number of pixels on the side of one of the 12 'top-level' HEALPix tiles.
"""
npix = np.asanyarray(npix, dtype=np.int64)
if not np.all(npix % 12 == 0):
raise ValueError('Number of pixels must be divisible by 12')
square_root = np.sqrt(npix / 12)
if not np.all(square_root ** 2 == npix / 12):
raise ValueError('Number of pixels is not of the form 12 * nside ** 2')
return np.round(square_root).astype(int)
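A minimal scalar sketch of the same inversion (``math.isqrt`` stands in for the floating-point square root used above; the name is illustrative):

```python
import math

def npix_to_nside_sketch(npix):
    # npix = 12 * nside**2 must hold exactly.
    if npix % 12:
        raise ValueError('Number of pixels must be divisible by 12')
    nside = math.isqrt(npix // 12)
    if 12 * nside * nside != npix:
        raise ValueError('Number of pixels is not of the form 12 * nside ** 2')
    return nside

assert npix_to_nside_sketch(12) == 1        # nside = 1
assert npix_to_nside_sketch(786432) == 256  # 12 * 256**2
```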
def healpix_to_lonlat(healpix_index, nside, dx=None, dy=None, order='ring'):
"""
Convert HEALPix indices (optionally with offsets) to longitudes/latitudes.
If no offsets (``dx`` and ``dy``) are provided, the coordinates will default
to those at the center of the HEALPix pixels.
Parameters
----------
healpix_index : int or `~numpy.ndarray`
HEALPix indices (as a scalar or array)
nside : int or `~numpy.ndarray`
Number of pixels along the side of each of the 12 top-level HEALPix tiles
dx, dy : float or `~numpy.ndarray`, optional
Offsets inside the HEALPix pixel, which must be in the range [0:1],
where 0.5 is the center of the HEALPix pixels (as scalars or arrays)
order : { 'nested' | 'ring' }, optional
Order of HEALPix pixels
Returns
-------
lon : :class:`~astropy.coordinates.Longitude`
The longitude values
lat : :class:`~astropy.coordinates.Latitude`
The latitude values
"""
_validate_nside(nside)
if _validate_order(order) == 'ring':
func = _core.healpix_ring_to_lonlat
else: # _validate_order(order) == 'nested'
func = _core.healpix_nested_to_lonlat
if dx is None:
dx = 0.5
else:
_validate_offset('x', dx)
if dy is None:
dy = 0.5
else:
_validate_offset('y', dy)
nside = np.asarray(nside, dtype=np.intc)
lon, lat = func(healpix_index, nside, dx, dy)
lon = Longitude(lon, unit=u.rad, copy=False)
lat = Latitude(lat, unit=u.rad, copy=False)
return lon, lat
def lonlat_to_healpix(lon, lat, nside, return_offsets=False, order='ring'):
"""
Convert longitudes/latitudes to HEALPix indices
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude values as :class:`~astropy.units.Quantity`
instances with angle units.
nside : int or `~numpy.ndarray`
Number of pixels along the side of each of the 12 top-level HEALPix tiles
order : { 'nested' | 'ring' }
Order of HEALPix pixels
return_offsets : bool, optional
If `True`, the returned values are the HEALPix pixel indices as well as
``dx`` and ``dy``, the fractional positions inside the pixels. If
`False` (the default), only the HEALPix pixel indices are returned.
Returns
-------
healpix_index : int or `~numpy.ndarray`
The HEALPix indices
dx, dy : `~numpy.ndarray`
Offsets inside the HEALPix pixel in the range [0:1], where 0.5 is the
center of the HEALPix pixels
"""
if _validate_order(order) == 'ring':
func = _core.lonlat_to_healpix_ring
else: # _validate_order(order) == 'nested'
func = _core.lonlat_to_healpix_nested
nside = np.asarray(nside, dtype=np.intc)
lon = lon.to_value(u.rad)
lat = lat.to_value(u.rad)
healpix_index, dx, dy = func(lon, lat, nside)
if return_offsets:
return healpix_index, dx, dy
else:
return healpix_index
def nested_to_ring(nested_index, nside):
"""
Convert a HEALPix 'nested' index to a HEALPix 'ring' index
Parameters
----------
nested_index : int or `~numpy.ndarray`
Healpix index using the 'nested' ordering
nside : int or `~numpy.ndarray`
Number of pixels along the side of each of the 12 top-level HEALPix tiles
Returns
-------
ring_index : int or `~numpy.ndarray`
Healpix index using the 'ring' ordering
"""
nside = np.asarray(nside, dtype=np.intc)
return _core.nested_to_ring(nested_index, nside)
def ring_to_nested(ring_index, nside):
"""
Convert a HEALPix 'ring' index to a HEALPix 'nested' index
Parameters
----------
ring_index : int or `~numpy.ndarray`
Healpix index using the 'ring' ordering
nside : int or `~numpy.ndarray`
Number of pixels along the side of each of the 12 top-level HEALPix tiles
Returns
-------
nested_index : int or `~numpy.ndarray`
Healpix index using the 'nested' ordering
"""
nside = np.asarray(nside, dtype=np.intc)
return _core.ring_to_nested(ring_index, nside)
def bilinear_interpolation_weights(lon, lat, nside, order='ring'):
"""
Get the four neighbours for each (lon, lat) position and the weight
associated with each one for bilinear interpolation.
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude values as :class:`~astropy.units.Quantity`
instances with angle units.
nside : int
Number of pixels along the side of each of the 12 top-level HEALPix tiles
order : { 'nested' | 'ring' }
Order of HEALPix pixels
Returns
-------
indices : `~numpy.ndarray`
2-D array with shape (4, N) giving the four indices to use for the
interpolation
weights : `~numpy.ndarray`
2-D array with shape (4, N) giving the four weights to use for the
interpolation
"""
lon = lon.to_value(u.rad)
lat = lat.to_value(u.rad)
_validate_nside(nside)
nside = np.asarray(nside, dtype=np.intc)
result = _core.bilinear_interpolation_weights(lon, lat, nside)
indices = np.stack(result[:4])
weights = np.stack(result[4:])
if _validate_order(order) == 'nested':
indices = ring_to_nested(indices, nside)
return indices, weights
def interpolate_bilinear_lonlat(lon, lat, values, order='ring'):
"""
Interpolate values at specific longitudes/latitudes using bilinear interpolation
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude values as :class:`~astropy.units.Quantity` instances
with angle units.
values : `~numpy.ndarray`
Array with the values in each HEALPix pixel. The first dimension should
have length 12 * nside ** 2 (and nside is determined automatically from
this).
order : { 'nested' | 'ring' }
Order of HEALPix pixels
Returns
-------
result : float `~numpy.ndarray`
The interpolated values
"""
nside = npix_to_nside(values.shape[0])
indices, weights = bilinear_interpolation_weights(lon, lat, nside, order=order)
values = values[indices]
# At this point values has shape (N, M) where both N and M might be several
# dimensions, and weights has shape (N,), so we need to transpose in order
# to benefit from broadcasting, then transpose back so that the dimension
# with length 4 is at the start again, ready for summing.
result = (values.T * weights.T).T
return result.sum(axis=0)
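The double-transpose broadcasting trick described in the comment above can be checked in isolation (numpy only; with equal weights the weighted sum reduces to a plain mean):

```python
import numpy as np

values = np.arange(24, dtype=float).reshape(4, 2, 3)  # (4, N=2, M=3)
weights = np.full((4, 2), 0.25)                       # (4, N=2)

# values.T has shape (3, 2, 4) and weights.T has shape (2, 4), so the
# product broadcasts over the trailing axes; transposing back restores
# the leading length-4 axis, ready for summing.
result = (values.T * weights.T).T.sum(axis=0)

assert result.shape == (2, 3)
assert np.allclose(result, values.mean(axis=0))  # equal weights == mean
```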
def neighbours(healpix_index, nside, order='ring'):
"""
Find all the HEALPix pixels that are the neighbours of a HEALPix pixel
Parameters
----------
healpix_index : `~numpy.ndarray`
Array of HEALPix pixels
nside : int
Number of pixels along the side of each of the 12 top-level HEALPix tiles
order : { 'nested' | 'ring' }
Order of HEALPix pixels
Returns
-------
neigh : `~numpy.ndarray`
Array giving the neighbours starting SW and rotating clockwise. This has
one extra dimension compared to ``healpix_index`` - the first dimension -
which is set to 8. For example if healpix_index has shape (2, 3),
``neigh`` has shape (8, 2, 3).
"""
_validate_nside(nside)
nside = np.asarray(nside, dtype=np.intc)
if _validate_order(order) == 'ring':
func = _core.neighbours_ring
else: # _validate_order(order) == 'nested'
func = _core.neighbours_nested
return np.stack(func(healpix_index, nside))
def healpix_cone_search(lon, lat, radius, nside, order='ring'):
"""
Find all the HEALPix pixels within a given radius of a longitude/latitude.
Note that this returns all pixels that overlap, including partially, with
the search cone. This function can only be used for a single lon/lat pair at
a time, since different calls to the function may result in a different
number of matches.
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude to search around
radius : :class:`~astropy.units.Quantity`
The search radius
nside : int
Number of pixels along the side of each of the 12 top-level HEALPix tiles
order : { 'nested' | 'ring' }
Order of HEALPix pixels
Returns
-------
healpix_index : `~numpy.ndarray`
1-D array with all the matching HEALPix pixel indices.
"""
lon = lon.to_value(u.deg)
lat = lat.to_value(u.deg)
radius = radius.to_value(u.deg)
_validate_nside(nside)
order = _validate_order(order)
return _core.healpix_cone_search(lon, lat, radius, nside, order)
def boundaries_lonlat(healpix_index, step, nside, order='ring'):
"""
Return the longitude and latitude of the edges of HEALPix pixels
This returns the longitude and latitude of points along the edge of each
HEALPIX pixel. The number of points returned for each pixel is ``4 * step``,
so setting ``step`` to 1 returns just the corners.
Parameters
----------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix pixels
step : int
The number of steps to take along each edge.
nside : int
Number of pixels along the side of each of the 12 top-level HEALPix tiles
order : { 'nested' | 'ring' }
Order of HEALPix pixels
Returns
-------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude, as 2-D arrays where the first dimension is
the same as the ``healpix_index`` input, and the second dimension has
size ``4 * step``.
"""
healpix_index = np.asarray(healpix_index, dtype=np.int64)
step = int(step)
if step < 1:
raise ValueError('step must be at least 1')
# PERF: this could be optimized by writing a Cython routine to do this to
# avoid allocating temporary arrays
frac = np.linspace(0., 1., step + 1)[:-1]
dx = np.hstack([1 - frac, np.repeat(0, step), frac, np.repeat(1, step)])
dy = np.hstack([np.repeat(1, step), 1 - frac, np.repeat(0, step), frac])
healpix_index, dx, dy = np.broadcast_arrays(healpix_index.reshape(-1, 1), dx, dy)
lon, lat = healpix_to_lonlat(healpix_index.ravel(), nside, dx.ravel(), dy.ravel(), order=order)
lon = lon.reshape(-1, 4 * step)
lat = lat.reshape(-1, 4 * step)
return lon, lat
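The offset construction above can be verified on its own: for a given ``step``, the ``(dx, dy)`` pairs trace the pixel perimeter with ``4 * step`` points (a numpy-only check mirroring the code, not part of the library):

```python
import numpy as np

step = 2
frac = np.linspace(0., 1., step + 1)[:-1]
dx = np.hstack([1 - frac, np.repeat(0, step), frac, np.repeat(1, step)])
dy = np.hstack([np.repeat(1, step), 1 - frac, np.repeat(0, step), frac])

assert len(dx) == len(dy) == 4 * step  # 4 * step points per pixel
# Every point lies on the boundary of the unit square: at least one of
# dx, dy is pinned to 0 or 1.
on_edge = (dx == 0) | (dx == 1) | (dy == 0) | (dy == 1)
assert on_edge.all()
```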
astropy-healpix-0.5/astropy_healpix/healpy.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
This submodule provides a healpy-compatible interface.
"""
import numpy as np
from astropy import units as u
from astropy.coordinates.representation import CartesianRepresentation, UnitSphericalRepresentation
from .core import (nside_to_pixel_resolution, nside_to_pixel_area,
nside_to_npix, npix_to_nside, nested_to_ring, ring_to_nested,
level_to_nside, lonlat_to_healpix, healpix_to_lonlat,
boundaries_lonlat, bilinear_interpolation_weights,
interpolate_bilinear_lonlat)
RAD2DEG = 180 / np.pi
PI_2 = np.pi / 2
__all__ = ['nside2resol',
'nside2pixarea',
'nside2npix',
'npix2nside',
'pix2ang',
'ang2pix',
'pix2vec',
'vec2pix',
'order2nside',
'nest2ring',
'ring2nest',
'boundaries',
'vec2ang',
'ang2vec',
'get_interp_weights',
'get_interp_val']
def _lonlat_to_healpy(lon, lat, lonlat=False):
# We use in-place operations below to avoid making temporary arrays - this
# is safe because the lon/lat arrays returned from healpix_to_lonlat are
# new and not used elsewhere.
if lonlat:
return lon.to(u.deg).value, lat.to(u.deg).value
else:
lat, lon = lat.to(u.rad).value, lon.to(u.rad).value
if np.isscalar(lon):
return PI_2 - lat, lon
else:
lat = np.subtract(PI_2, lat, out=lat)
return lat, lon
def _healpy_to_lonlat(theta, phi, lonlat=False):
# Unlike in _lonlat_to_healpy, we don't use in-place operations since we
# don't want to modify theta and phi since the user may be using them
# elsewhere.
if lonlat:
lon = np.asarray(theta) / RAD2DEG
lat = np.asarray(phi) / RAD2DEG
else:
lat = PI_2 - np.asarray(theta)
lon = np.asarray(phi)
return u.Quantity(lon, u.rad, copy=False), u.Quantity(lat, u.rad, copy=False)
def nside2resol(nside, arcmin=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.nside2resol`."""
resolution = nside_to_pixel_resolution(nside)
if arcmin:
return resolution.to(u.arcmin).value
else:
return resolution.to(u.rad).value
def nside2pixarea(nside, degrees=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.nside2pixarea`."""
area = nside_to_pixel_area(nside)
if degrees:
return area.to(u.deg ** 2).value
else:
return area.to(u.sr).value
def nside2npix(nside):
"""Drop-in replacement for healpy `~healpy.pixelfunc.nside2npix`."""
return nside_to_npix(nside)
def npix2nside(npix):
"""Drop-in replacement for healpy `~healpy.pixelfunc.npix2nside`."""
return npix_to_nside(npix)
def order2nside(order):
"""Drop-in replacement for healpy `~healpy.pixelfunc.order2nside`."""
return level_to_nside(order)
def pix2ang(nside, ipix, nest=False, lonlat=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.pix2ang`."""
lon, lat = healpix_to_lonlat(ipix, nside, order='nested' if nest else 'ring')
return _lonlat_to_healpy(lon, lat, lonlat=lonlat)
def ang2pix(nside, theta, phi, nest=False, lonlat=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.ang2pix`."""
lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat)
return lonlat_to_healpix(lon, lat, nside, order='nested' if nest else 'ring')
def pix2vec(nside, ipix, nest=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.pix2vec`."""
lon, lat = healpix_to_lonlat(ipix, nside, order='nested' if nest else 'ring')
return ang2vec(*_lonlat_to_healpy(lon, lat))
def vec2pix(nside, x, y, z, nest=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.vec2pix`."""
theta, phi = vec2ang(np.transpose([x, y, z]))
# hp.vec2ang() returns raveled arrays, which are 1D.
if np.isscalar(x):
theta = theta.item()
phi = phi.item()
else:
shape = np.shape(x)
theta = theta.reshape(shape)
phi = phi.reshape(shape)
lon, lat = _healpy_to_lonlat(theta, phi)
return lonlat_to_healpix(lon, lat, nside, order='nested' if nest else 'ring')
def nest2ring(nside, ipix):
"""Drop-in replacement for healpy `~healpy.pixelfunc.nest2ring`."""
ipix = np.atleast_1d(ipix).astype(np.int64, copy=False)
return nested_to_ring(ipix, nside)
def ring2nest(nside, ipix):
"""Drop-in replacement for healpy `~healpy.pixelfunc.ring2nest`."""
ipix = np.atleast_1d(ipix).astype(np.int64, copy=False)
return ring_to_nested(ipix, nside)
def boundaries(nside, pix, step=1, nest=False):
"""Drop-in replacement for healpy `~healpy.boundaries`."""
pix = np.asarray(pix)
if pix.ndim > 1:
# For consistency with healpy we only support scalars or 1D arrays
raise ValueError("Array has to be one dimensional")
lon, lat = boundaries_lonlat(pix, step, nside, order='nested' if nest else 'ring')
rep_sph = UnitSphericalRepresentation(lon, lat)
rep_car = rep_sph.to_cartesian().xyz.value.swapaxes(0, 1)
if rep_car.shape[0] == 1:
return rep_car[0]
else:
return rep_car
def vec2ang(vectors, lonlat=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.vec2ang`."""
x, y, z = vectors.transpose()
rep_car = CartesianRepresentation(x, y, z)
rep_sph = rep_car.represent_as(UnitSphericalRepresentation)
return _lonlat_to_healpy(rep_sph.lon.ravel(), rep_sph.lat.ravel(), lonlat=lonlat)
def ang2vec(theta, phi, lonlat=False):
"""Drop-in replacement for healpy `~healpy.pixelfunc.ang2vec`."""
lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat)
rep_sph = UnitSphericalRepresentation(lon, lat)
rep_car = rep_sph.represent_as(CartesianRepresentation)
return rep_car.xyz.value
def get_interp_weights(nside, theta, phi=None, nest=False, lonlat=False):
"""
Drop-in replacement for healpy `~healpy.pixelfunc.get_interp_weights`.
Note, however, that the order of the returned weights and pixels may
differ from healpy's.
"""
# if phi is not given, theta is interpreted as pixel number
if phi is None:
theta, phi = pix2ang(nside, ipix=theta, nest=nest)
lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat)
return bilinear_interpolation_weights(lon, lat, nside, order='nested' if nest else 'ring')
def get_interp_val(m, theta, phi, nest=False, lonlat=False):
"""
Drop-in replacement for healpy `~healpy.pixelfunc.get_interp_val`.
"""
lon, lat = _healpy_to_lonlat(theta, phi, lonlat=lonlat)
return interpolate_bilinear_lonlat(lon, lat, m, order='nested' if nest else 'ring')
astropy-healpix-0.5/astropy_healpix/high_level.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import os
from astropy.coordinates import SkyCoord
from astropy.coordinates.representation import UnitSphericalRepresentation
from .core import (nside_to_pixel_area, nside_to_pixel_resolution,
nside_to_npix, npix_to_nside, healpix_to_lonlat,
lonlat_to_healpix, bilinear_interpolation_weights,
interpolate_bilinear_lonlat, ring_to_nested, nested_to_ring,
healpix_cone_search, boundaries_lonlat, neighbours)
from .utils import parse_input_healpix_data
__all__ = ['HEALPix']
NO_FRAME_MESSAGE = """
No frame was specified when initializing HEALPix, so SkyCoord objects cannot be
returned. Either specify a frame when initializing HEALPix or use the {0}
method.
""".replace(os.linesep, ' ').strip()
class NoFrameError(Exception):
def __init__(self, alternative_method):
super().__init__(NO_FRAME_MESSAGE.format(alternative_method))
class HEALPix:
"""
A HEALPix pixellization.
Parameters
----------
nside : int
Number of pixels along the side of each of the 12 top-level HEALPix tiles
order : { 'nested' | 'ring' }
Order of HEALPix pixels
frame : :class:`~astropy.coordinates.BaseCoordinateFrame`, optional
The celestial coordinate frame of the pixellization. This can be
omitted, in which case the pixellization will not be attached to any
particular celestial frame, and the methods ending in _skycoord will
not work (but the _lonlat methods will still work and continue to
return generic longitudes/latitudes).
"""
def __init__(self, nside=None, order='ring', frame=None):
if nside is None:
raise ValueError('nside has not been set')
self.nside = nside
self.order = order
self.frame = frame
@classmethod
def from_header(cls, input_data, field=0, hdu_in=None, nested=None):
"""
Parameters
----------
input_data : str or `~astropy.io.fits.TableHDU` or `~astropy.io.fits.BinTableHDU` or tuple
The input data to reproject. This can be:
* The name of a HEALPIX FITS file
* A `~astropy.io.fits.TableHDU` or `~astropy.io.fits.BinTableHDU`
instance
* A tuple where the first element is a `~numpy.ndarray` and the
second element is a `~astropy.coordinates.BaseCoordinateFrame`
instance or a string alias for a coordinate frame.
field : int, optional
The column of the table HDU to read the map from (default is the
first column).
hdu_in : int or str, optional
If ``input_data`` is a FITS file, specifies the HDU to use.
(The default HDU for HEALPix data is 1, unlike with image files,
where it is generally 0.)
nested : bool, optional
The order of the healpix_data, either nested (True) or ring (False).
If a FITS file is passed in, this is determined from the header.
Returns
-------
healpix : `~astropy_healpix.HEALPix`
A HEALPix pixellization corresponding to the input data.
"""
array_in, frame, nested = parse_input_healpix_data(
input_data, field=field, hdu_in=hdu_in, nested=nested)
nside = npix_to_nside(len(array_in))
order = 'nested' if nested else 'ring'
return cls(nside=nside, order=order, frame=frame)
@property
def pixel_area(self):
"""
The area of a single HEALPix pixel.
"""
return nside_to_pixel_area(self.nside)
@property
def pixel_resolution(self):
"""
The resolution of a single HEALPix pixel.
"""
return nside_to_pixel_resolution(self.nside)
@property
def npix(self):
"""
The number of pixels in the pixellization of the sphere.
"""
return nside_to_npix(self.nside)
def healpix_to_lonlat(self, healpix_index, dx=None, dy=None):
"""
Convert HEALPix indices (optionally with offsets) to longitudes/latitudes
Parameters
----------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix indices
dx, dy : `~numpy.ndarray`, optional
1-D arrays of offsets inside the HEALPix pixel, which must be in
the range [0:1] (0.5 is the center of the HEALPix pixels). If not
specified, the position at the center of the pixel is used.
Returns
-------
lon : :class:`~astropy.coordinates.Longitude`
The longitude values
lat : :class:`~astropy.coordinates.Latitude`
The latitude values
"""
return healpix_to_lonlat(healpix_index, self.nside, dx=dx, dy=dy, order=self.order)
def lonlat_to_healpix(self, lon, lat, return_offsets=False):
"""
Convert longitudes/latitudes to HEALPix indices (optionally with offsets)
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude values as :class:`~astropy.units.Quantity` instances
with angle units.
return_offsets : bool
If `True`, the returned values are the HEALPix pixel as well as
``dx`` and ``dy``, the fractional positions inside the pixel. If
`False` (the default), only the HEALPix pixel is returned.
Returns
-------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix indices
dx, dy : `~numpy.ndarray`
1-D arrays of offsets inside the HEALPix pixel in the range [0:1] (0.5
is the center of the HEALPix pixels). This is returned if
``return_offsets`` is `True`.
"""
return lonlat_to_healpix(lon, lat, self.nside,
return_offsets=return_offsets, order=self.order)
def nested_to_ring(self, nested_index):
"""
Convert a healpix 'nested' index to a healpix 'ring' index
Parameters
----------
nested_index : `~numpy.ndarray`
Healpix index using the 'nested' ordering
Returns
-------
ring_index : `~numpy.ndarray`
Healpix index using the 'ring' ordering
"""
return nested_to_ring(nested_index, self.nside)
def ring_to_nested(self, ring_index):
"""
Convert a healpix 'ring' index to a healpix 'nested' index
Parameters
----------
ring_index : `~numpy.ndarray`
Healpix index using the 'ring' ordering
Returns
-------
nested_index : `~numpy.ndarray`
Healpix index using the 'nested' ordering
"""
return ring_to_nested(ring_index, self.nside)
def bilinear_interpolation_weights(self, lon, lat):
"""
Get the four neighbours for each (lon, lat) position and the weight
associated with each one for bilinear interpolation.
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude values as
:class:`~astropy.units.Quantity` instances with angle units.
Returns
-------
indices : `~numpy.ndarray`
2-D array with shape (4, N) giving the four indices to use for the
interpolation
weights : `~numpy.ndarray`
2-D array with shape (4, N) giving the four weights to use for the
interpolation
"""
return bilinear_interpolation_weights(lon, lat, self.nside, order=self.order)
def interpolate_bilinear_lonlat(self, lon, lat, values):
"""
Interpolate values at specific longitudes/latitudes using bilinear interpolation
If a position does not have four neighbours, this currently returns NaN.
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude values as :class:`~astropy.units.Quantity` instances
with angle units.
values : `~numpy.ndarray`
1-D array with the values in each HEALPix pixel. This must have a
length of the form 12 * nside ** 2 (and nside is determined
automatically from this).
Returns
-------
result : `~numpy.ndarray`
1-D array of interpolated values
"""
if len(values) != self.npix:
raise ValueError('values must be an array of length {} (got {})'.format(self.npix, len(values)))
return interpolate_bilinear_lonlat(lon, lat, values, order=self.order)
def cone_search_lonlat(self, lon, lat, radius):
"""
Find all the HEALPix pixels within a given radius of a longitude/latitude.
Note that this returns all pixels that overlap, including partially, with
the search cone. This function can only be used for a single lon/lat pair at
a time, since different calls to the function may result in a different
number of matches.
Parameters
----------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude to search around
radius : :class:`~astropy.units.Quantity`
The search radius
Returns
-------
healpix_index : `~numpy.ndarray`
1-D array with all the matching HEALPix pixel indices.
"""
if not lon.isscalar or not lat.isscalar or not radius.isscalar:
raise ValueError('The longitude, latitude and radius must be '
'scalar Quantity objects')
return healpix_cone_search(lon, lat, radius, self.nside, order=self.order)
def boundaries_lonlat(self, healpix_index, step):
"""
Return the longitude and latitude of the edges of HEALPix pixels
This returns the longitude and latitude of points along the edge of each
HEALPIX pixel. The number of points returned for each pixel is ``4 * step``,
so setting ``step`` to 1 returns just the corners.
Parameters
----------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix pixels
step : int
The number of steps to take along each edge.
Returns
-------
lon, lat : :class:`~astropy.units.Quantity`
The longitude and latitude, as 2-D arrays where the first dimension is
the same as the ``healpix_index`` input, and the second dimension has
size ``4 * step``.
"""
return boundaries_lonlat(healpix_index, step, self.nside, order=self.order)
def neighbours(self, healpix_index):
"""
Find all the HEALPix pixels that are the neighbours of a HEALPix pixel
Parameters
----------
healpix_index : `~numpy.ndarray`
Array of HEALPix pixels
Returns
-------
neigh : `~numpy.ndarray`
Array giving the neighbours starting SW and rotating clockwise. This has
one extra dimension compared to ``healpix_index`` - the first dimension -
which is set to 8. For example if healpix_index has shape (2, 3),
``neigh`` has shape (8, 2, 3).
"""
return neighbours(healpix_index, self.nside, order=self.order)
def healpix_to_skycoord(self, healpix_index, dx=None, dy=None):
"""
Convert HEALPix indices (optionally with offsets) to celestial coordinates.
Note that this method requires that a celestial frame was specified when
initializing HEALPix. If you don't know or need the celestial frame, you
can instead use :meth:`~astropy_healpix.HEALPix.healpix_to_lonlat`.
Parameters
----------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix indices
dx, dy : `~numpy.ndarray`, optional
1-D arrays of offsets inside the HEALPix pixel, which must be in
the range [0:1] (0.5 is the center of the HEALPix pixels). If not
specified, the position at the center of the pixel is used.
Returns
-------
coord : :class:`~astropy.coordinates.SkyCoord`
The resulting celestial coordinates
"""
if self.frame is None:
raise NoFrameError("healpix_to_skycoord")
lon, lat = self.healpix_to_lonlat(healpix_index, dx=dx, dy=dy)
representation = UnitSphericalRepresentation(lon, lat, copy=False)
return SkyCoord(self.frame.realize_frame(representation))
def skycoord_to_healpix(self, skycoord, return_offsets=False):
"""
Convert celestial coordinates to HEALPix indices (optionally with offsets).
Note that this method requires that a celestial frame was specified when
initializing HEALPix. If you don't know or need the celestial frame, you
can instead use :meth:`~astropy_healpix.HEALPix.lonlat_to_healpix`.
Parameters
----------
skycoord : :class:`~astropy.coordinates.SkyCoord`
The celestial coordinates to convert
return_offsets : bool
If `True`, the returned values are the HEALPix pixel as well as
``dx`` and ``dy``, the fractional positions inside the pixel. If
`False` (the default), only the HEALPix pixel is returned.
Returns
-------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix indices
dx, dy : `~numpy.ndarray`
1-D arrays of offsets inside the HEALPix pixel in the range [0:1] (0.5
is the center of the HEALPix pixels). This is returned if
``return_offsets`` is `True`.
"""
if self.frame is None:
raise NoFrameError("skycoord_to_healpix")
skycoord = skycoord.transform_to(self.frame)
representation = skycoord.represent_as(UnitSphericalRepresentation)
lon, lat = representation.lon, representation.lat
return self.lonlat_to_healpix(lon, lat, return_offsets=return_offsets)
def interpolate_bilinear_skycoord(self, skycoord, values):
"""
Interpolate values at specific celestial coordinates using bilinear interpolation.
If a position does not have four neighbours, this currently returns NaN.
Note that this method requires that a celestial frame was specified when
initializing HEALPix. If you don't know or need the celestial frame, you
can instead use :meth:`~astropy_healpix.HEALPix.interpolate_bilinear_lonlat`.
Parameters
----------
skycoord : :class:`~astropy.coordinates.SkyCoord`
The celestial coordinates at which to interpolate
values : `~numpy.ndarray`
1-D array with the values in each HEALPix pixel. This must have a
length of the form 12 * nside ** 2 (and nside is determined
automatically from this).
Returns
-------
result : `~numpy.ndarray`
1-D array of interpolated values
"""
if self.frame is None:
raise NoFrameError("interpolate_bilinear_skycoord")
skycoord = skycoord.transform_to(self.frame)
representation = skycoord.represent_as(UnitSphericalRepresentation)
lon, lat = representation.lon, representation.lat
return self.interpolate_bilinear_lonlat(lon, lat, values)
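The interpolation itself reduces to a weighted average of the four neighbouring pixel values, with weights that sum to one. A minimal numpy sketch of that final step (the indices and weights below are illustrative, not computed from real coordinates):

```python
import numpy as np

# Final step of bilinear interpolation: once the four neighbouring
# pixels and their weights are known (as interpolate_bilinear_lonlat
# computes internally), the result is a weighted average.
values = np.full(192, 3.0)                    # one value per pixel (nside=4)
indices = np.array([76, 77, 60, 59])          # hypothetical neighbour pixels
weights = np.array([0.53, 0.43, 0.03, 0.01])  # non-negative, sum to 1

result = np.sum(values[indices] * weights)    # constant map -> 3.0
```

This is also why a constant-valued map interpolates back to the same constant, as the tests further down check.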
def cone_search_skycoord(self, skycoord, radius):
"""
Find all the HEALPix pixels within a given radius of a celestial position.

Note that this returns all pixels that overlap, including partially,
with the search cone. This function can only be used for a single
celestial position at a time, since different calls to the function may
result in a different number of matches.

This method requires that a celestial frame was specified when
initializing HEALPix. If you don't know or need the celestial frame,
you can instead use :meth:`~astropy_healpix.HEALPix.cone_search_lonlat`.

Parameters
----------
skycoord : :class:`~astropy.coordinates.SkyCoord`
The celestial coordinates to use for the cone search
radius : :class:`~astropy.units.Quantity`
The search radius

Returns
-------
healpix_index : `~numpy.ndarray`
1-D array with all the matching HEALPix pixel indices.
"""
if self.frame is None:
raise NoFrameError("cone_search_skycoord")
skycoord = skycoord.transform_to(self.frame)
representation = skycoord.represent_as(UnitSphericalRepresentation)
lon, lat = representation.lon, representation.lat
return self.cone_search_lonlat(lon, lat, radius)

def boundaries_skycoord(self, healpix_index, step):
"""
Return the celestial coordinates of the edges of HEALPix pixels.

This returns the celestial coordinates of points along the edge of each
HEALPix pixel. The number of points returned for each pixel is ``4 * step``,
so setting ``step`` to 1 returns just the corners.

This method requires that a celestial frame was specified when
initializing HEALPix. If you don't know or need the celestial frame,
you can instead use :meth:`~astropy_healpix.HEALPix.boundaries_lonlat`.

Parameters
----------
healpix_index : `~numpy.ndarray`
1-D array of HEALPix pixels
step : int
The number of steps to take along each edge.

Returns
-------
skycoord : :class:`~astropy.coordinates.SkyCoord`
The celestial coordinates of the HEALPix pixel boundaries
"""
if self.frame is None:
raise NoFrameError("boundaries_skycoord")
lon, lat = self.boundaries_lonlat(healpix_index, step)
representation = UnitSphericalRepresentation(lon, lat, copy=False)
return SkyCoord(self.frame.realize_frame(representation))
astropy-healpix-0.5/astropy_healpix/interpolation.c

#include <math.h>
#include "interpolation.h"
#include "healpix.h"
// Old versions of MSVC do not support C99 and therefore
// do not define NAN in math.h.
#ifndef NAN
static const union {
unsigned long integer;
float value;
} type_punned_nan = {0xFFFFFFFFFFFFFFFFul};
#define NAN (type_punned_nan.value)
#endif
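The union fallback above builds a NaN by reinterpreting an all-ones 32-bit pattern as an IEEE 754 float (sign bit set, exponent all ones, non-zero mantissa). The same bit trick can be checked from Python:

```python
import math
import struct

# Reinterpret the all-ones 32-bit pattern as a float, mirroring the
# type-punned union used in the C fallback above.
value = struct.unpack('<f', b'\xff\xff\xff\xff')[0]
assert math.isnan(value)
```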
#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif
void interpolate_weights(double lon, double lat, int64_t *ring_indices,
double *weights, int Nside) {
// Given a longitude and a latitude, Nside, and pre-allocated arrays of 4
// elements ring_indices and weights, find the ring index of the four nearest
// neighbours and the weights to use for each neighbour to interpolate.
int64_t xy_index, npix;
int64_t ring1, ring2, ring3, ring4;
double lon1, lat1, lon2, lat2;
double lon3, lat3, lon4, lat4;
double xfrac1, xfrac2, yfrac, lon_frac;
int ring_number, longitude_index, n_in_ring;
// Find the xy index of the pixel in which the coordinates fall
xy_index = radec_to_healpixl(lon, lat, Nside);
// Find the lon/lat of the center of that pixel
healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon1, &lat1);
// Take into account possible wrapping so that the pixel longitude/latitude
// are close to the requested longitude/latitude
if (lon - lon1 > M_PI)
lon1 += 2 * M_PI;
if (lon1 - lon > M_PI)
lon1 -= 2 * M_PI;
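The two wrap-around checks above shift the pixel longitude by ±2π so that it lies within π of the requested longitude; the same adjustment is repeated for each neighbour below. A Python sketch of the logic (a hypothetical helper, not part of the library):

```python
import math

def wrap_near(pixel_lon, target_lon):
    """Shift pixel_lon by +/- 2*pi so it lands within pi of target_lon."""
    if target_lon - pixel_lon > math.pi:
        pixel_lon += 2 * math.pi
    if pixel_lon - target_lon > math.pi:
        pixel_lon -= 2 * math.pi
    return pixel_lon

# A pixel longitude near 2*pi is brought close to a target near 0.
```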
// Convert to a ring index and decompose into ring number and longitude index
ring1 = healpixl_xy_to_ring(xy_index, Nside);
if (ring1 < 0)
{
int i;
for (i = 0; i < 4; i ++)
{
ring_indices[i] = -1;
weights[i] = NAN;
}
return;
}
healpixl_decompose_ring(ring1, Nside, &ring_number, &longitude_index);
// Figure out how many pixels are in the ring
if (ring_number < Nside) {
n_in_ring = 4 * ring_number;
} else if (ring_number < 3 * Nside) {
n_in_ring = 4 * Nside;
} else {
n_in_ring = (int)(4 * (4 * (int64_t)Nside - (int64_t)ring_number));
}
// We now want to find the next index in the ring so that the point to
// interpolate is between the two. First we check what direction to look in by
// finding the longitude/latitude of the center of the HEALPix pixel.
if (lon < lon1) { // Go to the left
if (longitude_index == 0) {
ring2 = ring1 + n_in_ring - 1;
} else {
ring2 = ring1 - 1;
}
} else { // Go to the right
if (longitude_index == n_in_ring - 1) {
ring2 = ring1 - n_in_ring + 1;
} else {
ring2 = ring1 + 1;
}
}
// Find the lon/lat of the new pixel
xy_index = healpixl_ring_to_xy(ring2, Nside);
healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon2, &lat2);
// Take into account possible wrapping so that the pixel longitude/latitude
// are close to the requested longitude/latitude
if (lon - lon2 > M_PI)
lon2 += 2 * M_PI;
if (lon2 - lon > M_PI)
lon2 -= 2 * M_PI;
// Now check whether we are moving up or down in terms of ring index
if (lat > lat1) { // Move up (0 index is at the top)
ring_number -= 1;
} else { // Move down
ring_number += 1;
}
if (ring_number > 0 && ring_number < 4 * Nside) {
// Now figure out again how many pixels are in the ring
if (ring_number < Nside) {
n_in_ring = 4 * ring_number;
} else if (ring_number < 3 * Nside) {
n_in_ring = 4 * Nside;
} else {
n_in_ring = (int)(4 * (4 * (int64_t)Nside - (int64_t)ring_number));
}
// Now determine the longitude index in which the requested longitude falls.
// In all regions, the longitude elements are spaced by 360 / n_in_ring. For
// convenience we convert the longitude index so that the spacing is 1.
lon_frac = lon * n_in_ring / (2 * M_PI);
// In the equatorial region, the first ring starts at 0.5 and the second at
// 0 (in lon_frac space). The ring number is 1-based and the first ring in
// the equatorial region is even. In this ring we can simply take
// int(lon_frac) to get the longitude index but in the odd rings we need to
// adjust lon_frac
if (n_in_ring == 4 * Nside && ring_number % 2 == 1) { // Equatorial region
lon_frac += 0.5;
}
// Find the longitude index of the closest pixel
longitude_index = (int)lon_frac;
if (longitude_index == n_in_ring) {
longitude_index -= 1;
}
// Find the longitude/latitude and ring index of this pixel
ring3 = healpixl_compose_ring(ring_number, longitude_index, Nside);
xy_index = healpixl_ring_to_xy(ring3, Nside);
healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon3, &lat3);
// Take into account possible wrapping so that the pixel longitude/latitude
// are close to the requested longitude/latitude
if (lon - lon3 > M_PI)
lon3 += 2 * M_PI;
if (lon3 - lon > M_PI)
lon3 -= 2 * M_PI;
// Finally we can find the fourth pixel as before
if (lon < lon3) { // Go to the left
if (longitude_index == 0) {
ring4 = ring3 + n_in_ring - 1;
} else {
ring4 = ring3 - 1;
}
} else { // Go to the right
if (longitude_index == n_in_ring - 1) {
ring4 = ring3 - n_in_ring + 1;
} else {
ring4 = ring3 + 1;
}
}
xy_index = healpixl_ring_to_xy(ring4, Nside);
healpixl_to_radec(xy_index, Nside, 0.5, 0.5, &lon4, &lat4);
// Take into account possible wrapping so that the pixel longitude/latitude
// are close to the requested longitude/latitude
if (lon - lon4 > M_PI)
lon4 += 2 * M_PI;
if (lon4 - lon > M_PI)
lon4 -= 2 * M_PI;
// Determine the interpolation weights
xfrac1 = (lon - lon1) / (lon2 - lon1);
xfrac2 = (lon - lon3) / (lon4 - lon3);
yfrac = (lat - lat1) / (lat3 - lat1);
weights[0] = (1 - xfrac1) * (1 - yfrac);
weights[1] = xfrac1 * (1 - yfrac);
weights[2] = (1 - xfrac2) * yfrac;
weights[3] = xfrac2 * yfrac;
} else {
// In the case where we are inside the four top/bottom-most
// values, we effectively place a value at the pole that
// is the average of the four values, and the interpolation
// is the weighted average of this polar value and the
// value interpolated along the ring.
xfrac1 = (lon - lon1) / (lon2 - lon1);
yfrac = (lat - lat1) / (0.5 * M_PI - lat1);
if (ring_number == 0) {
ring3 = (ring1 + 2) % 4;
ring4 = (ring2 + 2) % 4;
yfrac = (lat - lat1) / (0.5 * M_PI - lat1);
} else {
npix = 12 * (int64_t)Nside * (int64_t)Nside;
ring3 = ((ring1 - (npix - 4)) + 2) % 4 + npix - 4;
ring4 = ((ring2 - (npix - 4)) + 2) % 4 + npix - 4;
yfrac = (lat - lat1) / (-0.5 * M_PI - lat1);
}
weights[0] = (1 - xfrac1) * (1 - yfrac) + 0.25 * yfrac;
weights[1] = xfrac1 * (1 - yfrac) + 0.25 * yfrac;
weights[2] = 0.25 * yfrac;
weights[3] = 0.25 * yfrac;
}
ring_indices[0] = ring1;
ring_indices[1] = ring2;
ring_indices[2] = ring3;
ring_indices[3] = ring4;
}
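The branches above encode the HEALPix ring structure: in the north polar cap (rings 1 to Nside-1) ring r holds 4*r pixels, each ring of the equatorial belt holds 4*Nside, and the south cap mirrors the north. Summed over all 4*Nside - 1 rings this recovers the 12*Nside**2 pixels of the map. A pure-Python sketch of the same count:

```python
def pixels_in_ring(ring_number, nside):
    # Rings are numbered 1 (north pole) to 4*nside - 1 (south pole),
    # matching the ring_number variable in the C code above.
    if ring_number < nside:
        return 4 * ring_number
    elif ring_number < 3 * nside:
        return 4 * nside
    else:
        return 4 * (4 * nside - ring_number)

nside = 8
total = sum(pixels_in_ring(r, nside) for r in range(1, 4 * nside))
assert total == 12 * nside ** 2
```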
astropy-healpix-0.5/astropy_healpix/interpolation.h

#ifndef ASTROPY_HEALPIX_INTERPOLATION_INCL
#define ASTROPY_HEALPIX_INTERPOLATION_INCL
#ifdef _MSC_VER
#if _MSC_VER >= 1600
#include <stdint.h>
#else
#include
#endif
#else
#include <stdint.h>
#endif
void interpolate_weights(double lon, double lat, int64_t *ring_indices,
double *weights, int Nside);
#endif
astropy-healpix-0.5/astropy_healpix/setup_package.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
import os
from distutils.core import Extension
HEALPIX_ROOT = os.path.relpath(os.path.dirname(__file__))
C_FILES = ['bl.c',
'healpix-utils.c',
'healpix.c',
'mathutil.c',
'permutedsort.c',
'qsort_reentrant.c',
'starutil.c']
C_DIR = os.path.join('cextern', 'astrometry.net')
C_DIRS = ['numpy', C_DIR, HEALPIX_ROOT,
os.path.join('cextern', 'numpy')]
def get_extensions():
libraries = []
sources = [os.path.join(C_DIR, filename) for filename in C_FILES]
sources.append(os.path.join(HEALPIX_ROOT, 'interpolation.c'))
sources.append(os.path.join(HEALPIX_ROOT, '_core.c'))
extension = Extension(
name="astropy_healpix._core",
sources=sources,
include_dirs=C_DIRS,
libraries=libraries,
language="c",
extra_compile_args=['-O2'])
return [extension]
astropy-healpix-0.5/astropy_healpix/tests/__init__.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
astropy-healpix-0.5/astropy_healpix/tests/coveragerc

[run]
source = {packagename}
omit =
{packagename}/_astropy_init*
{packagename}/conftest*
{packagename}/cython_version*
{packagename}/setup_package*
{packagename}/*/setup_package*
{packagename}/*/*/setup_package*
{packagename}/tests/*
{packagename}/*/tests/*
{packagename}/*/*/tests/*
{packagename}/version*
[report]
exclude_lines =
# Have to re-enable the standard pragma
pragma: no cover
# Don't complain about packages we have installed
except ImportError
# Don't complain if tests don't hit assertions
raise AssertionError
raise NotImplementedError
# Don't complain about script hooks
def main\(.*\):
# Ignore branches that don't pertain to this version of Python
pragma: py{ignore_python_version}

astropy-healpix-0.5/astropy_healpix/tests/setup_package.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
def get_package_data():
return {
_ASTROPY_PACKAGE_NAME_ + '.tests': ['coveragerc']}
astropy-healpix-0.5/astropy_healpix/tests/test_bench.py

from ..bench import main
def test_bench():
main(fast=True)
astropy-healpix-0.5/astropy_healpix/tests/test_core.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
from itertools import product
import pytest
import numpy as np
from numpy.testing import assert_allclose, assert_equal
from astropy import units as u
from astropy.coordinates import Longitude, Latitude
from ..core import (nside_to_pixel_area, nside_to_pixel_resolution, pixel_resolution_to_nside,
nside_to_npix, npix_to_nside, healpix_to_lonlat,
lonlat_to_healpix, interpolate_bilinear_lonlat,
neighbours, healpix_cone_search, boundaries_lonlat,
level_to_nside, nside_to_level,
nested_to_ring, ring_to_nested,
level_ipix_to_uniq, uniq_to_level_ipix,
bilinear_interpolation_weights)
def test_level_to_nside():
assert level_to_nside(5) == 2 ** 5
with pytest.raises(ValueError) as exc:
level_to_nside(-1)
assert exc.value.args[0] == 'level must be positive'
def test_nside_to_level():
assert nside_to_level(1024) == 10
with pytest.raises(ValueError) as exc:
nside_to_level(511)
assert exc.value.args[0] == 'nside must be a power of two'
def test_level_ipix_to_uniq():
assert 11 + 4*4**0 == level_ipix_to_uniq(0, 11)
assert 62540 + 4*4**15 == level_ipix_to_uniq(15, 62540)
with pytest.raises(ValueError) as exc:
level_ipix_to_uniq(1, 49)
assert exc.value.args[0] == 'ipix for a specific level must be inferior to npix'
@pytest.mark.parametrize("level", [
0, 5, 10, 15, 20, 22, 25, 26, 27, 28, 29
])
def test_uniq_to_level_ipix(level):
npix = 3 << 2*(level + 1)
# Take 10 pixel indices between 0 and npix - 1
size = 10
ipix = np.arange(size, dtype=np.int64) * (npix // size)
level = np.ones(size) * level
level_res, ipix_res = uniq_to_level_ipix(level_ipix_to_uniq(level, ipix))
assert np.all(level_res == level) & np.all(ipix_res == ipix)
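The uniq scheme exercised above packs (level, ipix) into a single integer as uniq = ipix + 4 * 4**level; because 0 <= ipix < 12 * 4**level, the bit length of uniq alone determines the level. A scalar pure-Python sketch of the mapping and its inverse (the library versions are vectorized over arrays):

```python
def to_uniq(level, ipix):
    # Valid pixel indices at a given level are 0 <= ipix < 12 * 4**level.
    assert 0 <= ipix < 12 * 4 ** level
    return ipix + 4 * 4 ** level

def from_uniq(uniq):
    # uniq lies in [2**(2*level + 2), 2**(2*level + 4)), so the bit
    # length pins down the level.
    level = (uniq.bit_length() - 3) // 2
    return level, uniq - 4 * 4 ** level

assert from_uniq(to_uniq(15, 62540)) == (15, 62540)
```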
def test_nside_to_pixel_area():
resolution = nside_to_pixel_area(256)
assert_allclose(resolution.value, 1.5978966540475428e-05)
assert resolution.unit == u.sr
def test_nside_to_pixel_resolution():
resolution = nside_to_pixel_resolution(256)
assert_allclose(resolution.value, 13.741945647269624)
assert resolution.unit == u.arcmin
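The constants asserted above follow from the definitions rather than from tabulated values: the sphere's 4*pi steradians are split into 12 * nside**2 equal-area pixels, and the quoted resolution is the square root of the pixel area converted to arcminutes. A quick plain-math check:

```python
import math

nside = 256
npix = 12 * nside ** 2
pixel_area_sr = 4 * math.pi / npix
resolution_arcmin = math.sqrt(pixel_area_sr) * (180.0 / math.pi) * 60.0

# These reproduce the reference values used in the tests above.
assert abs(pixel_area_sr - 1.5978966540475428e-05) < 1e-15
assert abs(resolution_arcmin - 13.741945647269624) < 1e-6
```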
def test_pixel_resolution_to_nside():
# Check the different rounding options
nside = pixel_resolution_to_nside(13 * u.arcmin, round='nearest')
assert nside == 256
nside = pixel_resolution_to_nside(13 * u.arcmin, round='up')
assert nside == 512
nside = pixel_resolution_to_nside(13 * u.arcmin, round='down')
assert nside == 256
# Check that it works with arrays
nside = pixel_resolution_to_nside([1e3, 10, 1e-3] * u.deg, round='nearest')
assert_equal(nside, [1, 8, 65536])
with pytest.raises(ValueError) as exc:
pixel_resolution_to_nside(13 * u.arcmin, round='peaches')
assert exc.value.args[0] == "Invalid value for round: 'peaches'"
with pytest.raises(AttributeError) as exc:
pixel_resolution_to_nside(13)
assert exc.value.args[0] == "'int' object has no attribute 'to'"
def test_nside_to_npix():
npix = nside_to_npix(4)
assert npix == 192
npix = nside_to_npix([4, 4])
assert_equal(npix, 192)
with pytest.raises(ValueError) as exc:
nside_to_npix(15)
assert exc.value.args[0] == 'nside must be a power of two'
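Several functions above raise 'nside must be a power of two' for invalid nside; a common way to express that validity check (a hypothetical helper, not the library's internal implementation) is the single-set-bit test:

```python
def is_power_of_two(n):
    # A positive integer is a power of two iff exactly one bit is set,
    # in which case n & (n - 1) clears that bit and yields zero.
    return n > 0 and (n & (n - 1)) == 0

assert is_power_of_two(1024)
assert not is_power_of_two(15)
```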
def test_npix_to_nside():
nside = npix_to_nside(192)
assert nside == 4
nside = npix_to_nside([192, 192])
assert_equal(nside, 4)
with pytest.raises(ValueError) as exc:
npix_to_nside(7)
assert exc.value.args[0] == 'Number of pixels must be divisible by 12'
with pytest.raises(ValueError) as exc:
npix_to_nside(12 * 8 * 9)
assert exc.value.args[0] == 'Number of pixels is not of the form 12 * nside ** 2'
# For the following tests, the numerical accuracy of this function is already
# tested in test_cython_api.py, so we focus here on functionality specific to
# the Python functions.
@pytest.mark.parametrize('order', ['nested', 'ring'])
def test_healpix_to_lonlat(order):
lon, lat = healpix_to_lonlat([1, 2, 3], 4, order=order)
assert isinstance(lon, Longitude)
assert isinstance(lat, Latitude)
index = lonlat_to_healpix(lon, lat, 4, order=order)
assert_equal(index, [1, 2, 3])
lon, lat = healpix_to_lonlat([1, 2, 3], 4,
dx=[0.1, 0.2, 0.3],
dy=[0.5, 0.4, 0.7], order=order)
assert isinstance(lon, Longitude)
assert isinstance(lat, Latitude)
index, dx, dy = lonlat_to_healpix(lon, lat, 4, order=order, return_offsets=True)
assert_equal(index, [1, 2, 3])
assert_allclose(dx, [0.1, 0.2, 0.3])
assert_allclose(dy, [0.5, 0.4, 0.7])
def test_healpix_to_lonlat_invalid():
dx = [0.1, 0.4, 0.9]
dy = [0.4, 0.3, 0.2]
with pytest.warns(RuntimeWarning, match='invalid value'):
lon, lat = healpix_to_lonlat([-1, 2, 3], 4)
with pytest.warns(RuntimeWarning, match='invalid value'):
lon, lat = healpix_to_lonlat([192, 2, 3], 4)
with pytest.raises(ValueError) as exc:
lon, lat = healpix_to_lonlat([1, 2, 3], 5)
assert exc.value.args[0] == 'nside must be a power of two'
with pytest.raises(ValueError) as exc:
lon, lat = healpix_to_lonlat([1, 2, 3], 4, order='banana')
assert exc.value.args[0] == "order must be 'nested' or 'ring'"
with pytest.raises(ValueError) as exc:
lon, lat = healpix_to_lonlat([1, 2, 3], 4, dx=[-0.1, 0.4, 0.5], dy=dy)
assert exc.value.args[0] == 'dx must be in the range [0:1]'
with pytest.raises(ValueError) as exc:
lon, lat = healpix_to_lonlat([1, 2, 3], 4, dx=dx, dy=[-0.1, 0.4, 0.5])
assert exc.value.args[0] == 'dy must be in the range [0:1]'
def test_healpix_to_lonlat_shape():
lon, lat = healpix_to_lonlat(2, 8)
assert lon.isscalar and lat.isscalar
lon, lat = healpix_to_lonlat([[1, 2, 3], [3, 4, 4]], 8)
assert lon.shape == (2, 3) and lat.shape == (2, 3)
lon, lat = healpix_to_lonlat([[1], [2], [3]], nside=8, dx=0.2, dy=[[0.1, 0.3]])
assert lon.shape == (3, 2) and lat.shape == (3, 2)
def test_lonlat_to_healpix_shape():
healpix_index = lonlat_to_healpix(2 * u.deg, 3 * u.deg, 8)
assert np.can_cast(healpix_index, np.int64)
lon, lat = np.ones((2, 4)) * u.deg, np.zeros((2, 4)) * u.deg
healpix_index = lonlat_to_healpix(lon, lat, 8)
assert healpix_index.shape == (2, 4)
healpix_index, dx, dy = lonlat_to_healpix(2 * u.deg, 3 * u.deg, 8, return_offsets=True)
assert np.can_cast(healpix_index, np.int64)
assert isinstance(dx, float)
assert isinstance(dy, float)
lon, lat = np.ones((2, 4)) * u.deg, np.zeros((2, 4)) * u.deg
healpix_index, dx, dy = lonlat_to_healpix(lon, lat, 8, return_offsets=True)
assert healpix_index.shape == (2, 4)
assert dx.shape == (2, 4)
assert dy.shape == (2, 4)
@pytest.mark.parametrize('function', [nested_to_ring, ring_to_nested])
def test_nested_ring_shape(function):
index = function(1, 8)
assert np.can_cast(index, np.int64)
index = function([[1, 2, 3], [2, 3, 4]], 8)
assert index.shape == (2, 3)
@pytest.mark.parametrize('order', ['nested', 'ring'])
def test_bilinear_interpolation_weights(order):
indices, weights = bilinear_interpolation_weights(100 * u.deg, 10 * u.deg,
nside=4, order=order)
if order == 'nested':
indices = nested_to_ring(indices, nside=4)
assert_equal(indices, [76, 77, 60, 59])
assert_allclose(weights, [0.532723, 0.426179, 0.038815, 0.002283], atol=1e-6)
def test_bilinear_interpolation_weights_invalid():
with pytest.raises(ValueError) as exc:
bilinear_interpolation_weights(1 * u.deg, 2 * u.deg, nside=5)
assert exc.value.args[0] == 'nside must be a power of two'
with pytest.raises(ValueError) as exc:
bilinear_interpolation_weights(3 * u.deg, 4 * u.deg,
nside=4, order='banana')
assert exc.value.args[0] == "order must be 'nested' or 'ring'"
def test_bilinear_interpolation_weights_shape():
indices, weights = bilinear_interpolation_weights(3 * u.deg, 4 * u.deg, nside=8)
assert indices.shape == (4,)
assert weights.shape == (4,)
indices, weights = bilinear_interpolation_weights([[1, 2, 3], [2, 3, 4]] * u.deg,
[[1, 2, 3], [2, 3, 4]] * u.deg, nside=8)
assert indices.shape == (4, 2, 3)
assert weights.shape == (4, 2, 3)
@pytest.mark.parametrize('order', ['nested', 'ring'])
def test_interpolate_bilinear_lonlat(order):
values = np.ones(192) * 3
result = interpolate_bilinear_lonlat([1, 3, 4] * u.deg, [3, 2, 6] * u.deg,
values, order=order)
assert_allclose(result, [3, 3, 3])
def test_interpolate_bilinear_invalid():
values = np.ones(133)
with pytest.raises(ValueError) as exc:
interpolate_bilinear_lonlat([1, 3, 4] * u.deg, [3, 2, 6] * u.deg, values)
assert exc.value.args[0] == 'Number of pixels must be divisible by 12'
values = np.ones(192)
with pytest.raises(ValueError) as exc:
interpolate_bilinear_lonlat([1, 3, 4] * u.deg, [3, 2, 6] * u.deg,
values, order='banana')
assert exc.value.args[0] == "order must be 'nested' or 'ring'"
result = interpolate_bilinear_lonlat([0, np.nan] * u.deg,
[0, np.nan] * u.deg, values,
order='nested')
assert result.shape == (2,)
assert result[0] == 1
assert np.isnan(result[1])
def test_interpolate_bilinear_lonlat_shape():
values = np.ones(192) * 3
result = interpolate_bilinear_lonlat(3 * u.deg, 4 * u.deg, values)
assert isinstance(result, float)
result = interpolate_bilinear_lonlat([[1, 2, 3], [2, 3, 4]] * u.deg,
[[1, 2, 3], [2, 3, 4]] * u.deg, values)
assert result.shape == (2, 3)
values = np.ones((192, 50)) * 3
lon = np.ones((3, 6, 5)) * u.deg
lat = np.ones((3, 6, 5)) * u.deg
result = interpolate_bilinear_lonlat(lon, lat, values)
assert result.shape == (3, 6, 5, 50)
@pytest.mark.parametrize('order', ['nested', 'ring'])
def test_neighbours(order):
neigh = neighbours([1, 2, 3], 4, order=order)
if order == 'nested':
expected = [[0, 71, 2],
[2, 77, 8],
[3, 8, 9],
[6, 9, 12],
[4, 3, 6],
[94, 1, 4],
[91, 0, 1],
[90, 69, 0]]
else:
expected = [[6, 8, 10],
[5, 7, 9],
[0, 1, 2],
[3, 0, 1],
[2, 3, 0],
[8, 10, 4],
[7, 9, 11],
[16, 19, 22]]
assert_equal(neigh, expected)
def test_neighbours_invalid():
with pytest.warns(RuntimeWarning, match='invalid value'):
neighbours([-1, 2, 3], 4)
with pytest.warns(RuntimeWarning, match='invalid value'):
neighbours([192, 2, 3], 4)
with pytest.raises(ValueError) as exc:
neighbours([1, 2, 3], 5)
assert exc.value.args[0] == 'nside must be a power of two'
with pytest.raises(ValueError) as exc:
neighbours([1, 2, 3], 4, order='banana')
assert exc.value.args[0] == "order must be 'nested' or 'ring'"
def test_neighbours_shape():
neigh = neighbours([[1, 2, 3], [2, 3, 4]], 4)
assert neigh.shape == (8, 2, 3)
@pytest.mark.parametrize('order', ['nested', 'ring'])
def test_healpix_cone_search(order):
indices = healpix_cone_search(10 * u.deg, 20 * u.deg, 1 * u.deg,
nside=256, order=order)
assert len(indices) == 80
@pytest.mark.parametrize(('step', 'order'), product([1, 4, 10], ['nested', 'ring']))
def test_boundaries_lonlat(step, order):
lon, lat = boundaries_lonlat([10, 20, 30], step, 256, order=order)
assert lon.shape == (3, 4 * step)
assert lat.shape == (3, 4 * step)
astropy-healpix-0.5/astropy_healpix/tests/test_healpy.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
from itertools import product
import pytest
import numpy as np
from numpy.testing import assert_equal, assert_allclose
from .. import healpy as hp_compat
# NOTE: If healpy is installed, we use it in these tests, but healpy is not a
# formal dependency of astropy-healpix.
hp = pytest.importorskip('healpy')
from hypothesis import given, settings, example
from hypothesis.strategies import integers, floats, booleans
from hypothesis.extra.numpy import arrays
NSIDE_VALUES = [2 ** n for n in range(1, 6)]
@pytest.mark.parametrize(('nside', 'degrees'), product(NSIDE_VALUES, (False, True)))
def test_nside2pixarea(nside, degrees):
actual = hp_compat.nside2pixarea(nside=nside, degrees=degrees)
expected = hp.nside2pixarea(nside=nside, degrees=degrees)
assert_equal(actual, expected)
@pytest.mark.parametrize(('nside', 'arcmin'), product(NSIDE_VALUES, (False, True)))
def test_nside2resol(nside, arcmin):
actual = hp_compat.nside2resol(nside=nside, arcmin=arcmin)
expected = hp.nside2resol(nside=nside, arcmin=arcmin)
assert_equal(actual, expected)
@pytest.mark.parametrize('nside', NSIDE_VALUES)
def test_nside2npix(nside):
actual = hp_compat.nside2npix(nside)
expected = hp.nside2npix(nside)
assert_equal(actual, expected)
@pytest.mark.parametrize('level', [0, 3, 7])
def test_order2nside(level):
actual = hp_compat.order2nside(level)
expected = hp.order2nside(level)
assert_equal(actual, expected)
@pytest.mark.parametrize('npix', [12 * 2 ** (2 * n) for n in range(1, 6)])
def test_npix2nside(npix):
actual = hp_compat.npix2nside(npix)
expected = hp.npix2nside(npix)
assert_equal(actual, expected)
# For the test below, we exclude latitudes that fall exactly on the pole or
# the equator since points that fall at exact boundaries are ambiguous.
@given(nside_pow=integers(0, 29), nest=booleans(), lonlat=booleans(),
lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-10),
lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter(
lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-10))
@settings(max_examples=2000, derandomize=True)
def test_ang2pix(nside_pow, lon, lat, nest, lonlat):
nside = 2 ** nside_pow
if lonlat:
theta, phi = lon, lat
else:
theta, phi = np.pi / 2. - np.radians(lat), np.radians(lon)
ipix1 = hp_compat.ang2pix(nside, theta, phi, nest=nest, lonlat=lonlat)
ipix2 = hp.ang2pix(nside, theta, phi, nest=nest, lonlat=lonlat)
assert ipix1 == ipix2
def test_ang2pix_shape():
ipix = hp_compat.ang2pix(8, 1., 2.)
assert np.can_cast(ipix, np.int64)
ipix = hp_compat.ang2pix(8, [[1., 2.], [3., 4.]], [[1., 2.], [3., 4.]])
assert ipix.shape == (2, 2)
def test_pix2ang_shape():
lon, lat = hp_compat.pix2ang(8, 1)
assert isinstance(lon, float)
assert isinstance(lat, float)
lon, lat = hp_compat.pix2ang(8, [[1, 2, 3], [4, 5, 6]])
assert lon.shape == (2, 3)
assert lat.shape == (2, 3)
@given(nside_pow=integers(0, 29), nest=booleans(), lonlat=booleans(),
frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1))
@settings(max_examples=2000, derandomize=True)
@example(nside_pow=29, frac=0.1666666694606345, nest=False, lonlat=False)
@example(nside_pow=27, frac=2./3., nest=True, lonlat=False)
def test_pix2ang(nside_pow, frac, nest, lonlat):
nside = 2 ** nside_pow
ipix = int(frac * 12 * nside ** 2)
theta1, phi1 = hp_compat.pix2ang(nside, ipix, nest=nest, lonlat=lonlat)
theta2, phi2 = hp.pix2ang(nside, ipix, nest=nest, lonlat=lonlat)
if lonlat:
assert_allclose(phi1, phi2, atol=1e-8)
if abs(phi1) < 90:
assert_allclose(theta1, theta2, atol=1e-10)
else:
assert_allclose(theta1, theta2, atol=1e-8)
if theta1 > 0:
assert_allclose(phi1, phi2, atol=1e-10)
@given(nside_pow=integers(0, 29), nest=booleans(),
x=floats(-1, 1, allow_nan=False, allow_infinity=False).filter(lambda x: abs(x) > 1e-10),
y=floats(-1, 1, allow_nan=False, allow_infinity=False).filter(lambda y: abs(y) > 1e-10),
z=floats(-1, 1, allow_nan=False, allow_infinity=False).filter(lambda z: abs(z) > 1e-10))
@settings(max_examples=2000, derandomize=True)
def test_vec2pix(nside_pow, x, y, z, nest):
nside = 2 ** nside_pow
ipix1 = hp_compat.vec2pix(nside, x, y, z, nest=nest)
ipix2 = hp.vec2pix(nside, x, y, z, nest=nest)
assert ipix1 == ipix2
@given(nside_pow=integers(0, 29), nest=booleans(),
frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1))
@settings(max_examples=2000, derandomize=True)
@example(nside_pow=29, frac=0.1666666694606345, nest=False)
def test_pix2vec(nside_pow, frac, nest):
nside = 2 ** nside_pow
ipix = int(frac * 12 * nside ** 2)
xyz1 = hp_compat.pix2vec(nside, ipix, nest=nest)
xyz2 = hp.pix2vec(nside, ipix, nest=nest)
assert_allclose(xyz1, xyz2, atol=1e-8)
def test_vec2pix_shape():
ipix = hp_compat.vec2pix(8, 1., 2., 3.)
assert np.can_cast(ipix, np.int64)
ipix = hp_compat.vec2pix(8, [[1., 2.], [3., 4.]], [[5., 6.], [7., 8.]], [[9., 10.], [11., 12.]])
assert ipix.shape == (2, 2)
def test_pix2vec_shape():
x, y, z = hp_compat.pix2vec(8, 1)
assert isinstance(x, float)
assert isinstance(y, float)
assert isinstance(z, float)
x, y, z = hp_compat.pix2vec(8, [[1, 2, 3], [4, 5, 6]])
assert x.shape == (2, 3)
assert y.shape == (2, 3)
assert z.shape == (2, 3)
@given(nside_pow=integers(0, 29),
frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1))
@settings(max_examples=2000, derandomize=True)
def test_nest2ring(nside_pow, frac):
nside = 2 ** nside_pow
nest = int(frac * 12 * nside ** 2)
ring1 = hp_compat.nest2ring(nside, nest)
ring2 = hp.nest2ring(nside, nest)
assert ring1 == ring2
@given(nside_pow=integers(0, 29),
frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1))
@settings(max_examples=2000, derandomize=True)
@example(nside_pow=29, frac=0.16666666697710755)
def test_ring2nest(nside_pow, frac):
nside = 2 ** nside_pow
ring = int(frac * 12 * nside ** 2)
nest1 = hp_compat.ring2nest(nside, ring)
nest2 = hp.ring2nest(nside, ring)
assert nest1 == nest2
@given(nside_pow=integers(0, 29), step=integers(1, 10), nest=booleans(),
frac=floats(0, 1, allow_nan=False, allow_infinity=False).filter(lambda x: x < 1))
@settings(max_examples=500, derandomize=True)
def test_boundaries(nside_pow, frac, step, nest):
nside = 2 ** nside_pow
pix = int(frac * 12 * nside ** 2)
b1 = hp_compat.boundaries(nside, pix, step=step, nest=nest)
b2 = hp.boundaries(nside, pix, step=step, nest=nest)
assert_allclose(b1, b2, atol=1e-8)
def test_boundaries_shape():
pix = 1
b1 = hp_compat.boundaries(8, pix, step=4)
b2 = hp.boundaries(8, pix, step=4)
assert b1.shape == b2.shape
pix = [1, 2, 3, 4, 5]
b1 = hp_compat.boundaries(8, pix, step=4)
b2 = hp.boundaries(8, pix, step=4)
assert b1.shape == b2.shape
def not_at_origin(vec):
return np.linalg.norm(vec) > 0
@given(vectors=arrays(float, (3,), elements=floats(-1, 1)).filter(not_at_origin),
lonlat=booleans(), ndim=integers(0, 4))
@settings(max_examples=500, derandomize=True)
def test_vec2ang(vectors, lonlat, ndim):
vectors = np.broadcast_to(vectors, (2,) * ndim + (3,))
theta1, phi1 = hp_compat.vec2ang(vectors, lonlat=lonlat)
theta2, phi2 = hp.vec2ang(vectors, lonlat=lonlat)
# Healpy sometimes returns NaNs for phi (somewhat incorrectly)
phi2 = np.nan_to_num(phi2)
assert_allclose(theta1, theta2, atol=1e-10)
assert_allclose(phi1, phi2, atol=1e-10)
@given(lonlat=booleans(),
lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-10),
lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter(
lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-10))
@settings(max_examples=2000, derandomize=True)
def test_ang2vec(lon, lat, lonlat):
if lonlat:
theta, phi = lon, lat
else:
theta, phi = np.pi / 2. - np.radians(lat), np.radians(lon)
xyz1 = hp_compat.ang2vec(theta, phi, lonlat=lonlat)
xyz2 = hp.ang2vec(theta, phi, lonlat=lonlat)
assert_allclose(xyz1, xyz2, atol=1e-10)
# The following fails, need to investigate:
# @example(nside_pow=29, lon=1.0000000028043134e-05, lat=1.000000000805912e-05, nest=False, lonlat=False)
#
@given(nside_pow=integers(0, 28), nest=booleans(), lonlat=booleans(),
lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-5),
lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter(
lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-5))
@settings(max_examples=500, derandomize=True)
@example(nside_pow=27, lon=1.0000000028043134e-05, lat=-41.81031451395941, nest=False, lonlat=False)
@example(nside_pow=6, lon=1.6345238095238293, lat=69.42254649458224, nest=False, lonlat=False)
@example(nside_pow=15, lon=1.0000000028043134e-05, lat=1.000000000805912e-05, nest=False, lonlat=False)
@example(nside_pow=0, lon=315.0000117809725, lat=1.000000000805912e-05, nest=False, lonlat=False)
@example(nside_pow=0, lon=1.0000000028043134e-05, lat=-41.81031489577861, nest=False, lonlat=False)
@example(nside_pow=0, lon=35.559942143736414, lat=-41.8103252622604, nest=False, lonlat=False)
@example(nside_pow=28, lon=359.9999922886491, lat=-41.81031470486902, nest=False, lonlat=False)
@example(nside_pow=0, lon=1.0000000028043134e-05, lat=-41.81031489577861, nest=False, lonlat=False)
@example(nside_pow=27, lon=1.0000000028043134e-05, lat=-41.81031451395941, nest=False, lonlat=False)
@example(nside_pow=26, lon=359.9999986588955, lat=41.81031489577861, nest=False, lonlat=False)
@example(nside_pow=27, lon=359.999997317791, lat=-41.81031451395943, nest=False, lonlat=False)
@example(nside_pow=27, lon=1.0000000028043134e-05, lat=89.80224636153702, nest=False, lonlat=False)
def test_interp_weights(nside_pow, lon, lat, nest, lonlat):
nside = 2 ** nside_pow
if lonlat:
theta, phi = lon, lat
else:
theta, phi = np.pi / 2. - np.radians(lat), np.radians(lon)
indices1, weights1 = hp_compat.get_interp_weights(nside, theta, phi, nest=nest, lonlat=lonlat)
indices2, weights2 = hp.get_interp_weights(nside, theta, phi, nest=nest, lonlat=lonlat)
# Ignore neighbours with weights < 1e-6 - we have to exclude these otherwise
# in some corner cases there will be different low-probability neighbours.
keep = weights1 > 1e-6
indices1, weights1 = indices1[keep], weights1[keep]
keep = weights2 > 1e-6
indices2, weights2 = indices2[keep], weights2[keep]
order1 = np.argsort(indices1)
order2 = np.argsort(indices2)
assert_equal(indices1[order1], indices2[order2])
assert_allclose(weights1[order1], weights2[order2], atol=1e-5)
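# The filter-then-sort comparison above makes the check order-insensitive:
# negligible neighbours are dropped, and both (index, weight) sets are sorted
# by index before element-wise comparison. The same pattern in miniature
# (plain Python, names illustrative):

```python
def canonical_weights(indices, weights, tol=1e-6):
    # Drop negligible neighbours, then sort (index, weight) pairs by index,
    # so two libraries returning neighbours in different orders compare equal.
    return sorted((i, w) for i, w in zip(indices, weights) if w > tol)

# Same neighbours in a different order, plus one sub-threshold extra entry:
a = canonical_weights([7, 3, 9, 12], [0.40, 0.35, 0.25, 1e-9])
b = canonical_weights([3, 9, 7], [0.35, 0.25, 0.40])
assert a == b
```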
# Make an array that can be useful up to the highest nside tested below
NSIDE_POW_MAX = 8
VALUES = np.random.random(12 * (2 ** NSIDE_POW_MAX) ** 2)
@given(nside_pow=integers(0, NSIDE_POW_MAX), nest=booleans(), lonlat=booleans(),
lon=floats(0, 360, allow_nan=False, allow_infinity=False).filter(lambda lon: abs(lon) > 1e-5),
lat=floats(-90, 90, allow_nan=False, allow_infinity=False).filter(
lambda lat: abs(lat) < 89.99 and abs(lat) > 1e-5))
@settings(max_examples=500, derandomize=True)
def test_interp_val(nside_pow, lon, lat, nest, lonlat):
nside = 2 ** nside_pow
if lonlat:
theta, phi = lon, lat
else:
theta, phi = np.pi / 2. - np.radians(lat), np.radians(lon)
m = VALUES[:12 * nside ** 2]
value1 = hp_compat.get_interp_val(m, theta, phi, nest=nest, lonlat=lonlat)
value2 = hp.get_interp_val(m, theta, phi, nest=nest, lonlat=lonlat)
assert_allclose(value1, value2, rtol=0.1, atol=1.e-10)
# astropy-healpix-0.5/astropy_healpix/tests/test_high_level.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from numpy.testing import assert_allclose, assert_equal
from astropy import units as u
from astropy.coordinates import Longitude, Latitude, Galactic, SkyCoord
from ..high_level import HEALPix
class TestHEALPix:
def setup_class(self):
self.pix = HEALPix(nside=256, order='nested')
def test_pixel_area(self):
pixel_area = self.pix.pixel_area
assert_allclose(pixel_area.value, 1.5978966540475428e-05)
assert pixel_area.unit == u.sr
def test_pixel_resolution(self):
pixel_resolution = self.pix.pixel_resolution
assert_allclose(pixel_resolution.value, 13.741945647269624)
assert pixel_resolution.unit == u.arcmin
def test_npix(self):
assert self.pix.npix == 12 * 256 ** 2
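# The constants asserted above follow from the HEALPix definition: the sphere
# is divided into 12 * nside**2 equal-area pixels, and the resolution is the
# square root of the pixel area (an assumption consistent with the tested
# values). A quick sanity check in plain Python:

```python
import math

nside = 256
npix = 12 * nside ** 2                # total number of equal-area pixels
pixel_area_sr = 4 * math.pi / npix    # 4*pi steradians split evenly
# Resolution as the square root of the area, converted radians -> arcminutes.
resolution_arcmin = math.degrees(math.sqrt(pixel_area_sr)) * 60
```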
# For the following tests, the numerical accuracy of this function is
# already tested in test_cython_api.py, so we focus here on functionality
# specific to the high-level functions.
def test_healpix_to_lonlat(self):
lon, lat = self.pix.healpix_to_lonlat([1, 2, 3])
assert isinstance(lon, Longitude)
assert isinstance(lat, Latitude)
index = self.pix.lonlat_to_healpix(lon, lat)
assert_equal(index, [1, 2, 3])
lon, lat = self.pix.healpix_to_lonlat([1, 2, 3],
dx=[0.1, 0.2, 0.3],
dy=[0.5, 0.4, 0.7])
assert isinstance(lon, Longitude)
assert isinstance(lat, Latitude)
index, dx, dy = self.pix.lonlat_to_healpix(lon, lat, return_offsets=True)
assert_equal(index, [1, 2, 3])
assert_allclose(dx, [0.1, 0.2, 0.3])
assert_allclose(dy, [0.5, 0.4, 0.7])
def test_nested_to_ring(self):
nested_index_1 = [1, 3, 22]
ring_index = self.pix.nested_to_ring(nested_index_1)
nested_index_2 = self.pix.ring_to_nested(ring_index)
assert_equal(nested_index_1, nested_index_2)
def test_bilinear_interpolation_weights(self):
indices, weights = self.pix.bilinear_interpolation_weights([1, 3, 4] * u.deg,
[3, 2, 6] * u.deg)
assert indices.shape == (4, 3)
assert weights.shape == (4, 3)
def test_interpolate_bilinear_lonlat(self):
values = np.ones(12 * 256 ** 2) * 3
result = self.pix.interpolate_bilinear_lonlat([1, 3, 4] * u.deg,
[3, 2, 6] * u.deg, values)
assert_allclose(result, [3, 3, 3])
def test_interpolate_bilinear_lonlat_invalid(self):
values = np.ones(222) * 3
with pytest.raises(ValueError) as exc:
self.pix.interpolate_bilinear_lonlat([1, 3, 4] * u.deg,
[3, 2, 6] * u.deg, values)
assert exc.value.args[0] == 'values must be an array of length 786432 (got 222)'
def test_cone_search_lonlat(self):
lon, lat = 1 * u.deg, 4 * u.deg
result = self.pix.cone_search_lonlat(lon, lat, 1 * u.deg)
assert len(result) == 77
def test_cone_search_lonlat_invalid(self):
lon, lat = [1, 2] * u.deg, [3, 4] * u.deg
with pytest.raises(ValueError) as exc:
self.pix.cone_search_lonlat(lon, lat, 1 * u.deg)
assert exc.value.args[0] == 'The longitude, latitude and radius must be scalar Quantity objects'
def test_boundaries_lonlat(self):
lon, lat = self.pix.boundaries_lonlat([10, 20, 30], 4)
assert lon.shape == (3, 16)
assert lat.shape == (3, 16)
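# The (3, 16) shapes come from sampling each of a pixel's four boundary sides
# ``step`` times; a minimal sketch of that bookkeeping (helper name is ours):

```python
def boundary_shape(n_pixels, step):
    # Each HEALPix pixel boundary has 4 sides, sampled `step` times per side,
    # so boundaries_lonlat returns 4 * step points per pixel.
    return (n_pixels, 4 * step)

assert boundary_shape(3, 4) == (3, 16)
```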
def test_neighbours(self):
neigh = self.pix.neighbours([10, 20, 30])
assert neigh.shape == (8, 3)
class TestCelestialHEALPix:
def setup_class(self):
self.pix = HEALPix(nside=256, order='nested', frame=Galactic())
def test_healpix_from_header(self):
"""Test instantiation from a FITS header.
Notes
-----
We don't need to test all possible options, because
:meth:`~astropy_healpix.HEALPix.from_header` is just a wrapper around
:meth:`~astropy_healpix.utils.parse_input_healpix_data`, which is
tested exhaustively in :mod:`~astropy_healpix.tests.test_utils`.
"""
pix = HEALPix.from_header(
(np.empty(self.pix.npix), 'G'),
nested=self.pix.order == 'nested')
assert pix.nside == self.pix.nside
assert type(pix.frame) == type(self.pix.frame)
assert pix.order == self.pix.order
def test_healpix_to_skycoord(self):
coord = self.pix.healpix_to_skycoord([1, 2, 3])
assert isinstance(coord, SkyCoord)
assert isinstance(coord.frame, Galactic)
# Make sure that the skycoord_to_healpix method converts coordinates
# to the frame of the HEALPix
coord = coord.transform_to('fk5')
index = self.pix.skycoord_to_healpix(coord)
assert_equal(index, [1, 2, 3])
coord = self.pix.healpix_to_skycoord([1, 2, 3],
dx=[0.1, 0.2, 0.3],
dy=[0.5, 0.4, 0.7])
assert isinstance(coord, SkyCoord)
assert isinstance(coord.frame, Galactic)
# Make sure that the skycoord_to_healpix method converts coordinates
# to the frame of the HEALPix
coord = coord.transform_to('fk5')
index, dx, dy = self.pix.skycoord_to_healpix(coord, return_offsets=True)
assert_equal(index, [1, 2, 3])
assert_allclose(dx, [0.1, 0.2, 0.3])
assert_allclose(dy, [0.5, 0.4, 0.7])
def test_interpolate_bilinear_skycoord(self):
values = np.ones(12 * 256 ** 2) * 3
coord = SkyCoord([1, 2, 3] * u.deg, [4, 3, 1] * u.deg, frame='fk4')
result = self.pix.interpolate_bilinear_skycoord(coord, values)
assert_allclose(result, [3, 3, 3])
# Make sure that coordinate system is correctly taken into account
values = np.arange(12 * 256 ** 2) * 3
coord = SkyCoord([1, 2, 3] * u.deg, [4, 3, 1] * u.deg, frame='fk4')
result1 = self.pix.interpolate_bilinear_skycoord(coord, values)
result2 = self.pix.interpolate_bilinear_skycoord(coord.icrs, values)
assert_allclose(result1, result2)
def test_cone_search_skycoord(self):
coord = SkyCoord(1 * u.deg, 4 * u.deg, frame='galactic')
result1 = self.pix.cone_search_skycoord(coord, 1 * u.deg)
assert len(result1) == 77
result2 = self.pix.cone_search_skycoord(coord.icrs, 1 * u.deg)
assert_allclose(result1, result2)
def test_boundaries_skycoord(self):
coord = self.pix.boundaries_skycoord([10, 20, 30], 4)
assert coord.shape == (3, 16)
# astropy-healpix-0.5/astropy_healpix/tests/test_utils.py
import numpy as np
import pytest
from astropy.coordinates import FK5, Galactic
from astropy.io import fits
from ..utils import parse_coord_system, parse_input_healpix_data
def test_parse_coord_system():
frame = parse_coord_system(Galactic())
assert isinstance(frame, Galactic)
frame = parse_coord_system('fk5')
assert isinstance(frame, FK5)
with pytest.raises(ValueError) as exc:
frame = parse_coord_system('e')
assert exc.value.args[0] == "Ecliptic coordinate frame not yet supported"
frame = parse_coord_system('g')
assert isinstance(frame, Galactic)
with pytest.raises(ValueError) as exc:
frame = parse_coord_system('spam')
assert exc.value.args[0] == "Could not determine frame for system=spam"
def test_parse_input_healpix_data(tmpdir):
data = np.arange(3072)
col = fits.Column(array=data, name='flux', format="E")
hdu = fits.BinTableHDU.from_columns([col])
hdu.header['NSIDE'] = 512
hdu.header['COORDSYS'] = "G"
# As HDU
array, coordinate_system, nested = parse_input_healpix_data(hdu)
np.testing.assert_allclose(array, data)
# As filename
filename = tmpdir.join('test.fits').strpath
hdu.writeto(filename)
array, coordinate_system, nested = parse_input_healpix_data(filename)
np.testing.assert_allclose(array, data)
# As array
array, coordinate_system, nested = parse_input_healpix_data((data, "galactic"))
np.testing.assert_allclose(array, data)
# Invalid
with pytest.raises(TypeError) as exc:
parse_input_healpix_data(data)
assert exc.value.args[0] == "input_data should either be an HDU object or a tuple of (array, frame)"
# astropy-healpix-0.5/astropy_healpix/utils.py
import numpy as np
from astropy.io import fits
from astropy.io.fits import TableHDU, BinTableHDU
from astropy.coordinates import BaseCoordinateFrame, frame_transform_graph, Galactic, ICRS
FRAMES = {
'g': Galactic(),
'c': ICRS()
}
def parse_coord_system(system):
if isinstance(system, BaseCoordinateFrame):
return system
elif isinstance(system, str):
system = system.lower()
if system == 'e':
raise ValueError("Ecliptic coordinate frame not yet supported")
elif system in FRAMES:
return FRAMES[system]
else:
system_new = frame_transform_graph.lookup_name(system)
if system_new is None:
raise ValueError(f"Could not determine frame for system={system}")
else:
return system_new()
def parse_input_healpix_data(input_data, field=0, hdu_in=None, nested=None):
"""
Parse input HEALPIX data to return a Numpy array and coordinate frame object.
"""
if isinstance(input_data, (TableHDU, BinTableHDU)):
data = input_data.data
header = input_data.header
coordinate_system_in = parse_coord_system(header['COORDSYS'])
array_in = data[data.columns[field].name].ravel()
if 'ORDERING' in header:
nested = header['ORDERING'].lower() == 'nested'
elif isinstance(input_data, str):
hdu = fits.open(input_data)[hdu_in if hdu_in is not None else 1]
return parse_input_healpix_data(hdu, field=field, nested=nested)
elif isinstance(input_data, tuple) and isinstance(input_data[0], np.ndarray):
array_in = input_data[0]
coordinate_system_in = parse_coord_system(input_data[1])
else:
raise TypeError("input_data should either be an HDU object or a tuple of (array, frame)")
return array_in, coordinate_system_in, nested
# astropy-healpix-0.5/astropy_healpix/version.py
# Autogenerated by Astropy-affiliated package astropy_healpix's setup.py on 2019-11-25 15:34:39 UTC
import datetime
version = "0.5"
githash = "09192c016185fa9af35d876f8364dc0af19a825a"
major = 0
minor = 5
bugfix = 0
version_info = (major, minor, bugfix)
release = True
timestamp = datetime.datetime(2019, 11, 25, 15, 34, 39)
debug = False
astropy_helpers_version = "3.2.2"
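# The version_info tuple above supports simple ordered comparisons (the
# astropy-helpers changelog notes this, e.g. ``version_info > (2,0,1)``).
# A hypothetical helper, not part of the package, for turning a plain dotted
# release string into such a tuple:

```python
def version_string_to_tuple(version):
    """Split a dotted release string like "0.5" into a tuple of ints.

    Illustrative only: tuples of ints compare lexicographically, which is
    what makes checks such as ``version_info >= (0, 5)`` work. Strings with
    non-numeric parts (e.g. "0.5.dev0") would need extra handling.
    """
    return tuple(int(part) for part in version.split('.'))

assert version_string_to_tuple("0.5") == (0, 5)
assert version_string_to_tuple("3.2.2") >= (3, 2)
```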
.. astropy-healpix-0.5/astropy_helpers/CHANGES.rst

astropy-helpers Changelog
*************************
3.2.2 (2019-10-25)
------------------
- Correctly handle main package directory inside namespace package. [#486]
3.2.1 (2019-06-13)
------------------
- Reverting issuing deprecation warning for the ``build_sphinx`` command. [#482]
- Make sure that all data files get included in tar file releases. [#485]
3.2 (2019-05-29)
----------------
- Make sure that ``[options.package_data]`` in setup.cfg is taken into account
when collecting package data. [#453]
- Simplified the code for the custom build_ext command. [#446]
- Avoid importing the astropy package when trying to get the test command
when testing astropy itself. [#450]
- Avoid importing whole package when trying to get version information. Note
that this has also introduced a small API change - ``cython_version`` and
``compiler`` can no longer be imported from the ``package.version`` module
generated by astropy-helpers. Instead, you can import these from
``package.cython_version`` and ``package.compiler_version`` respectively. [#442]
- Make it possible to call ``generate_version_py`` and ``register_commands``
without any arguments, which causes information to be read in from the
``setup.cfg`` file. [#440]
- Simplified setup.py and moved most of the configuration to setup.cfg. [#445]
- Add a new ``astropy_helpers.setup_helpers.setup`` function that does all
the default boilerplate in typical ``setup.py`` files that use
astropy-helpers. [#443]
- Remove ``deprecated``, ``deprecated_attribute``, and ``minversion`` from
``astropy_helpers.utils``. [#447]
- Updated minimum required version of setuptools to 30.3.0. [#440]
- Remove functionality to adjust compilers if a broken compiler is detected.
This is not useful anymore as only a single compiler was previously patched
(now unlikely to be used) and this was only to fix a compilation issue in the
core astropy package. [#421]
- ``sphinx-astropy`` is now a required dependency to build the docs, the
machinery to install it as eggs have been removed. [#474]
3.1.1 (2019-02-22)
------------------
- Moved documentation from README to Sphinx. [#444]
- Fixed broken OpenMP detection when building with ``-coverage``. [#434]
3.1 (2018-12-04)
----------------
- Added extensive documentation about astropy-helpers to the README.rst file. [#416]
- Fixed the compatibility of the build_docs command with Sphinx 1.8 and above. [#413]
- Removing deprecated test_helpers.py file. [#369]
- Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384]
- Remove all sphinx components from ``astropy-helpers``. These are now replaced
by the ``sphinx-astropy`` package in conjunction with the ``astropy-theme-sphinx``,
``sphinx-automodapi``, and ``numpydoc`` packages. [#368]
- openmp_helpers.py: Make add_openmp_flags_if_available() work for clang.
The necessary include, library, and runtime paths now get added to the C test code
used to determine if openmp works.
Autogenerator utility added ``openmp_enabled.is_openmp_enabled()``
which can be called post build to determine state of OpenMP support.
[#382]
- Add version_info tuple to autogenerated version.py. Allows for simple
version checking, i.e. version_info > (2,0,1). [#385]
3.0.2 (2018-06-01)
------------------
- Nothing changed.
3.0.1 (2018-02-22)
------------------
- Nothing changed.
3.0 (2018-02-09)
----------------
- Removing Python 2 support, including 2to3. Packages wishing to keep Python
2 support should NOT update to this version. [#340]
- Removing deprecated _test_compat making astropy a hard dependency for
packages wishing to use the astropy tests machinery. [#314]
- Removing unused 'register' command since packages should be uploaded
with twine and get registered automatically. [#332]
2.0.11 (2019-10-25)
-------------------
- Fixed deprecation warning in sphinx theme. [#493]
- Fixed an issue that caused pytest to crash if it tried to collect
tests. [#488]
2.0.10 (2019-05-29)
-------------------
- Removed ``tocdepthfix`` sphinx extension that worked around a bug in
Sphinx that has long since been fixed. [#475]
- Allow Python dev versions to pass the python version check. [#476]
- Updated bundled version of sphinx-automodapi to v0.11. [#478]
2.0.9 (2019-02-22)
------------------
- Updated bundled version of sphinx-automodapi to v0.10. [#439]
- Updated bundled sphinx extensions version to sphinx-astropy v1.1.1. [#454]
- Include package name in error message for Python version in
``ah_bootstrap.py``. [#441]
2.0.8 (2018-12-04)
------------------
- Fixed compatibility with Sphinx 1.8+. [#428]
- Fixed error that occurs when installing a package in an environment where
``numpy`` is not already installed. [#404]
- Updated bundled version of sphinx-automodapi to v0.9. [#422]
- Updated bundled version of numpydoc to v0.8.0. [#423]
2.0.7 (2018-06-01)
------------------
- Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384]
2.0.6 (2018-02-24)
------------------
- Avoid deprecation warning due to ``exclude=`` keyword in ``setup.py``. [#379]
2.0.5 (2018-02-22)
------------------
- Fix segmentation faults that occurred when the astropy-helpers submodule
was first initialized in packages that also contained Cython code. [#375]
2.0.4 (2018-02-09)
------------------
- Support dotted package names as namespace packages in generate_version_py.
[#370]
- Fix compatibility with setuptools 36.x and above. [#372]
- Fix false negative in add_openmp_flags_if_available when measuring code
coverage with gcc. [#374]
2.0.3 (2018-01-20)
------------------
- Make sure that astropy-helpers 3.x.x is not downloaded on Python 2. [#362, #363]
- The bundled version of sphinx-automodapi has been updated to v0.7. [#365]
- Add --auto-use and --no-auto-use command-line flags to match the
``auto_use`` configuration option, and add an alias
``--use-system-astropy-helpers`` for ``--no-auto-use``. [#366]
2.0.2 (2017-10-13)
------------------
- Added new helper function add_openmp_flags_if_available that can add
OpenMP compilation flags to a C/Cython extension if needed. [#346]
- Update numpydoc to v0.7. [#343]
- The function ``get_git_devstr`` now returns ``'0'`` instead of ``None`` when
no git repository is present. This allows generation of development version
strings that are in a format that ``setuptools`` expects (e.g. "1.1.3.dev0"
instead of "1.1.3.dev"). [#330]
- It is now possible to override generated timestamps to make builds
reproducible by setting the ``SOURCE_DATE_EPOCH`` environment variable [#341]
- Mark Sphinx extensions as parallel-safe. [#344]
- Switch to using mathjax instead of imgmath for local builds. [#342]
- Deprecate ``exclude`` parameter of various functions in setup_helpers since
it could not work as intended. Add new function ``add_exclude_packages`` to
provide intended behavior. [#331]
- Allow custom Sphinx doctest extension to recognize and process standard
doctest directives ``testsetup`` and ``doctest``. [#335]
2.0.1 (2017-07-28)
------------------
- Fix compatibility with Sphinx <1.5. [#326]
2.0 (2017-07-06)
----------------
- Add support for package that lies in a subdirectory. [#249]
- Removing ``compat.subprocess``. [#298]
- Python 3.3 is no longer supported. [#300]
- The 'automodapi' Sphinx extension (and associated dependencies) has now
been moved to a standalone package which can be found at
https://github.com/astropy/sphinx-automodapi - this is now bundled in
astropy-helpers under astropy_helpers.extern.automodapi for
convenience. Version shipped with astropy-helpers is v0.6.
[#278, #303, #309, #323]
- The ``numpydoc`` Sphinx extension has now been moved to
``astropy_helpers.extern``. [#278]
- Fix ``build_docs`` error catching, so it doesn't hide Sphinx errors. [#292]
- Fix compatibility with Sphinx 1.6. [#318]
- Updating ez_setup.py to the last version before it's removal. [#321]
1.3.1 (2017-03-18)
------------------
- Fixed the missing button to hide output in documentation code
blocks. [#287]
- Fixed bug when ``build_docs`` when running with the clean (-l) option. [#289]
- Add alternative location for various intersphinx inventories to fall back
to. [#293]
1.3 (2016-12-16)
----------------
- ``build_sphinx`` has been deprecated in favor of the ``build_docs`` command.
[#246]
- Force the use of Cython's old ``build_ext`` command. A new ``build_ext``
command was added in Cython 0.25, but it does not work with astropy-helpers
currently. [#261]
1.2 (2016-06-18)
----------------
- Added sphinx configuration value ``automodsumm_inherited_members``.
If ``True`` this will include members that are inherited from a base
class in the generated API docs. Defaults to ``False`` which matches
the previous behavior. [#215]
- Fixed ``build_sphinx`` to recognize builds that succeeded but have output
*after* the "build succeeded." statement. This only applies when
``--warnings-returncode`` is given (which is primarily relevant for Travis
documentation builds). [#223]
- Fixed the sphinx extensions to not output a spurious warning for
sphinx versions > 1.4. [#229]
- Add Python version dependent local sphinx inventories that contain
otherwise missing references. [#216]
- ``astropy_helpers`` now require Sphinx 1.3 or later. [#226]
1.1.2 (2016-03-09)
------------------
- The CSS for the sphinx documentation was altered to prevent some text overflow
problems. [#217]
1.1.1 (2015-12-23)
------------------
- Fixed crash in build with ``AttributeError: cython_create_listing`` with
older versions of setuptools. [#209, #210]
1.1 (2015-12-10)
----------------
- The original ``AstropyTest`` class in ``astropy_helpers``, which implements
the ``setup.py test`` command, is deprecated in favor of moving the
implementation of that command closer to the actual Astropy test runner in
``astropy.tests``. Now a dummy ``test`` command is provided solely for
informing users that they need ``astropy`` installed to run the tests
(however, the previous, now deprecated implementation is still provided and
continues to work with older versions of Astropy). See the related issue for
more details. [#184]
- Added a useful new utility function to ``astropy_helpers.utils`` called
``find_data_files``. This is similar to the ``find_packages`` function in
setuptools in that it can be used to search a package for data files
(matching a pattern) that can be passed to the ``package_data`` argument for
``setup()``. See the docstring to ``astropy_helpers.utils.find_data_files``
for more details. [#42]
- The ``astropy_helpers`` module now sets the global ``_ASTROPY_SETUP_``
flag upon import (from within a ``setup.py``) script, so it's not necessary
to have this in the ``setup.py`` script explicitly. If in doubt though,
there's no harm in setting it twice. Putting it in ``astropy_helpers``
just ensures that any other imports that occur during build will have this
flag set. [#191]
- It is now possible to use Cython as a ``setup_requires`` build requirement,
and still build Cython extensions even if Cython wasn't available at the
beginning of the build processes (that is, is automatically downloaded via
setuptools' processing of ``setup_requires``). [#185]
- Moves the ``adjust_compiler`` check into the ``build_ext`` command itself,
so it's only used when actually building extension modules. This also
deprecates the stand-alone ``adjust_compiler`` function. [#76]
- When running the ``build_sphinx`` / ``build_docs`` command with the ``-w``
option, the output from Sphinx is streamed as it runs instead of silently
buffering until the doc build is complete. [#197]
1.0.7 (unreleased)
------------------
- Fix missing import in ``astropy_helpers/utils.py``. [#196]
1.0.6 (2015-12-04)
------------------
- Fixed bug where running ``./setup.py build_sphinx`` could return successfully
even when the build was not successful (and should have returned a non-zero
error code). [#199]
1.0.5 (2015-10-02)
------------------
- Fixed a regression in the ``./setup.py test`` command that was introduced in
v1.0.4.
1.0.4 (2015-10-02)
------------------
- Fixed issue with the sphinx documentation css where the line numbers for code
blocks were not aligned with the code. [#179, #180]
- Fixed crash that could occur when trying to build Cython extension modules
when Cython isn't installed. Normally this still results in a failed build,
but was supposed to provide a useful error message rather than crash
outright (this was a regression introduced in v1.0.3). [#181]
- Fixed a crash that could occur on Python 3 when a working C compiler isn't
found. [#182]
- Quieted warnings about deprecated Numpy API in Cython extensions, when
building Cython extensions against Numpy >= 1.7. [#183, #186]
- Improved support for py.test >= 2.7--running the ``./setup.py test`` command
now copies all doc pages into the temporary test directory as well, so that
all test files have a "common root directory". [#189, #190]
1.0.3 (2015-07-22)
------------------
- Added workaround for sphinx-doc/sphinx#1843, a bug in Sphinx which
prevented descriptor classes with a custom metaclass from being documented
correctly. [#158]
- Added an alias for the ``./setup.py build_sphinx`` command as
``./setup.py build_docs`` which, to a new contributor, should hopefully be
less cryptic. [#161]
- The fonts in graphviz diagrams now match the font of the HTML content. [#169]
- When the documentation is built on readthedocs.org, MathJax will be
used for math rendering. When built elsewhere, the "pngmath"
extension is still used for math rendering. [#170]
- Fix crash when importing astropy_helpers when running with ``python -OO``
[#171]
- The ``build`` and ``build_ext`` stages now correctly recognize the presence
of C++ files in Cython extensions (previously only vanilla C worked). [#173]
1.0.2 (2015-04-02)
------------------
- Various fixes enabling the astropy-helpers Sphinx build command and
Sphinx extensions to work with Sphinx 1.3. [#148]
- More improvement to the ability to handle multiple versions of
astropy-helpers being imported in the same Python interpreter session
in the (somewhat rare) case of nested installs. [#147]
- To better support high resolution displays, use SVG for the astropy
logo and linkout image, falling back to PNGs for browsers that
do not support SVG. [#150, #151]
- Improve ``setup_helpers.get_compiler_version`` to work with more compilers,
and to return more info. This will help fix builds of Astropy on less
common compilers, like Sun C. [#153]
1.0.1 (2015-03-04)
------------------
- Released in concert with v0.4.8 to address the same issues.
0.4.8 (2015-03-04)
------------------
- Improved the ``ah_bootstrap`` script's ability to override existing
installations of astropy-helpers with new versions in the context of
installing multiple packages simultaneously within the same Python
interpreter (e.g. when one package has in its ``setup_requires`` another
package that uses a different version of astropy-helpers). [#144]
- Added a workaround to an issue in matplotlib that can, in rare cases, lead
to a crash when installing packages that import matplotlib at build time.
[#144]
1.0 (2015-02-17)
----------------
- Added new pre-/post-command hook points for ``setup.py`` commands. Now any
package can define code to run before and/or after any ``setup.py`` command
without having to manually subclass that command by adding
``pre_<command_name>_hook`` and ``post_<command_name>_hook`` callables to
the package's ``setup_package.py`` module. See the PR for more details.
[#112]
- The following objects in the ``astropy_helpers.setup_helpers`` module have
been relocated:
- ``get_dummy_distribution``, ``get_distutils_*``, ``get_compiler_option``,
``add_command_option``, ``is_distutils_display_option`` ->
``astropy_helpers.distutils_helpers``
- ``should_build_with_cython``, ``generate_build_ext_command`` ->
``astropy_helpers.commands.build_ext``
- ``AstropyBuildPy`` -> ``astropy_helpers.commands.build_py``
- ``AstropyBuildSphinx`` -> ``astropy_helpers.commands.build_sphinx``
- ``AstropyInstall`` -> ``astropy_helpers.commands.install``
- ``AstropyInstallLib`` -> ``astropy_helpers.commands.install_lib``
- ``AstropyRegister`` -> ``astropy_helpers.commands.register``
- ``get_pkg_version_module`` -> ``astropy_helpers.version_helpers``
- ``write_if_different``, ``import_file``, ``get_numpy_include_path`` ->
``astropy_helpers.utils``
All of these are "soft" deprecations in the sense that they are still
importable from ``astropy_helpers.setup_helpers`` for now, and there is
no (easy) way to produce deprecation warnings when importing these objects
from ``setup_helpers`` rather than directly from the modules they are
defined in. But please consider updating any imports to these objects.
[#110]
- Use of the ``astropy.sphinx.ext.astropyautosummary`` extension is deprecated
for use with Sphinx < 1.2. Instead it should suffice to remove this
extension for the ``extensions`` list in your ``conf.py`` and add the stock
``sphinx.ext.autosummary`` instead. [#131]
0.4.7 (2015-02-17)
------------------
- Fixed incorrect/missing git hash being added to the generated ``version.py``
when creating a release. [#141]
0.4.6 (2015-02-16)
------------------
- Fixed problems related to the automatically generated _compiler
module not being created properly. [#139]
0.4.5 (2015-02-11)
------------------
- Fixed an issue where ah_bootstrap.py could blow up when astropy_helpers'
version number is 1.0.
- Added a workaround for documentation of properties in the rare case
where the class's metaclass has a property of the same name. [#130]
- Fixed an issue on Python 3 where importing a package using astropy-helpers'
generated version.py module would crash when the current working directory
is an empty git repository. [#114, #137]
- Fixed an issue where the "revision count" appended to .dev versions by
the generated version.py did not accurately reflect the revision count for
the package it belongs to, and could be invalid if the current working
directory is an unrelated git repository. [#107, #137]
- Likewise, fixed a confusing warning message that could occur in the same
circumstances as the above issue. [#121, #137]
0.4.4 (2014-12-31)
------------------
- More improvements for building the documentation using Python 3.x. [#100]
- Additional minor fixes to Python 3 support. [#115]
- Updates to support new test features in Astropy [#92, #106]
0.4.3 (2014-10-22)
------------------
- The generated ``version.py`` file now preserves the git hash of installed
copies of the package as well as when building a source distribution. That
is, the git hash of the changeset that was installed/released is preserved.
[#87]
- In the smart resolver, add resolution for class links when they exist in the
intersphinx inventory but not in the mapping of the current package
(e.g. when an affiliated package uses an astropy core class whose
"actual" and "documented" locations differ). [#88]
- Fixed a bug that could occur when running ``setup.py`` for the first time
in a repository that uses astropy-helpers as a submodule:
``AttributeError: 'NoneType' object has no attribute 'mkdtemp'`` [#89]
- Fixed a bug where optional arguments to the ``doctest-skip`` Sphinx
directive were sometimes being left in the generated documentation output.
[#90]
- Improved support for building the documentation using Python 3.x. [#96]
- Avoid error message if .git directory is not present. [#91]
0.4.2 (2014-08-09)
------------------
- Fixed some CSS issues in generated API docs. [#69]
- Fixed the warning message that could be displayed when generating a
version number with some older versions of git. [#77]
- Fixed automodsumm to work with new versions of Sphinx (>= 1.2.2). [#80]
0.4.1 (2014-08-08)
------------------
- Fixed git revision count on systems with git versions older than v1.7.2.
[#70]
- Fixed display of warning text when running a git command fails (previously
the output of stderr was not being decoded properly). [#70]
- The ``--offline`` flag to ``setup.py`` understood by ``ah_bootstrap.py``
now also prevents git from going online to fetch submodule updates. [#67]
- The Sphinx extension for converting issue numbers to links in the changelog
now supports working on arbitrary pages via a new ``conf.py`` setting:
``changelog_links_docpattern``. By default it affects the ``changelog``
and ``whatsnew`` pages in one's Sphinx docs. [#61]
- Fixed crash that could result from users with missing/misconfigured
locale settings. [#58]
- The font used for code examples in the docs is now the
system-defined ``monospace`` font, rather than ``Minaco``, which is
not available on all platforms. [#50]
0.4 (2014-07-15)
----------------
- Initial release of astropy-helpers. See `APE4
`_ for
details of the motivation and design of this package.
- The ``astropy_helpers`` package replaces the following modules in the
``astropy`` package:
- ``astropy.setup_helpers`` -> ``astropy_helpers.setup_helpers``
- ``astropy.version_helpers`` -> ``astropy_helpers.version_helpers``
- ``astropy.sphinx`` -> ``astropy_helpers.sphinx``
These modules should be considered deprecated in ``astropy``, and any new,
non-critical changes to those modules will be made in ``astropy_helpers``
instead. Affiliated packages wishing to make use of those modules (as in the
Astropy package-template) should use the versions from ``astropy_helpers``
instead, and include the ``ah_bootstrap.py`` script in their project, for
bootstrapping the ``astropy_helpers`` package in their setup.py script.
Copyright (c) 2014, Astropy Developers
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
* Neither the name of the Astropy Team nor the names of its contributors may be
used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
astropy-helpers
===============
.. image:: https://travis-ci.org/astropy/astropy-helpers.svg
:target: https://travis-ci.org/astropy/astropy-helpers
.. image:: https://ci.appveyor.com/api/projects/status/rt9161t9mhx02xp7/branch/master?svg=true
:target: https://ci.appveyor.com/project/Astropy/astropy-helpers
.. image:: https://codecov.io/gh/astropy/astropy-helpers/branch/master/graph/badge.svg
:target: https://codecov.io/gh/astropy/astropy-helpers
The **astropy-helpers** package includes many build, installation, and
documentation-related tools used by the Astropy project, but packaged separately
for use by other projects that wish to leverage this work. The motivation behind
this package and details of its implementation are in the accepted
`Astropy Proposal for Enhancement (APE) 4 `_.
Astropy-helpers is not a traditional package in the sense that it is not
intended to be installed directly by users or developers. Instead, it is meant
to be accessed when the ``setup.py`` command is run - see the "Using
astropy-helpers in a package" section in the documentation for how
to do this. For a real-life example of how to implement astropy-helpers in a
project, see the ``setup.py`` and ``setup.cfg`` files of the
`Affiliated package template `_.
For more information, see the documentation at http://astropy-helpers.readthedocs.io
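As a hedged illustration (this fragment is hypothetical, not copied from the package template), a project that uses the bootstrap mechanism described in ``ah_bootstrap.py`` might carry a ``setup.cfg`` along these lines:

```ini
# Hypothetical setup.cfg fragment; the [ah_bootstrap] section and its
# options mirror the configuration that ah_bootstrap.py documents.
[metadata]
name = mypackage

[ah_bootstrap]
auto_use = True
path = astropy_helpers
```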
"""
This bootstrap module contains code for ensuring that the astropy_helpers
package will be importable by the time the setup.py script runs. It also
includes some workarounds to ensure that a recent-enough version of setuptools
is being used for the installation.
This module should be the first thing imported in the setup.py of distributions
that make use of the utilities in astropy_helpers. If the distribution ships
with its own copy of astropy_helpers, this module will first attempt to import
from the shipped copy. However, it will also check PyPI to see if there are
any bug-fix releases on top of the current version that may be useful to get
past platform-specific bugs that have been fixed. When running setup.py, use
the ``--offline`` command-line option to disable the auto-upgrade checks.
When this module is imported or otherwise executed it automatically calls a
main function that attempts to read the project's setup.cfg file, which it
checks for a configuration section called ``[ah_bootstrap]``. The presence of
that section, and the options therein, determines the next step taken: if it
contains an option called ``auto_use`` with a value of ``True``, it will
automatically call this module's `use_astropy_helpers` function (see that
function's docstring for full details).
Otherwise no further action is taken and by default the system-installed version
of astropy-helpers will be used (however, ``ah_bootstrap.use_astropy_helpers``
may be called manually from within the setup.py script).
This behavior can also be controlled using the ``--auto-use`` and
``--no-auto-use`` command-line flags. For clarity, an alias for
``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using
the latter if needed.
Additional options in the ``[ah_bootstrap]`` section of setup.cfg have the same
names as the arguments to `use_astropy_helpers`, and can be used to configure
the bootstrap script when ``auto_use = True``.
See https://github.com/astropy/astropy-helpers for more details, and for the
latest version of this module.
"""
import contextlib
import errno
import io
import locale
import os
import re
import subprocess as sp
import sys
from distutils import log
from distutils.debug import DEBUG
from configparser import ConfigParser, RawConfigParser
import pkg_resources
from setuptools import Distribution
from setuptools.package_index import PackageIndex
# This is the minimum Python version required for astropy-helpers
__minimum_python_version__ = (3, 5)
# TODO: Maybe enable checking for a specific version of astropy_helpers?
DIST_NAME = 'astropy-helpers'
PACKAGE_NAME = 'astropy_helpers'
UPPER_VERSION_EXCLUSIVE = None
# Defaults for other options
DOWNLOAD_IF_NEEDED = True
INDEX_URL = 'https://pypi.python.org/simple'
USE_GIT = True
OFFLINE = False
AUTO_UPGRADE = True
# A list of all the configuration options and their required types
CFG_OPTIONS = [
('auto_use', bool), ('path', str), ('download_if_needed', bool),
('index_url', str), ('use_git', bool), ('offline', bool),
('auto_upgrade', bool)
]
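As a standalone sketch of how such typed options are consumed (mirroring the ``parse_config`` logic below, but with a hypothetical helper name and a reduced option list):

```python
import configparser

# Subset of CFG_OPTIONS, for illustration only
OPTIONS = [('auto_use', bool), ('path', str), ('offline', bool)]

def read_ah_bootstrap_options(text):
    """Parse an [ah_bootstrap] section, honouring each option's type."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    if not cfg.has_section('ah_bootstrap'):
        return {}
    config = {}
    for option, type_ in OPTIONS:
        if not cfg.has_option('ah_bootstrap', option):
            continue
        if type_ is bool:
            # getboolean accepts yes/no, on/off, true/false, 1/0
            config[option] = cfg.getboolean('ah_bootstrap', option)
        else:
            config[option] = cfg.get('ah_bootstrap', option)
    return config

print(read_ah_bootstrap_options(
    "[ah_bootstrap]\nauto_use = True\npath = astropy_helpers\n"))
# {'auto_use': True, 'path': 'astropy_helpers'}
```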
# Start off by parsing the setup.cfg file
_err_help_msg = """
If the problem persists consider installing astropy_helpers manually using pip
(`pip install astropy_helpers`) or by manually downloading the source archive,
extracting it, and installing by running `python setup.py install` from the
root of the extracted source code.
"""
SETUP_CFG = ConfigParser()
if os.path.exists('setup.cfg'):
try:
SETUP_CFG.read('setup.cfg')
except Exception as e:
if DEBUG:
raise
log.error(
"Error reading setup.cfg: {0!r}\n{1} will not be "
"automatically bootstrapped and package installation may fail."
"\n{2}".format(e, PACKAGE_NAME, _err_help_msg))
# We used package_name in the package template for a while instead of name
if SETUP_CFG.has_option('metadata', 'name'):
parent_package = SETUP_CFG.get('metadata', 'name')
elif SETUP_CFG.has_option('metadata', 'package_name'):
parent_package = SETUP_CFG.get('metadata', 'package_name')
else:
parent_package = None
if SETUP_CFG.has_option('options', 'python_requires'):
python_requires = SETUP_CFG.get('options', 'python_requires')
# The python_requires key has a syntax that can be parsed by SpecifierSet
# in the packaging package. However, we don't want to have to depend on that
# package, so instead we can use setuptools (which bundles packaging). We
# have to add 'python' to parse it with Requirement.
from pkg_resources import Requirement
req = Requirement.parse('python' + python_requires)
# We want the Python version as a string, which we can get from the platform module
import platform
# strip off trailing '+' in case this is a dev install of python
python_version = platform.python_version().strip('+')
# allow pre-releases to count as 'new enough'
if not req.specifier.contains(python_version, True):
if parent_package is None:
message = "ERROR: Python {} is required by this package\n".format(req.specifier)
else:
message = "ERROR: Python {} is required by {}\n".format(req.specifier, parent_package)
sys.stderr.write(message)
sys.exit(1)
if sys.version_info < __minimum_python_version__:
if parent_package is None:
message = "ERROR: Python {} or later is required by astropy-helpers\n".format(
__minimum_python_version__)
else:
message = "ERROR: Python {} or later is required by astropy-helpers for {}\n".format(
__minimum_python_version__, parent_package)
sys.stderr.write(message)
sys.exit(1)
_str_types = (str, bytes)
# What follows are several import statements meant to deal with install-time
# issues with either missing or misbehaving packages (including making sure
# setuptools itself is installed):
# Check that setuptools 30.3 or later is present
from distutils.version import LooseVersion
try:
import setuptools
assert LooseVersion(setuptools.__version__) >= LooseVersion('30.3')
except (ImportError, AssertionError):
sys.stderr.write("ERROR: setuptools 30.3 or later is required by astropy-helpers\n")
sys.exit(1)
# typing as a dependency for 1.6.1+ Sphinx causes issues when imported after
# initializing the submodule with ah_bootstrap.py
# See discussion and references in
# https://github.com/astropy/astropy-helpers/issues/302
try:
import typing # noqa
except ImportError:
pass
# Note: The following import is required as a workaround to
# https://github.com/astropy/astropy-helpers/issues/89; if we don't import this
# module now, it will get cleaned up after `run_setup` is called, but that will
# later cause the TemporaryDirectory class defined in it to stop working when
# used later on by setuptools
try:
import setuptools.py31compat # noqa
except ImportError:
pass
# matplotlib can cause problems if it is imported from within a call of
# run_setup(), because in some circumstances it will try to write to the user's
# home directory, resulting in a SandboxViolation. See
# https://github.com/matplotlib/matplotlib/pull/4165
# Making sure matplotlib, if it is available, is imported early in the setup
# process can mitigate this (note importing matplotlib.pyplot has the same
# issue)
try:
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot
except:
# Ignore if this fails for *any* reason
pass
# End compatibility imports...
class _Bootstrapper(object):
"""
Bootstrapper implementation. See ``use_astropy_helpers`` for parameter
documentation.
"""
def __init__(self, path=None, index_url=None, use_git=None, offline=None,
download_if_needed=None, auto_upgrade=None):
if path is None:
path = PACKAGE_NAME
if not (isinstance(path, _str_types) or path is False):
raise TypeError('path must be a string or False')
if not isinstance(path, str):
fs_encoding = sys.getfilesystemencoding()
path = path.decode(fs_encoding) # path to unicode
self.path = path
# Set other option attributes, using defaults where necessary
self.index_url = index_url if index_url is not None else INDEX_URL
self.offline = offline if offline is not None else OFFLINE
# If offline=True, override download and auto-upgrade
if self.offline:
download_if_needed = False
auto_upgrade = False
self.download = (download_if_needed
if download_if_needed is not None
else DOWNLOAD_IF_NEEDED)
self.auto_upgrade = (auto_upgrade
if auto_upgrade is not None else AUTO_UPGRADE)
# If this is a release then the .git directory will not exist so we
# should not use git.
git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git'))
if use_git is None and not git_dir_exists:
use_git = False
self.use_git = use_git if use_git is not None else USE_GIT
# Declared as False by default--later we check if astropy-helpers can be
# upgraded from PyPI, but only if not using a source distribution (as in
# the case of import from a git submodule)
self.is_submodule = False
@classmethod
def main(cls, argv=None):
if argv is None:
argv = sys.argv
config = cls.parse_config()
config.update(cls.parse_command_line(argv))
auto_use = config.pop('auto_use', False)
bootstrapper = cls(**config)
if auto_use:
# Run the bootstrapper, otherwise the setup.py is using the old
# use_astropy_helpers() interface, in which case it will run the
# bootstrapper manually after reconfiguring it.
bootstrapper.run()
return bootstrapper
@classmethod
def parse_config(cls):
if not SETUP_CFG.has_section('ah_bootstrap'):
return {}
config = {}
for option, type_ in CFG_OPTIONS:
if not SETUP_CFG.has_option('ah_bootstrap', option):
continue
if type_ is bool:
value = SETUP_CFG.getboolean('ah_bootstrap', option)
else:
value = SETUP_CFG.get('ah_bootstrap', option)
config[option] = value
return config
@classmethod
def parse_command_line(cls, argv=None):
if argv is None:
argv = sys.argv
config = {}
# For now we just pop recognized ah_bootstrap options out of the
# arg list. This is imperfect; in the unlikely case that a setup.py
# custom command or even custom Distribution class defines an argument
# of the same name then we will break that. However there's a catch22
# here that we can't just do full argument parsing right here, because
# we don't yet know *how* to parse all possible command-line arguments.
if '--no-git' in argv:
config['use_git'] = False
argv.remove('--no-git')
if '--offline' in argv:
config['offline'] = True
argv.remove('--offline')
if '--auto-use' in argv:
config['auto_use'] = True
argv.remove('--auto-use')
if '--no-auto-use' in argv:
config['auto_use'] = False
argv.remove('--no-auto-use')
if '--use-system-astropy-helpers' in argv:
config['auto_use'] = False
argv.remove('--use-system-astropy-helpers')
return config
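In isolation, this flag-popping behaviour can be exercised with a minimal re-implementation of two of the flags (``pop_bootstrap_flags`` is a hypothetical name for this sketch, not part of the module):

```python
def pop_bootstrap_flags(argv):
    """Remove recognized bootstrap flags from argv, returning a config dict."""
    config = {}
    if '--no-git' in argv:
        config['use_git'] = False
        argv.remove('--no-git')
    if '--offline' in argv:
        config['offline'] = True
        argv.remove('--offline')
    return config

argv = ['setup.py', 'build', '--offline']
config = pop_bootstrap_flags(argv)
print(config)  # {'offline': True}
print(argv)    # ['setup.py', 'build'] -- the flag has been consumed
```

Note that the flags are removed from ``argv`` in place, so later (real) argument parsing never sees them.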
def run(self):
strategies = ['local_directory', 'local_file', 'index']
dist = None
# First, remove any previously imported versions of astropy_helpers;
# this is necessary for nested installs where one package's installer
# is installing another package via setuptools.sandbox.run_setup, as in
# the case of setup_requires
for key in list(sys.modules):
try:
if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'):
del sys.modules[key]
except AttributeError:
# Sometimes mysterious non-string things can turn up in
# sys.modules
continue
# Check to see if the path is a submodule
self.is_submodule = self._check_submodule()
for strategy in strategies:
method = getattr(self, 'get_{0}_dist'.format(strategy))
dist = method()
if dist is not None:
break
else:
raise _AHBootstrapSystemExit(
"No source found for the {0!r} package; {0} must be "
"available and importable as a prerequisite to building "
"or installing this package.".format(PACKAGE_NAME))
# This is a bit hacky, but if astropy_helpers was loaded from a
# directory/submodule its Distribution object gets a "precedence" of
# "DEVELOP_DIST". However, in other cases it gets a precedence of
# "EGG_DIST". However, when activing the distribution it will only be
# placed early on sys.path if it is treated as an EGG_DIST, so always
# do that
dist = dist.clone(precedence=pkg_resources.EGG_DIST)
# Otherwise we found a version of astropy-helpers, so we're done
# Just activate the found distribution on sys.path--if we did a
# download this usually happens automatically but it doesn't hurt to
# do it again
# Note: Adding the dist to the global working set also activates it
# (makes it importable on sys.path) by default.
try:
pkg_resources.working_set.add(dist, replace=True)
except TypeError:
# Some (much) older versions of setuptools do not have the
# replace=True option here. These versions are old enough that all
# bets may be off anyways, but it's easy enough to work around just
# in case...
if dist.key in pkg_resources.working_set.by_key:
del pkg_resources.working_set.by_key[dist.key]
pkg_resources.working_set.add(dist)
@property
def config(self):
"""
A `dict` containing the options this `_Bootstrapper` was configured
with.
"""
return dict((optname, getattr(self, optname))
for optname, _ in CFG_OPTIONS if hasattr(self, optname))
def get_local_directory_dist(self):
"""
Handle importing a vendored package from a subdirectory of the source
distribution.
"""
if not os.path.isdir(self.path):
return
log.info('Attempting to import astropy_helpers from {0} {1!r}'.format(
'submodule' if self.is_submodule else 'directory',
self.path))
dist = self._directory_import()
if dist is None:
log.warn(
'The requested path {0!r} for importing {1} does not '
'exist, or does not contain a copy of the {1} '
'package.'.format(self.path, PACKAGE_NAME))
elif self.auto_upgrade and not self.is_submodule:
# A version of astropy-helpers was found on the available path, but
# check to see if a bugfix release is available on PyPI
upgrade = self._do_upgrade(dist)
if upgrade is not None:
dist = upgrade
return dist
def get_local_file_dist(self):
"""
Handle importing from a source archive; this also uses setup_requires
but points easy_install directly to the source archive.
"""
if not os.path.isfile(self.path):
return
log.info('Attempting to unpack and import astropy_helpers from '
'{0!r}'.format(self.path))
try:
dist = self._do_download(find_links=[self.path])
except Exception as e:
if DEBUG:
raise
log.warn(
'Failed to import {0} from the specified archive {1!r}: '
'{2}'.format(PACKAGE_NAME, self.path, str(e)))
dist = None
if dist is not None and self.auto_upgrade:
# A version of astropy-helpers was found on the available path, but
# check to see if a bugfix release is available on PyPI
upgrade = self._do_upgrade(dist)
if upgrade is not None:
dist = upgrade
return dist
def get_index_dist(self):
if not self.download:
log.warn('Downloading {0!r} disabled.'.format(DIST_NAME))
return None
log.warn(
"Downloading {0!r}; run setup.py with the --offline option to "
"force offline installation.".format(DIST_NAME))
try:
dist = self._do_download()
except Exception as e:
if DEBUG:
raise
log.warn(
'Failed to download and/or install {0!r} from {1!r}:\n'
'{2}'.format(DIST_NAME, self.index_url, str(e)))
dist = None
# No need to run auto-upgrade here since we've already presumably
# gotten the most up-to-date version from the package index
return dist
def _directory_import(self):
"""
Import astropy_helpers from the given path, which will be added to
sys.path.
Returns the resulting Distribution object if the import succeeded, and
None otherwise.
"""
path = os.path.abspath(self.path)
# Use an empty WorkingSet rather than the main
# pkg_resources.working_set, since on older versions of setuptools this
# will invoke a VersionConflict when trying to install an upgrade
ws = pkg_resources.WorkingSet([])
ws.add_entry(path)
dist = ws.by_key.get(DIST_NAME)
if dist is None:
# We didn't find an egg-info/dist-info in the given path, but if a
# setup.py exists we can generate it
setup_py = os.path.join(path, 'setup.py')
if os.path.isfile(setup_py):
# We use subprocess instead of run_setup from setuptools to
# avoid segmentation faults - see the following for more details:
# https://github.com/cython/cython/issues/2104
sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path)
for dist in pkg_resources.find_distributions(path, True):
# There should be only one...
return dist
return dist
def _do_download(self, version='', find_links=None):
if find_links:
allow_hosts = ''
index_url = None
else:
allow_hosts = None
index_url = self.index_url
# Annoyingly, setuptools will not handle other arguments to
# Distribution (such as options) before handling setup_requires, so it
# is not straightforward to programmatically augment the arguments which
# are passed to easy_install
class _Distribution(Distribution):
def get_option_dict(self, command_name):
opts = Distribution.get_option_dict(self, command_name)
if command_name == 'easy_install':
if find_links is not None:
opts['find_links'] = ('setup script', find_links)
if index_url is not None:
opts['index_url'] = ('setup script', index_url)
if allow_hosts is not None:
opts['allow_hosts'] = ('setup script', allow_hosts)
return opts
if version:
req = '{0}=={1}'.format(DIST_NAME, version)
else:
if UPPER_VERSION_EXCLUSIVE is None:
req = DIST_NAME
else:
req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE)
attrs = {'setup_requires': [req]}
# NOTE: we need to parse the config file (e.g. setup.cfg) to make sure
# it honours the options set in the [easy_install] section, and we need
# to explicitly fetch the requirement eggs as setup_requires does not
# get honored in recent versions of setuptools:
# https://github.com/pypa/setuptools/issues/1273
try:
context = _verbose if DEBUG else _silence
with context():
dist = _Distribution(attrs=attrs)
try:
dist.parse_config_files(ignore_option_errors=True)
dist.fetch_build_eggs(req)
except TypeError:
# On older versions of setuptools, ignore_option_errors
# doesn't exist, and the above two lines are not needed
# so we can just continue
pass
# If the setup_requires succeeded it will have added the new dist to
# the main working_set
return pkg_resources.working_set.by_key.get(DIST_NAME)
except Exception as e:
if DEBUG:
raise
msg = 'Error retrieving {0} from {1}:\n{2}'
if find_links:
source = find_links[0]
elif index_url != INDEX_URL:
source = index_url
else:
source = 'PyPI'
raise Exception(msg.format(DIST_NAME, source, repr(e)))
def _do_upgrade(self, dist):
# Build up a requirement for a higher bugfix release but a lower minor
# release (so API compatibility is guaranteed)
next_version = _next_version(dist.parsed_version)
req = pkg_resources.Requirement.parse(
'{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version))
package_index = PackageIndex(index_url=self.index_url)
upgrade = package_index.obtain(req)
if upgrade is not None:
return self._do_download(version=upgrade.version)
def _check_submodule(self):
"""
Check if the given path is a git submodule.
See the docstrings for ``_check_submodule_using_git`` and
``_check_submodule_no_git`` for further details.
"""
if (self.path is None or
(os.path.exists(self.path) and not os.path.isdir(self.path))):
return False
if self.use_git:
return self._check_submodule_using_git()
else:
return self._check_submodule_no_git()
def _check_submodule_using_git(self):
"""
Check if the given path is a git submodule. If so, attempt to initialize
and/or update the submodule if needed.
This function makes calls to the ``git`` command in subprocesses. The
``_check_submodule_no_git`` option uses pure Python to check if the given
path looks like a git submodule, but it cannot perform updates.
"""
cmd = ['git', 'submodule', 'status', '--', self.path]
try:
log.info('Running `{0}`; use the --no-git option to disable git '
'commands'.format(' '.join(cmd)))
returncode, stdout, stderr = run_cmd(cmd)
except _CommandNotFound:
# The git command simply wasn't found; this is most likely the
# case on user systems that don't have git and are simply
# trying to install the package from PyPI or a source
# distribution. Silently ignore this case and simply don't try
# to use submodules
return False
stderr = stderr.strip()
if returncode != 0 and stderr:
# Unfortunately the return code alone cannot be relied on, as
# earlier versions of git returned 0 even if the requested submodule
# does not exist
# This is a warning produced by perl (from running git submodule) that
# only occurs with a malformed locale setting, which can happen
# sometimes on OSX. See again
# https://github.com/astropy/astropy/issues/2749
perl_warning = ('perl: warning: Falling back to the standard locale '
'("C").')
if not stderr.strip().endswith(perl_warning):
# Some other unknown error condition occurred
log.warn('git submodule command failed '
'unexpectedly:\n{0}'.format(stderr))
return False
# Output of `git submodule status` is as follows:
#
# 1. Status indicator: '-' if the submodule is uninitialized, '+' if
# submodule is initialized but is not at the commit currently indicated
# in .gitmodules (and thus needs to be updated), or 'U' if the
# submodule is in an unstable state (i.e. has merge conflicts)
#
# 2. SHA-1 hash of the current commit of the submodule (we don't really
# need this information but it's useful for checking that the output is
# correct)
#
# 3. The output of `git describe` for the submodule's current commit
# hash (this includes for example what branches the commit is on) but
# only if the submodule is initialized. We ignore this information for
# now
_git_submodule_status_re = re.compile(
r'^(?P<status>[+-U ])(?P<commit>[0-9a-f]{40}) '
r'(?P<submodule>\S+)( .*)?$')
# The stdout should only contain one line--the status of the
# requested submodule
m = _git_submodule_status_re.match(stdout)
if m:
# Yes, the path *is* a git submodule
self._update_submodule(m.group('submodule'), m.group('status'))
return True
else:
log.warn(
'Unexpected output from `git submodule status`:\n{0}\n'
'Will attempt import from {1!r} regardless.'.format(
stdout, self.path))
return False
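The status-line parsing can be checked in isolation; the snippet below uses an equivalent named-group pattern (reconstructed here, since the method extracts ``status``, ``commit`` and ``submodule`` groups) against a fabricated ``git submodule status`` line:

```python
import re

# Named-group version of the submodule-status pattern used above
status_re = re.compile(
    r'^(?P<status>[+\-U ])(?P<commit>[0-9a-f]{40}) '
    r'(?P<submodule>\S+)( .*)?$')

line = '-' + 'a' * 40 + ' astropy_helpers'  # an uninitialized submodule
m = status_re.match(line)
print(m.group('status'), m.group('submodule'))  # - astropy_helpers
```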
def _check_submodule_no_git(self):
"""
Like ``_check_submodule_using_git``, but simply parses the .gitmodules file
to determine if the supplied path is a git submodule, and does not exec any
subprocesses.
This can only determine if a path is a submodule--it does not perform
updates, etc. This function may need to be updated if the format of the
.gitmodules file is changed between git versions.
"""
gitmodules_path = os.path.abspath('.gitmodules')
if not os.path.isfile(gitmodules_path):
return False
# This is a minimal reader for gitconfig-style files. It handles a few of
# the quirks that make gitconfig files incompatible with ConfigParser-style
# files, but does not support the full gitconfig syntax (just enough
# needed to read a .gitmodules file).
gitmodules_fileobj = io.StringIO()
# Must use io.open for cross-Python-compatible behavior wrt unicode
with io.open(gitmodules_path) as f:
for line in f:
# gitconfig files are more flexible with leading whitespace; just
# go ahead and remove it
line = line.lstrip()
# comments can start with either # or ;
if line and line[0] in ('#', ';'):
continue
gitmodules_fileobj.write(line)
gitmodules_fileobj.seek(0)
cfg = RawConfigParser()
try:
cfg.readfp(gitmodules_fileobj)
except Exception as exc:
log.warn('Malformatted .gitmodules file: {0}\n'
'{1} cannot be assumed to be a git submodule.'.format(
exc, self.path))
return False
for section in cfg.sections():
if not cfg.has_option(section, 'path'):
continue
submodule_path = cfg.get(section, 'path').rstrip(os.sep)
if submodule_path == self.path.rstrip(os.sep):
return True
return False
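The comment/whitespace filtering above can be demonstrated standalone on a small fabricated ``.gitmodules`` body (using ``read_file``, the modern spelling of ``readfp``):

```python
import configparser
import io

gitmodules = (
    '[submodule "astropy_helpers"]\n'
    '\tpath = astropy_helpers\n'
    '\t; a gitconfig-style comment\n'
    '\turl = https://github.com/astropy/astropy-helpers.git\n'
)

# Strip leading whitespace and drop '#'/';' comment lines so that
# RawConfigParser can digest the gitconfig-style file
buf = io.StringIO()
for line in io.StringIO(gitmodules):
    line = line.lstrip()
    if line and line[0] in ('#', ';'):
        continue
    buf.write(line)
buf.seek(0)

cfg = configparser.RawConfigParser()
cfg.read_file(buf)
print(cfg.get('submodule "astropy_helpers"', 'path'))  # astropy_helpers
```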
def _update_submodule(self, submodule, status):
if status == ' ':
# The submodule is up to date; no action necessary
return
elif status == '-':
if self.offline:
raise _AHBootstrapSystemExit(
"Cannot initialize the {0} submodule in --offline mode; "
"this requires being able to clone the submodule from an "
"online repository.".format(submodule))
cmd = ['update', '--init']
action = 'Initializing'
elif status == '+':
cmd = ['update']
action = 'Updating'
if self.offline:
cmd.append('--no-fetch')
elif status == 'U':
raise _AHBootstrapSystemExit(
'Error: Submodule {0} contains unresolved merge conflicts. '
'Please complete or abandon any changes in the submodule so that '
'it is in a usable state, then try again.'.format(submodule))
else:
log.warn('Unknown status {0!r} for git submodule {1!r}. Will '
'attempt to use the submodule as-is, but try to ensure '
'that the submodule is in a clean state and contains no '
'conflicts or errors.\n{2}'.format(status, submodule,
_err_help_msg))
return
err_msg = None
cmd = ['git', 'submodule'] + cmd + ['--', submodule]
log.warn('{0} {1} submodule with: `{2}`'.format(
action, submodule, ' '.join(cmd)))
try:
log.info('Running `{0}`; use the --no-git option to disable git '
'commands'.format(' '.join(cmd)))
returncode, stdout, stderr = run_cmd(cmd)
except OSError as e:
err_msg = str(e)
else:
if returncode != 0:
err_msg = stderr
if err_msg is not None:
log.warn('An unexpected error occurred updating the git submodule '
'{0!r}:\n{1}\n{2}'.format(submodule, err_msg,
_err_help_msg))
class _CommandNotFound(OSError):
"""
An exception raised when a command run with run_cmd is not found on the
system.
"""
def run_cmd(cmd):
"""
Run a command in a subprocess, given as a list of command-line
arguments.
Returns a ``(returncode, stdout, stderr)`` tuple.
"""
try:
p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
# XXX: May block if either stdout or stderr fill their buffers;
# however for the commands this is currently used for that is
# unlikely (they should have very brief output)
stdout, stderr = p.communicate()
except OSError as e:
if DEBUG:
raise
if e.errno == errno.ENOENT:
msg = 'Command not found: `{0}`'.format(' '.join(cmd))
raise _CommandNotFound(msg, cmd)
else:
raise _AHBootstrapSystemExit(
'An unexpected error occurred when running the '
'`{0}` command:\n{1}'.format(' '.join(cmd), str(e)))
    # Can fail if the default locale is not configured properly. See
# https://github.com/astropy/astropy/issues/2749. For the purposes under
# consideration 'latin1' is an acceptable fallback.
try:
stdio_encoding = locale.getdefaultlocale()[1] or 'latin1'
except ValueError:
# Due to an OSX oddity locale.getdefaultlocale() can also crash
# depending on the user's locale/language settings. See:
# http://bugs.python.org/issue18378
stdio_encoding = 'latin1'
# Unlikely to fail at this point but even then let's be flexible
if not isinstance(stdout, str):
stdout = stdout.decode(stdio_encoding, 'replace')
if not isinstance(stderr, str):
stderr = stderr.decode(stdio_encoding, 'replace')
return (p.returncode, stdout, stderr)
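For reference, the capture-and-decode pattern used by ``run_cmd`` can be sketched in isolation (a simplified sketch, not the bootstrap code itself; it hard-codes ``'utf-8'`` with ``'replace'`` instead of probing the locale, and the helper name ``run`` is hypothetical):

```python
import subprocess as sp

def run(cmd):
    """Run a command list; return (returncode, stdout, stderr) as text."""
    p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
    # communicate() drains both pipes to completion, which avoids the
    # deadlock that can occur when either buffer fills up.
    stdout, stderr = p.communicate()
    return (p.returncode,
            stdout.decode('utf-8', 'replace'),
            stderr.decode('utf-8', 'replace'))
```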
def _next_version(version):
"""
Given a parsed version from pkg_resources.parse_version, returns a new
version string with the next minor version.
Examples
    --------
>>> _next_version(pkg_resources.parse_version('1.2.3'))
'1.3.0'
"""
if hasattr(version, 'base_version'):
# New version parsing from setuptools >= 8.0
if version.base_version:
parts = version.base_version.split('.')
else:
parts = []
else:
parts = []
for part in version:
if part.startswith('*'):
break
parts.append(part)
parts = [int(p) for p in parts]
if len(parts) < 3:
parts += [0] * (3 - len(parts))
major, minor, micro = parts[:3]
return '{0}.{1}.{2}'.format(major, minor + 1, 0)
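The padding-and-bump behaviour above can be condensed into a small self-contained sketch (a hypothetical helper operating on a plain version string rather than a ``pkg_resources`` object):

```python
def next_minor_version(version):
    """Bump the minor component and zero the micro, e.g. '1.2.3' -> '1.3.0'."""
    parts = [int(p) for p in version.split('.')]
    parts += [0] * (3 - len(parts))  # pad short versions like '1.2' to 3 parts
    major, minor, _micro = parts[:3]
    return '{0}.{1}.{2}'.format(major, minor + 1, 0)
```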
class _DummyFile(object):
"""A noop writeable object."""
errors = '' # Required for Python 3.x
encoding = 'utf-8'
def write(self, s):
pass
def flush(self):
pass
@contextlib.contextmanager
def _verbose():
yield
@contextlib.contextmanager
def _silence():
"""A context manager that silences sys.stdout and sys.stderr."""
old_stdout = sys.stdout
old_stderr = sys.stderr
sys.stdout = _DummyFile()
sys.stderr = _DummyFile()
exception_occurred = False
try:
yield
except:
exception_occurred = True
# Go ahead and clean up so that exception handling can work normally
sys.stdout = old_stdout
sys.stderr = old_stderr
raise
if not exception_occurred:
sys.stdout = old_stdout
sys.stderr = old_stderr
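The restore-on-exception dance in ``_silence`` can be written more compactly with ``try``/``finally``; a minimal equivalent sketch (hypothetical name, using ``io.StringIO`` instead of the ``_DummyFile`` class):

```python
import contextlib
import io
import sys

@contextlib.contextmanager
def silence():
    """Temporarily swallow sys.stdout/sys.stderr, restoring them on exit."""
    old_stdout, old_stderr = sys.stdout, sys.stderr
    sys.stdout = sys.stderr = io.StringIO()
    try:
        yield
    finally:
        # Runs on normal exit *and* on exception, so the real streams
        # are always restored.
        sys.stdout, sys.stderr = old_stdout, old_stderr
```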
class _AHBootstrapSystemExit(SystemExit):
def __init__(self, *args):
if not args:
msg = 'An unknown problem occurred bootstrapping astropy_helpers.'
else:
msg = args[0]
msg += '\n' + _err_help_msg
super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:])
BOOTSTRAPPER = _Bootstrapper.main()
def use_astropy_helpers(**kwargs):
"""
Ensure that the `astropy_helpers` module is available and is importable.
This supports automatic submodule initialization if astropy_helpers is
included in a project as a git submodule, or will download it from PyPI if
necessary.
Parameters
----------
path : str or None, optional
A filesystem path relative to the root of the project's source code
that should be added to `sys.path` so that `astropy_helpers` can be
imported from that path.
If the path is a git submodule it will automatically be initialized
and/or updated.
The path may also be to a ``.tar.gz`` archive of the astropy_helpers
source distribution. In this case the archive is automatically
unpacked and made temporarily available on `sys.path` as a ``.egg``
archive.
If `None` skip straight to downloading.
download_if_needed : bool, optional
If the provided filesystem path is not found an attempt will be made to
download astropy_helpers from PyPI. It will then be made temporarily
available on `sys.path` as a ``.egg`` archive (using the
``setup_requires`` feature of setuptools). If the ``--offline`` option
is given at the command line the value of this argument is overridden
to `False`.
index_url : str, optional
If provided, use a different URL for the Python package index than the
main PyPI server.
use_git : bool, optional
If `False` no git commands will be used--this effectively disables
support for git submodules. If the ``--no-git`` option is given at the
command line the value of this argument is overridden to `False`.
auto_upgrade : bool, optional
By default, when installing a package from a non-development source
distribution ah_bootstrap will try to automatically check for patch
releases to astropy-helpers on PyPI and use the patched version over
any bundled versions. Setting this to `False` will disable that
functionality. If the ``--offline`` option is given at the command line
the value of this argument is overridden to `False`.
offline : bool, optional
If `True`, disable all actions that require an internet connection,
including downloading packages from the package index and fetching
updates to any git submodule. Defaults to `False`.
"""
global BOOTSTRAPPER
config = BOOTSTRAPPER.config
config.update(**kwargs)
# Create a new bootstrapper with the updated configuration and run it
BOOTSTRAPPER = _Bootstrapper(**config)
BOOTSTRAPPER.run()
astropy-healpix-0.5/astropy_helpers/astropy_helpers/__init__.py:
try:
from .version import version as __version__
from .version import githash as __githash__
except ImportError:
__version__ = ''
__githash__ = ''
# If we've made it as far as importing astropy_helpers, we don't need
# ah_bootstrap in sys.modules anymore. Getting rid of it is actually necessary
# if the package we're installing has a setup_requires of another package that
# uses astropy_helpers (and possibly a different version at that)
# See https://github.com/astropy/astropy/issues/3541
import sys
if 'ah_bootstrap' in sys.modules:
del sys.modules['ah_bootstrap']
# Note, this is repeated from ah_bootstrap.py, but is here too in case this
# astropy-helpers was upgraded to from an older version that did not have this
# check in its ah_bootstrap.
# matplotlib can cause problems if it is imported from within a call of
# run_setup(), because in some circumstances it will try to write to the user's
# home directory, resulting in a SandboxViolation. See
# https://github.com/matplotlib/matplotlib/pull/4165
# Making sure matplotlib, if it is available, is imported early in the setup
# process can mitigate this (note importing matplotlib.pyplot has the same
# issue)
try:
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot
except:
# Ignore if this fails for *any* reason
pass
import os
# Ensure that all module-level code in astropy or other packages know that
# we're in setup mode:
if ('__main__' in sys.modules and
hasattr(sys.modules['__main__'], '__file__')):
filename = os.path.basename(sys.modules['__main__'].__file__)
if filename.rstrip('co') == 'setup.py':
import builtins
builtins._ASTROPY_SETUP_ = True
del filename
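The ``rstrip('co')`` trick above treats ``setup.py``, ``setup.pyc`` and ``setup.pyo`` alike; as an isolated sketch (the helper name is hypothetical):

```python
import os

def is_setup_script(path):
    """True for setup.py and its byte-compiled forms (setup.pyc/setup.pyo)."""
    name = os.path.basename(path)
    # rstrip('co') removes any trailing run of 'c'/'o' characters,
    # collapsing '.pyc' and '.pyo' down to '.py'.
    return name.rstrip('co') == 'setup.py'
```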
astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/__init__.py: (empty)
astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/_dummy.py:
"""
Provides a base class for a 'dummy' setup.py command that has no functionality
(probably due to a missing requirement). This dummy command can raise an
exception when it is run, explaining to the user what dependencies must be met
to use this command.
The reason this is at all tricky is that we want the command to be able to
provide this message even when the user passes arguments to the command. If we
don't know ahead of time what arguments the command can take, this is
difficult, because distutils does not allow unknown arguments to be passed to a
setup.py command. This hacks around that restriction to provide a useful error
message even when a user passes arguments to the dummy implementation of a
command.
Use this like:
try:
from some_dependency import SetupCommand
except ImportError:
from ._dummy import _DummyCommand
class SetupCommand(_DummyCommand):
description = (
    'Implementation of SetupCommand from some_dependency; '
    'some_dependency must be installed to run this command')
# This is the message that will be raised when a user tries to
# run this command--define it as a class attribute.
error_msg = (
    "The 'setup_command' command requires the some_dependency "
    "package to be installed and importable.")
"""
import sys
from setuptools import Command
from distutils.errors import DistutilsArgError
from textwrap import dedent
class _DummyCommandMeta(type):
"""
Causes an exception to be raised on accessing attributes of a command class
so that if ``./setup.py command_name`` is run with additional command-line
options we can provide a useful error message instead of the default that
tells users the options are unrecognized.
"""
def __init__(cls, name, bases, members):
if bases == (Command, object):
# This is the _DummyCommand base class, presumably
return
if not hasattr(cls, 'description'):
raise TypeError(
"_DummyCommand subclass must have a 'description' "
"attribute.")
if not hasattr(cls, 'error_msg'):
raise TypeError(
"_DummyCommand subclass must have an 'error_msg' "
"attribute.")
def __getattribute__(cls, attr):
if attr in ('description', 'error_msg') or attr.startswith('_'):
# Allow cls.description to work so that `./setup.py
# --help-commands` still works
return super(_DummyCommandMeta, cls).__getattribute__(attr)
raise DistutilsArgError(cls.error_msg)
class _DummyCommand(Command, object, metaclass=_DummyCommandMeta):
pass
astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/build_ext.py:
import errno
import os
import shutil
from distutils.core import Extension
from distutils.ccompiler import get_default_compiler
from distutils.command.build_ext import build_ext as DistutilsBuildExt
from ..distutils_helpers import get_main_package_directory
from ..utils import get_numpy_include_path, import_file
__all__ = ['AstropyHelpersBuildExt']
def should_build_with_cython(previous_cython_version, is_release):
"""
Returns the previously used Cython version (or 'unknown' if not
previously built) if Cython should be used to build extension modules from
pyx files.
"""
# Only build with Cython if, of course, Cython is installed, we're in a
# development version (i.e. not release) or the Cython-generated source
# files haven't been created yet (cython_version == 'unknown'). The latter
# case can happen even when release is True if checking out a release tag
# from the repository
have_cython = False
try:
from Cython import __version__ as cython_version # noqa
have_cython = True
except ImportError:
pass
if have_cython and (not is_release or previous_cython_version == 'unknown'):
return cython_version
else:
return False
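The decision encoded above reduces to a boolean rule; as a standalone sketch (a hypothetical function that takes Cython's availability as an argument instead of importing it):

```python
def build_with_cython(cython_version, is_release, previous_cython_version):
    """Return the Cython version to build with, or False to use shipped C files.

    cython_version is None when Cython is not importable.
    """
    have_cython = cython_version is not None
    # Build with Cython in a development version, or when the C files
    # have never been generated (previous version recorded as 'unknown').
    if have_cython and (not is_release or previous_cython_version == 'unknown'):
        return cython_version
    return False
```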
class AstropyHelpersBuildExt(DistutilsBuildExt):
"""
A custom 'build_ext' command that allows for manipulating some of the C
extension options at build time.
"""
_uses_cython = False
_force_rebuild = False
def __new__(cls, value, **kwargs):
# NOTE: we need to wait until AstropyHelpersBuildExt is initialized to
# import setuptools.command.build_ext because when that package is
# imported, setuptools tries to import Cython - and if it's not found
# it will affect the rest of the build process. This is an issue because
# if we import that module at the top of this one, setup_requires won't
# have been honored yet, so Cython may not yet be available - and if we
# import build_ext too soon, it will think Cython is not available even
# if it is then installed when setup_requires is processed. To get around
# this we dynamically create a new class that inherits from the
# setuptools build_ext, and by this point setup_requires has been
# processed.
from setuptools.command.build_ext import build_ext as SetuptoolsBuildExt
class FinalBuildExt(AstropyHelpersBuildExt, SetuptoolsBuildExt):
pass
new_type = type(cls.__name__, (FinalBuildExt,), dict(cls.__dict__))
obj = SetuptoolsBuildExt.__new__(new_type)
obj.__init__(value)
return obj
def finalize_options(self):
# First let's find the package folder, then we can check if the
# version and cython_version are accessible
self.package_dir = get_main_package_directory(self.distribution)
version = import_file(os.path.join(self.package_dir, 'version.py'),
name='version').version
self.is_release = 'dev' not in version
try:
self.previous_cython_version = import_file(os.path.join(self.package_dir,
'cython_version.py'),
name='cython_version').cython_version
except (FileNotFoundError, ImportError):
self.previous_cython_version = 'unknown'
self._uses_cython = should_build_with_cython(self.previous_cython_version, self.is_release)
# Add a copy of the _compiler.so module as well, but only if there
# are in fact C modules to compile (otherwise there's no reason to
# include a record of the compiler used). Note that self.extensions
# may not be set yet, but self.distribution.ext_modules is where any
# extension modules passed to setup() can be found
extensions = self.distribution.ext_modules
if extensions:
build_py = self.get_finalized_command('build_py')
package_dir = build_py.get_package_dir(self.package_dir)
src_path = os.path.relpath(
os.path.join(os.path.dirname(__file__), 'src'))
shutil.copy(os.path.join(src_path, 'compiler.c'),
os.path.join(package_dir, '_compiler.c'))
ext = Extension(self.package_dir + '.compiler_version',
[os.path.join(package_dir, '_compiler.c')])
extensions.insert(0, ext)
super().finalize_options()
# If we are using Cython, then make sure we re-build if the version
# of Cython that is installed is different from the version last
# used to generate the C files.
if self._uses_cython and self._uses_cython != self.previous_cython_version:
self._force_rebuild = True
# Regardless of the value of the '--force' option, force a rebuild
# if the installed Cython version changed since the last build
if self._force_rebuild:
self.force = True
def run(self):
# For extensions that require 'numpy' in their include dirs,
# replace 'numpy' with the actual paths
np_include = None
for extension in self.extensions:
if 'numpy' in extension.include_dirs:
if np_include is None:
np_include = get_numpy_include_path()
idx = extension.include_dirs.index('numpy')
extension.include_dirs.insert(idx, np_include)
extension.include_dirs.remove('numpy')
self._check_cython_sources(extension)
# Note that setuptools automatically uses Cython to discover and
# build extensions if available, so we don't have to explicitly call
# e.g. cythonize.
super().run()
# Update cython_version.py if building with Cython
if self._uses_cython and self._uses_cython != self.previous_cython_version:
build_py = self.get_finalized_command('build_py')
package_dir = build_py.get_package_dir(self.package_dir)
cython_py = os.path.join(package_dir, 'cython_version.py')
with open(cython_py, 'w') as f:
f.write('# Generated file; do not modify\n')
f.write('cython_version = {0!r}\n'.format(self._uses_cython))
if os.path.isdir(self.build_lib):
# The build/lib directory may not exist if the build_py
# command was not previously run, which may sometimes be
# the case
self.copy_file(cython_py,
os.path.join(self.build_lib, cython_py),
preserve_mode=False)
def _check_cython_sources(self, extension):
"""
Where relevant, make sure that the .c files associated with .pyx
modules are present (if building without Cython installed).
"""
# Determine the compiler we'll be using
if self.compiler is None:
compiler = get_default_compiler()
else:
compiler = self.compiler
# Replace .pyx with C-equivalents, unless c files are missing
for jdx, src in enumerate(extension.sources):
base, ext = os.path.splitext(src)
pyxfn = base + '.pyx'
cfn = base + '.c'
cppfn = base + '.cpp'
if not os.path.isfile(pyxfn):
continue
if self._uses_cython:
extension.sources[jdx] = pyxfn
else:
if os.path.isfile(cfn):
extension.sources[jdx] = cfn
elif os.path.isfile(cppfn):
extension.sources[jdx] = cppfn
else:
msg = (
'Could not find C/C++ file {0}.(c/cpp) for Cython '
'file {1} when building extension {2}. Cython '
'must be installed to build from a git '
'checkout.'.format(base, pyxfn, extension.name))
raise IOError(errno.ENOENT, msg, cfn)
# Cython (at least as of 0.29.2) uses deprecated Numpy API features
# the use of which produces a few warnings when compiling.
# These additional flags should squelch those warnings.
# TODO: Feel free to remove this if/when a Cython update
# removes use of the deprecated Numpy API
if compiler == 'unix':
extension.extra_compile_args.extend([
'-Wp,-w', '-Wno-unused-function'])
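The source-substitution logic in ``_check_cython_sources`` can be isolated as follows (a hedged sketch with a hypothetical name; the error handling is simplified to a bare ``IOError``):

```python
import os

def resolve_source(src, uses_cython):
    """Map a .pyx source to itself or to a pregenerated .c/.cpp file."""
    base, _ext = os.path.splitext(src)
    if not os.path.isfile(base + '.pyx'):
        return src  # not a Cython module; leave untouched
    if uses_cython:
        return base + '.pyx'
    # No Cython available: fall back to a previously generated C/C++ file.
    for c_ext in ('.c', '.cpp'):
        if os.path.isfile(base + c_ext):
            return base + c_ext
    raise IOError('No C/C++ file found for {0}.pyx and Cython is '
                  'unavailable'.format(base))
```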
astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/build_sphinx.py:
import os
import pkgutil
import re
import shutil
import subprocess
import sys
from distutils.version import LooseVersion
from distutils import log
from sphinx import __version__ as sphinx_version
from sphinx.setup_command import BuildDoc as SphinxBuildDoc
SPHINX_LT_16 = LooseVersion(sphinx_version) < LooseVersion('1.6')
SPHINX_LT_17 = LooseVersion(sphinx_version) < LooseVersion('1.7')
SUBPROCESS_TEMPLATE = """
import os
import sys
{build_main}
os.chdir({srcdir!r})
{sys_path_inserts}
for builder in {builders!r}:
retcode = build_main(argv={argv!r} + ['-b', builder, '.', os.path.join({output_dir!r}, builder)])
if retcode != 0:
sys.exit(retcode)
"""
def ensure_sphinx_astropy_installed():
"""
Make sure that sphinx-astropy is available.
This returns the available version of sphinx-astropy as well as any
paths that should be added to sys.path for sphinx-astropy to be available.
"""
# We've split out the Sphinx part of astropy-helpers into sphinx-astropy
# but we want it to be auto-installed seamlessly for anyone using
# build_docs. We check if it's already installed, and if not, we install
# it to a local .eggs directory and add the eggs to the path (these
# have to each be added to the path, we can't add them by simply adding
# .eggs to the path)
sys_path_inserts = []
sphinx_astropy_version = None
try:
from sphinx_astropy import __version__ as sphinx_astropy_version # noqa
except ImportError:
raise ImportError("sphinx-astropy needs to be installed to build "
"the documentation.")
return sphinx_astropy_version, sys_path_inserts
class AstropyBuildDocs(SphinxBuildDoc):
"""
A version of the ``build_docs`` command that uses the version of Astropy
that is built by the setup ``build`` command, rather than whatever is
installed on the system. To build docs against the installed version, run
``make html`` in the ``astropy/docs`` directory.
"""
description = 'Build Sphinx documentation for Astropy environment'
user_options = SphinxBuildDoc.user_options[:]
user_options.append(
('warnings-returncode', 'w',
'Parses the sphinx output and sets the return code to 1 if there '
'are any warnings. Note that this will cause the sphinx log to '
'only update when it completes, rather than continuously as is '
'normally the case.'))
user_options.append(
('clean-docs', 'l',
'Completely clean previous builds, including '
'automodapi-generated files before building new ones'))
user_options.append(
('no-intersphinx', 'n',
'Skip intersphinx, even if conf.py says to use it'))
user_options.append(
('open-docs-in-browser', 'o',
'Open the docs in a browser (using the webbrowser module) if the '
'build finishes successfully.'))
boolean_options = SphinxBuildDoc.boolean_options[:]
boolean_options.append('warnings-returncode')
boolean_options.append('clean-docs')
boolean_options.append('no-intersphinx')
boolean_options.append('open-docs-in-browser')
_self_iden_rex = re.compile(r"self\.([^\d\W][\w]+)", re.UNICODE)
def initialize_options(self):
SphinxBuildDoc.initialize_options(self)
self.clean_docs = False
self.no_intersphinx = False
self.open_docs_in_browser = False
self.warnings_returncode = False
self.traceback = False
def finalize_options(self):
# This has to happen before we call the parent class's finalize_options
if self.build_dir is None:
self.build_dir = 'docs/_build'
SphinxBuildDoc.finalize_options(self)
# Clear out previous sphinx builds, if requested
if self.clean_docs:
dirstorm = [os.path.join(self.source_dir, 'api'),
os.path.join(self.source_dir, 'generated')]
dirstorm.append(self.build_dir)
for d in dirstorm:
if os.path.isdir(d):
log.info('Cleaning directory ' + d)
shutil.rmtree(d)
else:
log.info('Not cleaning directory ' + d + ' because '
'not present or not a directory')
def run(self):
# TODO: Break this method up into a few more subroutines and
# document them better
import webbrowser
from urllib.request import pathname2url
# This is used at the very end of `run` to decide if sys.exit should
# be called. If it's None, it won't be.
retcode = None
# Now make sure Astropy is built and determine where it was built
build_cmd = self.reinitialize_command('build')
build_cmd.inplace = 0
self.run_command('build')
build_cmd = self.get_finalized_command('build')
build_cmd_path = os.path.abspath(build_cmd.build_lib)
ah_importer = pkgutil.get_importer('astropy_helpers')
if ah_importer is None:
ah_path = '.'
else:
ah_path = os.path.abspath(ah_importer.path)
if SPHINX_LT_17:
build_main = 'from sphinx import build_main'
else:
build_main = 'from sphinx.cmd.build import build_main'
# We need to make sure sphinx-astropy is installed
sphinx_astropy_version, extra_paths = ensure_sphinx_astropy_installed()
sys_path_inserts = [build_cmd_path, ah_path] + extra_paths
sys_path_inserts = os.linesep.join(['sys.path.insert(0, {0!r})'.format(path) for path in sys_path_inserts])
argv = []
if self.warnings_returncode:
argv.append('-W')
if self.no_intersphinx:
# Note, if sphinx_astropy_version is None, this could indicate an
# old version of setuptools, but sphinx-astropy is likely ok, so
# we can proceed.
if sphinx_astropy_version is None or LooseVersion(sphinx_astropy_version) >= LooseVersion('1.1'):
argv.extend(['-D', 'disable_intersphinx=1'])
else:
log.warn('The -n option to disable intersphinx requires '
'sphinx-astropy>=1.1. Ignoring.')
# We now need to adjust the flags based on the parent class's options
if self.fresh_env:
argv.append('-E')
if self.all_files:
argv.append('-a')
if getattr(self, 'pdb', False):
argv.append('-P')
if getattr(self, 'nitpicky', False):
argv.append('-n')
if self.traceback:
argv.append('-T')
# The default verbosity level is 1, so in that case we just don't add a flag
if self.verbose == 0:
argv.append('-q')
elif self.verbose > 1:
argv.append('-v')
if SPHINX_LT_17:
argv.insert(0, 'sphinx-build')
if isinstance(self.builder, str):
builders = [self.builder]
else:
builders = self.builder
subproccode = SUBPROCESS_TEMPLATE.format(build_main=build_main,
srcdir=self.source_dir,
sys_path_inserts=sys_path_inserts,
builders=builders,
argv=argv,
output_dir=os.path.abspath(self.build_dir))
log.debug('Starting subprocess of {0} with python code:\n{1}\n'
'[CODE END]'.format(sys.executable, subproccode))
proc = subprocess.Popen([sys.executable], stdin=subprocess.PIPE)
proc.communicate(subproccode.encode('utf-8'))
if proc.returncode != 0:
retcode = proc.returncode
if retcode is None:
if self.open_docs_in_browser:
if self.builder == 'html':
absdir = os.path.abspath(self.builder_target_dir)
index_path = os.path.join(absdir, 'index.html')
fileurl = 'file://' + pathname2url(index_path)
webbrowser.open(fileurl)
else:
log.warn('open-docs-in-browser option was given, but '
'the builder is not html! Ignoring.')
# Here we explicitly check proc.returncode since we only want to output
# this for cases where the return code really wasn't 0.
if proc.returncode:
log.warn('Sphinx Documentation subprocess failed with return '
'code ' + str(proc.returncode))
if retcode is not None:
# this is potentially dangerous in that there might be something
# after the call to `setup` in `setup.py`, and exiting here will
# prevent that from running. But there's no other apparent way
# to signal what the return code should be.
sys.exit(retcode)
class AstropyBuildSphinx(AstropyBuildDocs): # pragma: no cover
def run(self):
AstropyBuildDocs.run(self)
astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/src/compiler.c:
#include <Python.h>
/***************************************************************************
* Macros for determining the compiler version.
*
* These are borrowed from boost, and majorly abridged to include only
* the compilers we care about.
***************************************************************************/
#define STRINGIZE(X) DO_STRINGIZE(X)
#define DO_STRINGIZE(X) #X
#if defined __clang__
/* Clang C++ emulates GCC, so it has to appear early. */
# define COMPILER "Clang version " __clang_version__
#elif defined(__INTEL_COMPILER) || defined(__ICL) || defined(__ICC) || defined(__ECC)
/* Intel */
# if defined(__INTEL_COMPILER)
# define INTEL_VERSION __INTEL_COMPILER
# elif defined(__ICL)
# define INTEL_VERSION __ICL
# elif defined(__ICC)
# define INTEL_VERSION __ICC
# elif defined(__ECC)
# define INTEL_VERSION __ECC
# endif
# define COMPILER "Intel C compiler version " STRINGIZE(INTEL_VERSION)
#elif defined(__GNUC__)
/* gcc */
# define COMPILER "GCC version " __VERSION__
#elif defined(__SUNPRO_CC)
/* Sun Workshop Compiler */
# define COMPILER "Sun compiler version " STRINGIZE(__SUNPRO_CC)
#elif defined(_MSC_VER)
/* Microsoft Visual C/C++
Must be last since other compilers define _MSC_VER for compatibility as well */
# if _MSC_VER < 1200
# define COMPILER_VERSION 5.0
# elif _MSC_VER < 1300
# define COMPILER_VERSION 6.0
# elif _MSC_VER == 1300
# define COMPILER_VERSION 7.0
# elif _MSC_VER == 1310
# define COMPILER_VERSION 7.1
# elif _MSC_VER == 1400
# define COMPILER_VERSION 8.0
# elif _MSC_VER == 1500
# define COMPILER_VERSION 9.0
# elif _MSC_VER == 1600
# define COMPILER_VERSION 10.0
# else
# define COMPILER_VERSION _MSC_VER
# endif
# define COMPILER "Microsoft Visual C++ version " STRINGIZE(COMPILER_VERSION)
#else
/* Fallback */
# define COMPILER "Unknown compiler"
#endif
/***************************************************************************
* Module-level
***************************************************************************/
struct module_state {
/* The Sun compiler can't handle empty structs */
#if defined(__SUNPRO_C) || defined(_MSC_VER)
int _dummy;
#endif
};
static struct PyModuleDef moduledef = {
PyModuleDef_HEAD_INIT,
"compiler_version",
NULL,
sizeof(struct module_state),
NULL,
NULL,
NULL,
NULL,
NULL
};
#define INITERROR return NULL
PyMODINIT_FUNC
PyInit_compiler_version(void)
{
PyObject* m;
m = PyModule_Create(&moduledef);
if (m == NULL)
INITERROR;
PyModule_AddStringConstant(m, "compiler", COMPILER);
return m;
}
astropy-healpix-0.5/astropy_helpers/astropy_helpers/commands/test.py:
"""
Different implementations of the ``./setup.py test`` command depending on
what's locally available.
If Astropy v1.1 or later is available it should be possible to import
AstropyTest from ``astropy.tests.command``. Otherwise there is a skeleton
implementation that allows users to at least discover the ``./setup.py test``
command and learn that they need Astropy to run it.
"""
import os
from ..utils import import_file
# Previously these except statements caught only ImportErrors, but there are
# some other obscure exceptional conditions that can occur when importing
# astropy.tests (at least on older versions) that can cause these imports to
# fail
try:
# If we are testing astropy itself, we need to use import_file to avoid
# actually importing astropy (just the file we need).
command_file = os.path.join('astropy', 'tests', 'command.py')
if os.path.exists(command_file):
AstropyTest = import_file(command_file, 'astropy_tests_command').AstropyTest
else:
import astropy # noqa
from astropy.tests.command import AstropyTest
except Exception:
# No astropy at all--provide the dummy implementation
from ._dummy import _DummyCommand
class AstropyTest(_DummyCommand):
command_name = 'test'
description = 'Run the tests for this package'
error_msg = (
"The 'test' command requires the astropy package to be "
"installed and importable.")
astropy-healpix-0.5/astropy_helpers/astropy_helpers/conftest.py:
# This file contains settings for pytest that are specific to astropy-helpers.
# Since we run many of the tests in sub-processes, we need to collect coverage
# data inside each subprocess and then combine it into a single .coverage file.
# To do this we set up a list which run_setup appends coverage objects to.
# This is not intended to be used by packages other than astropy-helpers.
import os
import glob
try:
from coverage import CoverageData
except ImportError:
HAS_COVERAGE = False
else:
HAS_COVERAGE = True
if HAS_COVERAGE:
SUBPROCESS_COVERAGE = []
def pytest_configure(config):
if HAS_COVERAGE:
SUBPROCESS_COVERAGE.clear()
def pytest_unconfigure(config):
if HAS_COVERAGE:
# We create an empty coverage data object
combined_cdata = CoverageData()
# Add all files from astropy_helpers to make sure we compute the total
# coverage, not just the coverage of the files that have non-zero
# coverage.
lines = {}
for filename in glob.glob(os.path.join('astropy_helpers', '**', '*.py'), recursive=True):
lines[os.path.abspath(filename)] = []
for cdata in SUBPROCESS_COVERAGE:
# For each CoverageData object, we go through all the files and
# change the filename from one which might be a temporary path
# to the local filename. We then only keep files that actually
# exist.
for filename in cdata.measured_files():
try:
pos = filename.rindex('astropy_helpers')
except ValueError:
continue
short_filename = filename[pos:]
if os.path.exists(short_filename):
lines[os.path.abspath(short_filename)].extend(cdata.lines(filename))
combined_cdata.add_lines(lines)
combined_cdata.write_file('.coverage.subprocess')
astropy-healpix-0.5/astropy_helpers/astropy_helpers/distutils_helpers.py:
"""
This module contains various utilities for introspecting the distutils
module and the setup process.
Some of these utilities require the
`astropy_helpers.setup_helpers.register_commands` function to be called first,
as it will affect introspection of setuptools command-line arguments. Other
utilities in this module do not have that restriction.
"""
import os
import sys
from distutils import ccompiler, log
from distutils.dist import Distribution
from distutils.errors import DistutilsError
from .utils import silence
# This function, and any functions that call it, require the setup in
# `astropy_helpers.setup_helpers.register_commands` to be run first.
def get_dummy_distribution():
"""
Returns a distutils Distribution object used to instrument the setup
environment before calling the actual setup() function.
"""
from .setup_helpers import _module_state
if _module_state['registered_commands'] is None:
raise RuntimeError(
'astropy_helpers.setup_helpers.register_commands() must be '
'called before using '
'astropy_helpers.setup_helpers.get_dummy_distribution()')
# Pre-parse the Distutils command-line options and config files to
# check if the option is set.
dist = Distribution({'script_name': os.path.basename(sys.argv[0]),
'script_args': sys.argv[1:]})
dist.cmdclass.update(_module_state['registered_commands'])
with silence():
try:
dist.parse_config_files()
dist.parse_command_line()
except (DistutilsError, AttributeError, SystemExit):
# Let distutils handle DistutilsErrors itself. AttributeErrors can
# get raised for ./setup.py --help. SystemExit can be raised if a
# display option was used, for example.
pass
return dist
def get_main_package_directory(distribution):
"""
Given a Distribution object, return the main package directory.
"""
return min(distribution.packages, key=len).replace('.', os.sep)
def get_distutils_option(option, commands):
""" Returns the value of the given distutils option.
Parameters
----------
option : str
The name of the option
commands : list of str
The list of commands on which this option is available
Returns
-------
val : str or None
the value of the given distutils option. If the option is not set,
returns None.
"""
dist = get_dummy_distribution()
for cmd in commands:
cmd_opts = dist.command_options.get(cmd)
if cmd_opts is not None and option in cmd_opts:
return cmd_opts[option][1]
else:
return None
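To illustrate the lookup, here is a minimal sketch with a stand-in for ``dist.command_options`` (the sample table is hypothetical): distutils maps each command to a dict of option name to ``(source, value)`` pairs, which is why the code above returns ``cmd_opts[option][1]``.

```python
# Stand-in for dist.command_options: a dict of command -> {option:
# (source, value)}. The second tuple element is the option's value;
# the first records where it was set. Sample data is hypothetical.
command_options = {
    'build_ext': {'compiler': ('command line', 'unix')},
}

def lookup_option(option, commands, table):
    for cmd in commands:
        cmd_opts = table.get(cmd)
        if cmd_opts is not None and option in cmd_opts:
            return cmd_opts[option][1]   # take the value, not the source
    return None
```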
def get_distutils_build_option(option):
""" Returns the value of the given distutils build option.
Parameters
----------
option : str
The name of the option
Returns
-------
val : str or None
The value of the given distutils build option. If the option
is not set, returns None.
"""
return get_distutils_option(option, ['build', 'build_ext', 'build_clib'])
def get_distutils_install_option(option):
""" Returns the value of the given distutils install option.
Parameters
----------
option : str
The name of the option
Returns
-------
val : str or None
The value of the given distutils build option. If the option
is not set, returns None.
"""
return get_distutils_option(option, ['install'])
def get_distutils_build_or_install_option(option):
""" Returns the value of the given distutils build or install option.
Parameters
----------
option : str
The name of the option
Returns
-------
val : str or None
The value of the given distutils build or install option. If the
option is not set, returns None.
"""
return get_distutils_option(option, ['build', 'build_ext', 'build_clib',
'install'])
def get_compiler_option():
""" Determines the compiler that will be used to build extension modules.
Returns
-------
compiler : str
The compiler option specified for the build, build_ext, or build_clib
command; or the default compiler for the platform if none was
specified.
"""
compiler = get_distutils_build_option('compiler')
if compiler is None:
return ccompiler.get_default_compiler()
return compiler
def add_command_option(command, name, doc, is_bool=False):
"""
Add a custom option to a setup command.
Issues a warning if the option already exists on that command.
Parameters
----------
command : str
The name of the command as given on the command line
name : str
The name of the build option
doc : str
A short description of the option, for the `--help` message
is_bool : bool, optional
When `True`, the option is a boolean option and doesn't
require an associated value.
"""
dist = get_dummy_distribution()
cmdcls = dist.get_command_class(command)
if (hasattr(cmdcls, '_astropy_helpers_options') and
name in cmdcls._astropy_helpers_options):
return
attr = name.replace('-', '_')
if hasattr(cmdcls, attr):
raise RuntimeError(
'{0!r} already has a {1!r} class attribute, barring {2!r} from '
'being usable as a custom option name.'.format(cmdcls, attr, name))
for idx, cmd in enumerate(cmdcls.user_options):
if cmd[0] == name:
log.warn('Overriding existing {0!r} option '
'{1!r}'.format(command, name))
del cmdcls.user_options[idx]
if name in cmdcls.boolean_options:
cmdcls.boolean_options.remove(name)
break
cmdcls.user_options.append((name, None, doc))
if is_bool:
cmdcls.boolean_options.append(name)
# Distutils' command parsing requires that a command object have an
# attribute with the same name as the option (with '-' replaced with '_')
# in order for that option to be recognized as valid
setattr(cmdcls, attr, None)
# This caches the options added through add_command_option so that if it is
# run multiple times in the same interpreter repeated adds are ignored
# (this way we can still raise a RuntimeError if a custom option overrides
# a built-in option)
if not hasattr(cmdcls, '_astropy_helpers_options'):
cmdcls._astropy_helpers_options = set([name])
else:
cmdcls._astropy_helpers_options.add(name)
def get_distutils_display_options():
""" Returns a set of all the distutils display options in their long and
short forms. These are the setup.py arguments such as --name or --version
which print the project's metadata and then exit.
Returns
-------
opts : set
The long and short form display option arguments, including the - or --
"""
short_display_opts = set('-' + o[1] for o in Distribution.display_options
if o[1])
long_display_opts = set('--' + o[0] for o in Distribution.display_options)
# Include -h and --help which are not explicitly listed in
# Distribution.display_options (as they are handled by optparse)
short_display_opts.add('-h')
long_display_opts.add('--help')
# This isn't the greatest approach to hardcode these commands.
# However, there doesn't seem to be a good way to determine
# whether build *will be* run as part of the command at this
# phase.
display_commands = set([
'clean', 'register', 'setopt', 'saveopts', 'egg_info',
'alias'])
return short_display_opts.union(long_display_opts.union(display_commands))
def is_distutils_display_option():
""" Returns True if sys.argv contains any of the distutils display options
such as --version or --name.
"""
display_options = get_distutils_display_options()
return bool(set(sys.argv[1:]).intersection(display_options))
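The check reduces to a set intersection against ``sys.argv[1:]``; a sketch with a fabricated option set and argv:

```python
# Display-option detection as in is_distutils_display_option(): any
# overlap between argv and the display options means setup.py will only
# print metadata and exit. Option set and argv here are fabricated.
display_options = {'-h', '--help', '--version', '--name', 'clean'}
argv = ['setup.py', '--version']
is_display = bool(set(argv[1:]).intersection(display_options))
```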
astropy-healpix-0.5/astropy_helpers/astropy_helpers/git_helpers.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
Utilities for retrieving revision information from a project's git repository.
"""
# Do not remove the following comment; it is used by
# astropy_helpers.version_helpers to determine the beginning of the code in
# this module
# BEGIN
import locale
import os
import subprocess
import warnings
__all__ = ['get_git_devstr']
def _decode_stdio(stream):
try:
stdio_encoding = locale.getdefaultlocale()[1] or 'utf-8'
except ValueError:
stdio_encoding = 'utf-8'
try:
text = stream.decode(stdio_encoding)
except UnicodeDecodeError:
# Final fallback
text = stream.decode('latin1')
return text
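The fallback chain in ``_decode_stdio`` can be seen with bytes that are valid UTF-8 but not ASCII; latin1 is a safe last resort because every byte value maps to some character, so it can never raise ``UnicodeDecodeError``:

```python
# latin1 as a final fallback: decoding b'h\xc3\xa9llo' (UTF-8 for
# 'héllo') with a narrower codec fails, but latin1 always succeeds,
# producing mojibake rather than an exception.
data = 'héllo'.encode('utf-8')
try:
    text = data.decode('ascii')
except UnicodeDecodeError:
    text = data.decode('latin1')
```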
def update_git_devstr(version, path=None):
"""
Updates the git revision string if and only if the path is being imported
directly from a git working copy. This ensures that the revision number in
the version string is accurate.
"""
try:
# Quick way to determine if we're in git or not - returns '' if not
devstr = get_git_devstr(sha=True, show_warning=False, path=path)
except OSError:
return version
if not devstr:
# Probably not in git so just pass silently
return version
if 'dev' in version: # update to the current git revision
version_base = version.split('.dev', 1)[0]
devstr = get_git_devstr(sha=False, show_warning=False, path=path)
return version_base + '.dev' + devstr
else:
# otherwise it's already the true/release version
return version
def get_git_devstr(sha=False, show_warning=True, path=None):
"""
Determines the number of revisions in this repository.
Parameters
----------
sha : bool
If True, the full SHA1 hash will be returned. Otherwise, the total
count of commits in the repository will be used as a "revision
number".
show_warning : bool
If True, issue a warning if git returns an error code, otherwise errors
pass silently.
path : str or None
If a string, specifies the directory to look in to find the git
repository. If `None`, the current working directory is used, and must
be the root of the git repository.
If given a filename it uses the directory containing that file.
Returns
-------
devversion : str
Either a string with the revision number (if `sha` is False), the
SHA1 hash of the current commit (if `sha` is True), or an empty string
if git version info could not be identified.
"""
if path is None:
path = os.getcwd()
if not os.path.isdir(path):
path = os.path.abspath(os.path.dirname(path))
if sha:
# Faster for getting just the hash of HEAD
cmd = ['rev-parse', 'HEAD']
else:
cmd = ['rev-list', '--count', 'HEAD']
def run_git(cmd):
try:
p = subprocess.Popen(['git'] + cmd, cwd=path,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
stdin=subprocess.PIPE)
stdout, stderr = p.communicate()
except OSError as e:
if show_warning:
warnings.warn('Error running git: ' + str(e))
return (None, b'', b'')
if p.returncode == 128:
if show_warning:
warnings.warn('No git repository present at {0!r}! Using '
'default dev version.'.format(path))
return (p.returncode, b'', b'')
if p.returncode == 129:
if show_warning:
warnings.warn('Your git looks old (does it support {0}?); '
'consider upgrading to v1.7.2 or '
'later.'.format(cmd[0]))
return (p.returncode, stdout, stderr)
elif p.returncode != 0:
if show_warning:
warnings.warn('Git failed while determining revision '
'count: {0}'.format(_decode_stdio(stderr)))
return (p.returncode, stdout, stderr)
return p.returncode, stdout, stderr
returncode, stdout, stderr = run_git(cmd)
if not sha and returncode == 128:
# git returns 128 if the command is not run from within a git
# repository tree. In this case, a warning is produced above but we
# return the default dev version of '0'.
return '0'
elif not sha and returncode == 129:
# git returns 129 if a command option failed to parse; in
# particular this could happen in git versions older than 1.7.2
# where the --count option is not supported
# Also use --abbrev-commit and --abbrev=0 to display the minimum
# number of characters needed per-commit (rather than the full hash)
cmd = ['rev-list', '--abbrev-commit', '--abbrev=0', 'HEAD']
returncode, stdout, stderr = run_git(cmd)
# Fall back on the old method of getting all revisions and counting
# the lines
if returncode == 0:
return str(stdout.count(b'\n'))
else:
return ''
elif sha:
return _decode_stdio(stdout)[:40]
else:
return _decode_stdio(stdout).strip()
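The old-git fallback above counts commits by counting lines of ``rev-list`` output; reduced to its core (the stdout bytes below are fabricated):

```python
# Each commit hash is printed on its own line, so the revision count is
# the number of newlines in the raw output. These bytes are made up.
fake_stdout = b'a1b2c3d\ne4f5a6b\n0badc0d\n'
devstr = str(fake_stdout.count(b'\n'))
```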
# This function is tested but it is only ever executed within a subprocess when
# creating a fake package, so it doesn't get picked up by coverage metrics.
def _get_repo_path(pathname, levels=None): # pragma: no cover
"""
Given a file or directory name, determine the root of the git repository
this path is under. If given, this won't look any higher than ``levels``
(that is, if ``levels=0`` then the given path must be the root of the git
repository and is returned if so).
Returns `None` if the given path could not be determined to belong to a git
repo.
"""
if os.path.isfile(pathname):
current_dir = os.path.abspath(os.path.dirname(pathname))
elif os.path.isdir(pathname):
current_dir = os.path.abspath(pathname)
else:
return None
current_level = 0
while levels is None or current_level <= levels:
if os.path.exists(os.path.join(current_dir, '.git')):
return current_dir
current_level += 1
if current_dir == os.path.dirname(current_dir):
break
current_dir = os.path.dirname(current_dir)
return None
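The walk-up loop terminates because ``os.path.dirname`` of the filesystem root is the root itself; a standalone sketch of the traversal (the generator name is mine):

```python
import os

def walk_up(start):
    """Yield *start* and each parent directory, stopping at the root,
    where os.path.dirname(d) == d and the loop would otherwise spin."""
    current = os.path.abspath(start)
    while True:
        yield current
        parent = os.path.dirname(current)
        if parent == current:     # reached the filesystem root
            return
        current = parent
```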
astropy-healpix-0.5/astropy_helpers/astropy_helpers/openmp_helpers.py

# This module defines functions that can be used to check whether OpenMP is
# available and if so what flags to use. To use this, import the
# add_openmp_flags_if_available function in a setup_package.py file where you
# are defining your extensions:
#
# from astropy_helpers.openmp_helpers import add_openmp_flags_if_available
#
# then call it with a single extension as the only argument:
#
# add_openmp_flags_if_available(extension)
#
# this will add the OpenMP flags if available.
import os
import sys
import glob
import time
import datetime
import tempfile
import subprocess
from distutils import log
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler, get_config_var
from distutils.errors import CompileError, LinkError
from .distutils_helpers import get_compiler_option
__all__ = ['add_openmp_flags_if_available']
try:
# Check if this has already been instantiated, only set the default once.
_ASTROPY_DISABLE_SETUP_WITH_OPENMP_
except NameError:
import builtins
# It hasn't, so do so.
builtins._ASTROPY_DISABLE_SETUP_WITH_OPENMP_ = False
CCODE = """
#include <stdio.h>
#include <omp.h>
int main(void) {
#pragma omp parallel
printf("nthreads=%d\\n", omp_get_num_threads());
return 0;
}
"""
def _get_flag_value_from_var(flag, var, delim=' '):
"""
Extract flags from an environment variable.
Parameters
----------
flag : str
The flag to extract, for example '-I' or '-L'
var : str
The environment variable to extract the flag from, e.g. CFLAGS or LDFLAGS.
delim : str, optional
The delimiter separating flags inside the environment variable
Examples
--------
Let's assume ``LDFLAGS`` is set to ``'-L/usr/local/include -customflag'``.
This function will then return the following:
>>> _get_flag_value_from_var('-L', 'LDFLAGS')
'/usr/local/include'
Notes
-----
Environment variables are first checked in ``os.environ[var]``, then in
``distutils.sysconfig.get_config_var(var)``.
This function is not supported on Windows.
"""
if sys.platform.startswith('win'):
return None
# Simple input validation
if not var or not flag:
return None
flag_length = len(flag)
if not flag_length:
return None
# Look for var in os.environ, then in get_config_var
if var in os.environ:
flags = os.environ[var]
else:
try:
flags = get_config_var(var)
except KeyError:
return None
# Extract flag from {var:value}
if flags:
for item in flags.split(delim):
if item.startswith(flag):
return item[flag_length:]
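The extraction loop can be mirrored as a standalone helper (the name and the ``LDFLAGS`` value below are mine) to show the split-and-prefix-match behaviour:

```python
# Mirror of the loop above: split the variable's value on the delimiter
# and return the remainder of the first item starting with the flag.
# The sample LDFLAGS value is hypothetical.
def extract_flag(flag, flags, delim=' '):
    for item in flags.split(delim):
        if item.startswith(flag):
            return item[len(flag):]
    return None
```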
def get_openmp_flags():
"""
Utility for returning compiler and linker flags possibly needed for
OpenMP support.
Returns
-------
result : `{'compiler_flags': <flags>, 'linker_flags': <flags>}`
Notes
-----
The flags returned are not tested for validity, use
`check_openmp_support(openmp_flags=get_openmp_flags())` to do so.
"""
compile_flags = []
link_flags = []
if get_compiler_option() == 'msvc':
compile_flags.append('-openmp')
else:
include_path = _get_flag_value_from_var('-I', 'CFLAGS')
if include_path:
compile_flags.append('-I' + include_path)
lib_path = _get_flag_value_from_var('-L', 'LDFLAGS')
if lib_path:
link_flags.append('-L' + lib_path)
link_flags.append('-Wl,-rpath,' + lib_path)
compile_flags.append('-fopenmp')
link_flags.append('-fopenmp')
return {'compiler_flags': compile_flags, 'linker_flags': link_flags}
def check_openmp_support(openmp_flags=None):
"""
Check whether OpenMP test code can be compiled and run.
Parameters
----------
openmp_flags : dict, optional
This should be a dictionary with keys ``compiler_flags`` and
``linker_flags`` giving the compilation and linking flags respectively.
These are passed as `extra_postargs` to `compile()` and
`link_executable()` respectively. If this is not set, the flags will
be automatically determined using environment variables.
Returns
-------
result : bool
`True` if the test passed, `False` otherwise.
"""
ccompiler = new_compiler()
customize_compiler(ccompiler)
if not openmp_flags:
# customize_compiler() extracts info from os.environ. If certain keys
# exist it uses these plus those from sysconfig.get_config_vars().
# If the key is missing in os.environ it is not extracted from
# sysconfig.get_config_var(). E.g. 'LDFLAGS' get left out, preventing
# clang from finding libomp.dylib because -L is not passed to
# linker. Call get_openmp_flags() to get flags missed by
# customize_compiler().
openmp_flags = get_openmp_flags()
compile_flags = openmp_flags.get('compiler_flags')
link_flags = openmp_flags.get('linker_flags')
tmp_dir = tempfile.mkdtemp()
start_dir = os.path.abspath('.')
try:
os.chdir(tmp_dir)
# Write test program
with open('test_openmp.c', 'w') as f:
f.write(CCODE)
os.mkdir('objects')
# Compile, test program
ccompiler.compile(['test_openmp.c'], output_dir='objects',
extra_postargs=compile_flags)
# Link test program
objects = glob.glob(os.path.join('objects', '*' + ccompiler.obj_extension))
ccompiler.link_executable(objects, 'test_openmp',
extra_postargs=link_flags)
# Run test program
output = subprocess.check_output('./test_openmp')
output = output.decode(sys.stdout.encoding or 'utf-8').splitlines()
if 'nthreads=' in output[0]:
nthreads = int(output[0].strip().split('=')[1])
if len(output) == nthreads:
is_openmp_supported = True
else:
log.warn("Unexpected number of lines from output of test OpenMP "
"program (output was {0})".format(output))
is_openmp_supported = False
else:
log.warn("Unexpected output from test OpenMP "
"program (output was {0})".format(output))
is_openmp_supported = False
except (CompileError, LinkError, subprocess.CalledProcessError):
is_openmp_supported = False
finally:
os.chdir(start_dir)
return is_openmp_supported
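The surrounding directory handling is the part most easily got wrong: the working directory must be restored even when compilation raises. The skeleton, with the probe step elided:

```python
import os
import tempfile

# try/finally around os.chdir(), as in check_openmp_support(): whatever
# happens while probing the compiler, we end up back where we started.
start_dir = os.path.abspath('.')
tmp_dir = tempfile.mkdtemp()
try:
    os.chdir(tmp_dir)
    # ... write, compile and run the probe program here ...
finally:
    os.chdir(start_dir)   # always restore, even if compilation raised
```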
def is_openmp_supported():
"""
Determine whether the build compiler has OpenMP support.
"""
log_threshold = log.set_threshold(log.FATAL)
ret = check_openmp_support()
log.set_threshold(log_threshold)
return ret
def add_openmp_flags_if_available(extension):
"""
Add OpenMP compilation flags, if supported (if not, a warning will be
printed to the console and no flags will be added).
Returns `True` if the flags were added, `False` otherwise.
"""
if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_:
log.info("OpenMP support has been explicitly disabled.")
return False
openmp_flags = get_openmp_flags()
using_openmp = check_openmp_support(openmp_flags=openmp_flags)
if using_openmp:
compile_flags = openmp_flags.get('compiler_flags')
link_flags = openmp_flags.get('linker_flags')
log.info("Compiling Cython/C/C++ extension with OpenMP support")
extension.extra_compile_args.extend(compile_flags)
extension.extra_link_args.extend(link_flags)
else:
log.warn("Cannot compile Cython/C/C++ extension with OpenMP, reverting "
"to non-parallel code")
return using_openmp
_IS_OPENMP_ENABLED_SRC = """
# Autogenerated by {packagetitle}'s setup.py on {timestamp!s}
def is_openmp_enabled():
\"\"\"
Determine whether this package was built with OpenMP support.
\"\"\"
return {return_bool}
"""[1:]
def generate_openmp_enabled_py(packagename, srcdir='.', disable_openmp=None):
"""
Generate ``package.openmp_enabled.is_openmp_enabled``, which can then be used
to determine, post build, whether the package was built with or without
OpenMP support.
"""
if packagename.lower() == 'astropy':
packagetitle = 'Astropy'
else:
packagetitle = packagename
epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time()))
timestamp = datetime.datetime.utcfromtimestamp(epoch)
if disable_openmp is not None:
import builtins
builtins._ASTROPY_DISABLE_SETUP_WITH_OPENMP_ = disable_openmp
if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_:
log.info("OpenMP support has been explicitly disabled.")
openmp_support = False if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_ else is_openmp_supported()
src = _IS_OPENMP_ENABLED_SRC.format(packagetitle=packagetitle,
timestamp=timestamp,
return_bool=openmp_support)
package_srcdir = os.path.join(srcdir, *packagename.split('.'))
is_openmp_enabled_py = os.path.join(package_srcdir, 'openmp_enabled.py')
with open(is_openmp_enabled_py, 'w') as f:
f.write(src)
astropy-healpix-0.5/astropy_helpers/astropy_helpers/setup_helpers.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
This module contains a number of utilities for use during
setup/build/packaging that are useful to astropy as a whole.
"""
import collections
import os
import re
import subprocess
import sys
import traceback
import warnings
from configparser import ConfigParser
import builtins
from distutils import log
from distutils.errors import DistutilsOptionError, DistutilsModuleError
from distutils.core import Extension
from distutils.core import Command
from distutils.command.sdist import sdist as DistutilsSdist
from setuptools import setup as setuptools_setup
from setuptools.config import read_configuration
from setuptools import find_packages as _find_packages
from .distutils_helpers import (add_command_option, get_compiler_option,
get_dummy_distribution, get_distutils_build_option,
get_distutils_build_or_install_option)
from .version_helpers import get_pkg_version_module, generate_version_py
from .utils import (walk_skip_hidden, import_file, extends_doc,
resolve_name, AstropyDeprecationWarning)
from .commands.build_ext import AstropyHelpersBuildExt
from .commands.test import AstropyTest
# These imports are not used in this module, but are included for backwards
# compat with older versions of this module
from .utils import get_numpy_include_path, write_if_different # noqa
__all__ = ['register_commands', 'get_package_info']
_module_state = {'registered_commands': None,
'have_sphinx': False,
'package_cache': None,
'exclude_packages': set(),
'excludes_too_late': False}
try:
import sphinx # noqa
_module_state['have_sphinx'] = True
except ValueError as e:
# This can occur deep in the bowels of Sphinx's imports by way of docutils
# and an occurrence of this bug: http://bugs.python.org/issue18378
# In this case sphinx is effectively unusable
if 'unknown locale' in e.args[0]:
log.warn(
"Possible misconfiguration of one of the environment variables "
"LC_ALL, LC_CTYPES, LANG, or LANGUAGE. For an example of how to "
"configure your system's language environment on OSX see "
"http://blog.remibergsma.com/2012/07/10/"
"setting-locales-correctly-on-mac-osx-terminal-application/")
except ImportError:
pass
except SyntaxError:
# occurs if markupsafe is recent version, which doesn't support Python 3.2
pass
def setup(**kwargs):
"""
A wrapper around setuptools' setup() function that automatically sets up
custom commands, generates a version file, and customizes the setup process
via the ``setup_package.py`` files.
"""
# DEPRECATED: store the package name in a built-in variable so it's easy
# to get from other parts of the setup infrastructure. We should phase this
# out in packages that use it - the cookiecutter template should now be
# able to put the right package name where needed.
conf = read_configuration('setup.cfg')
builtins._ASTROPY_PACKAGE_NAME_ = conf['metadata']['name']
# Create a dictionary with setup command overrides. Note that this gets
# information about the package (name and version) from the setup.cfg file.
cmdclass = register_commands()
# Freeze build information in version.py. Note that this gets information
# about the package (name and version) from the setup.cfg file.
version = generate_version_py()
# Get configuration information from all of the various subpackages.
# See the docstring for setup_helpers.update_package_files for more
# details.
package_info = get_package_info()
package_info['cmdclass'] = cmdclass
package_info['version'] = version
# Override using any specified keyword arguments
package_info.update(kwargs)
setuptools_setup(**package_info)
def adjust_compiler(package):
warnings.warn(
'The adjust_compiler function in setup.py is '
'deprecated and can be removed from your setup.py.',
AstropyDeprecationWarning)
def get_debug_option(packagename):
""" Determines if the build is in debug mode.
Returns
-------
debug : bool
True if the current build was started with the debug option, False
otherwise.
"""
try:
current_debug = get_pkg_version_module(packagename,
fromlist=['debug'])[0]
except (ImportError, AttributeError):
current_debug = None
# Only modify the debug flag if one of the build commands was explicitly
# run (i.e. not as a sub-command of something else)
dist = get_dummy_distribution()
if any(cmd in dist.commands for cmd in ['build', 'build_ext']):
debug = bool(get_distutils_build_option('debug'))
else:
debug = bool(current_debug)
if current_debug is not None and current_debug != debug:
build_ext_cmd = dist.get_command_class('build_ext')
build_ext_cmd._force_rebuild = True
return debug
def add_exclude_packages(excludes):
if _module_state['excludes_too_late']:
raise RuntimeError(
"add_package_excludes must be called before all other setup helper "
"functions in order to properly handle excluded packages")
_module_state['exclude_packages'].update(set(excludes))
def register_commands(package=None, version=None, release=None, srcdir='.'):
"""
This function generates a dictionary containing customized commands that
can then be passed to the ``cmdclass`` argument in ``setup()``.
"""
if package is not None:
warnings.warn('The package argument to generate_version_py has '
'been deprecated and will be removed in future. Specify '
'the package name in setup.cfg instead', AstropyDeprecationWarning)
if version is not None:
warnings.warn('The version argument to generate_version_py has '
'been deprecated and will be removed in future. Specify '
'the version number in setup.cfg instead', AstropyDeprecationWarning)
if release is not None:
warnings.warn('The release argument to generate_version_py has '
'been deprecated and will be removed in future. We now '
'use the presence of the "dev" string in the version to '
'determine whether this is a release', AstropyDeprecationWarning)
# We use ConfigParser instead of read_configuration here because the latter
# only reads in keys recognized by setuptools, but we need to access
# package_name below.
conf = ConfigParser()
conf.read('setup.cfg')
if conf.has_option('metadata', 'name'):
package = conf.get('metadata', 'name')
elif conf.has_option('metadata', 'package_name'):
# The package-template used package_name instead of name for a while
warnings.warn('Specifying the package name using the "package_name" '
'option in setup.cfg is deprecated - use the "name" '
'option instead.', AstropyDeprecationWarning)
package = conf.get('metadata', 'package_name')
elif package is not None: # deprecated
pass
else:
sys.stderr.write('ERROR: Could not read package name from setup.cfg\n')
sys.exit(1)
if _module_state['registered_commands'] is not None:
return _module_state['registered_commands']
if _module_state['have_sphinx']:
try:
from .commands.build_sphinx import (AstropyBuildSphinx,
AstropyBuildDocs)
except ImportError:
AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx
else:
AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx
_module_state['registered_commands'] = registered_commands = {
'test': generate_test_command(package),
# Use distutils' sdist because it respects package_data.
# setuptools/distribute's sdist requires duplication of information in
# MANIFEST.in
'sdist': DistutilsSdist,
'build_ext': AstropyHelpersBuildExt,
'build_sphinx': AstropyBuildSphinx,
'build_docs': AstropyBuildDocs
}
# Need to override the __name__ here so that the commandline options are
# presented as being related to the "build" command, for example; normally
# this wouldn't be necessary since commands also have a command_name
# attribute, but there is a bug in distutils' help display code that it
# uses __name__ instead of command_name. Yay distutils!
for name, cls in registered_commands.items():
cls.__name__ = name
# Add a few custom options; more of these can be added by specific packages
# later
for option in [
('use-system-libraries',
"Use system libraries whenever possible", True)]:
add_command_option('build', *option)
add_command_option('install', *option)
add_command_hooks(registered_commands, srcdir=srcdir)
return registered_commands
def add_command_hooks(commands, srcdir='.'):
"""
Look through setup_package.py modules for functions with names like
``pre_<command_name>_hook`` and ``post_<command_name>_hook`` where
``<command_name>`` is the name of a ``setup.py`` command (e.g. build_ext).
If either hook is present this adds a wrapped version of that command to
the passed in ``commands`` `dict`. ``commands`` may be pre-populated with
other custom distutils command classes that should be wrapped if there are
hooks for them (e.g. `AstropyBuildPy`).
"""
hook_re = re.compile(r'^(pre|post)_(.+)_hook$')
# Distutils commands have a method of the same name, but it is not a
# *classmethod* (which probably didn't exist when distutils was first
# written)
def get_command_name(cmdcls):
if hasattr(cmdcls, 'command_name'):
return cmdcls.command_name
else:
return cmdcls.__name__
packages = find_packages(srcdir)
dist = get_dummy_distribution()
hooks = collections.defaultdict(dict)
for setuppkg in iter_setup_packages(srcdir, packages):
for name, obj in vars(setuppkg).items():
match = hook_re.match(name)
if not match:
continue
hook_type = match.group(1)
cmd_name = match.group(2)
if hook_type not in hooks[cmd_name]:
hooks[cmd_name][hook_type] = []
hooks[cmd_name][hook_type].append((setuppkg.__name__, obj))
for cmd_name, cmd_hooks in hooks.items():
commands[cmd_name] = generate_hooked_command(
cmd_name, dist.get_command_class(cmd_name), cmd_hooks)
def generate_hooked_command(cmd_name, cmd_cls, hooks):
"""
Returns a generated subclass of ``cmd_cls`` that runs the pre- and
post-command hooks for that command before and after the ``cmd_cls.run``
method.
"""
def run(self, orig_run=cmd_cls.run):
self.run_command_hooks('pre_hooks')
orig_run(self)
self.run_command_hooks('post_hooks')
return type(cmd_name, (cmd_cls, object),
{'run': run, 'run_command_hooks': run_command_hooks,
'pre_hooks': hooks.get('pre', []),
'post_hooks': hooks.get('post', [])})
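The ``type()`` trick above — capturing the original ``run`` as a default argument so the generated class can call it between the pre and post phases — can be demonstrated with a toy base class (all names here are illustrative):

```python
# Toy demonstration of wrapping run() via type(): the base method is
# bound as a default argument, so the generated subclass can invoke it
# between the pre and post phases. Names are illustrative only.
class BaseCommand:
    def run(self):
        return 'built'

calls = []

def run(self, orig_run=BaseCommand.run):
    calls.append('pre')
    result = orig_run(self)
    calls.append('post')
    return result

HookedCommand = type('build', (BaseCommand, object), {'run': run})
```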
def run_command_hooks(cmd_obj, hook_kind):
"""Run hooks registered for that command and phase.
*cmd_obj* is a finalized command object; *hook_kind* is either
'pre_hooks' or 'post_hooks'.
"""
hooks = getattr(cmd_obj, hook_kind, None)
if not hooks:
return
for modname, hook in hooks:
if isinstance(hook, str):
try:
hook_obj = resolve_name(hook)
except ImportError as exc:
raise DistutilsModuleError(
'cannot find hook {0}: {1}'.format(hook, exc))
else:
hook_obj = hook
if not callable(hook_obj):
raise DistutilsOptionError('hook {0!r} is not callable'.format(hook))
log.info('running {0} from {1} for {2} command'.format(
hook_kind.rstrip('s'), modname, cmd_obj.get_command_name()))
try:
hook_obj(cmd_obj)
except Exception:
log.error('{0} command hook {1} raised an exception:\n'.format(
hook_obj.__name__, cmd_obj.get_command_name()))
log.error(traceback.format_exc())
sys.exit(1)
def generate_test_command(package_name):
"""
Creates a custom 'test' command for the given package which sets the
command's ``package_name`` class attribute to the name of the package being
tested.
"""
return type(package_name.title() + 'Test', (AstropyTest,),
{'package_name': package_name})
def update_package_files(srcdir, extensions, package_data, packagenames,
package_dirs):
"""
This function is deprecated and maintained for backward compatibility
with affiliated packages. Affiliated packages should update their
setup.py to use `get_package_info` instead.
"""
info = get_package_info(srcdir)
extensions.extend(info['ext_modules'])
package_data.update(info['package_data'])
    packagenames[:] = list(set(packagenames + info['packages']))
package_dirs.update(info['package_dir'])
def get_package_info(srcdir='.', exclude=()):
"""
    Collates all of the information for building all subpackages
    and returns a dictionary of keyword arguments that can
    be passed directly to ``setup()``.
The purpose of this function is to allow subpackages to update the
arguments to the package's ``setup()`` function in its setup.py
script, rather than having to specify all extensions/package data
directly in the ``setup.py``. See Astropy's own
``setup.py`` for example usage and the Astropy development docs
for more details.
This function obtains that information by iterating through all
packages in ``srcdir`` and locating a ``setup_package.py`` module.
This module can contain the following functions:
``get_extensions()``, ``get_package_data()``,
``get_build_options()``, and ``get_external_libraries()``.
    Each of those functions takes no arguments.
- ``get_extensions`` returns a list of
`distutils.extension.Extension` objects.
- ``get_package_data()`` returns a dict formatted as required by
the ``package_data`` argument to ``setup()``.
- ``get_build_options()`` returns a list of tuples describing the
extra build options to add.
- ``get_external_libraries()`` returns
a list of libraries that can optionally be built using external
dependencies.
"""
ext_modules = []
packages = []
package_dir = {}
# Read in existing package data, and add to it below
setup_cfg = os.path.join(srcdir, 'setup.cfg')
if os.path.exists(setup_cfg):
conf = read_configuration(setup_cfg)
if 'options' in conf and 'package_data' in conf['options']:
package_data = conf['options']['package_data']
else:
package_data = {}
else:
package_data = {}
if exclude:
warnings.warn(
"Use of the exclude parameter is no longer supported since it does "
"not work as expected. Use add_exclude_packages instead. Note that "
"it must be called prior to any other calls from setup helpers.",
AstropyDeprecationWarning)
# Use the find_packages tool to locate all packages and modules
packages = find_packages(srcdir, exclude=exclude)
# Update package_dir if the package lies in a subdirectory
if srcdir != '.':
package_dir[''] = srcdir
# For each of the setup_package.py modules, extract any
# information that is needed to install them. The build options
# are extracted first, so that their values will be available in
# subsequent calls to `get_extensions`, etc.
for setuppkg in iter_setup_packages(srcdir, packages):
if hasattr(setuppkg, 'get_build_options'):
options = setuppkg.get_build_options()
for option in options:
add_command_option('build', *option)
if hasattr(setuppkg, 'get_external_libraries'):
libraries = setuppkg.get_external_libraries()
for library in libraries:
add_external_library(library)
for setuppkg in iter_setup_packages(srcdir, packages):
# get_extensions must include any Cython extensions by their .pyx
# filename.
if hasattr(setuppkg, 'get_extensions'):
ext_modules.extend(setuppkg.get_extensions())
if hasattr(setuppkg, 'get_package_data'):
package_data.update(setuppkg.get_package_data())
# Locate any .pyx files not already specified, and add their extensions in.
# The default include dirs include numpy to facilitate numerical work.
ext_modules.extend(get_cython_extensions(srcdir, packages, ext_modules,
['numpy']))
    # Now remove extensions that have the special name 'skip_cython', as they
    # exist only to indicate that the Cython extensions shouldn't be built.
for i, ext in reversed(list(enumerate(ext_modules))):
if ext.name == 'skip_cython':
del ext_modules[i]
    # On Microsoft compilers we need to pass the '/MANIFEST' command-line
    # argument. This was implicit with MSVC 9.0 but must be given explicitly
    # with MSVC 10.0 and later; it doesn't seem to hurt to pass it
    # unconditionally.
if get_compiler_option() == 'msvc':
for ext in ext_modules:
ext.extra_link_args.append('/MANIFEST')
return {
'ext_modules': ext_modules,
'packages': packages,
'package_dir': package_dir,
'package_data': package_data,
}
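# For reference, a setup_package.py module consumed by get_package_info()
# might look like the sketch below (the package and file names are invented
# for illustration):

```python
# Hypothetical mypkg/submod/setup_package.py discovered by get_package_info().

def get_extensions():
    # Imported lazily so merely loading this module needs no build tools.
    from distutils.extension import Extension
    return [Extension('mypkg.submod._core', ['mypkg/submod/_core.c'])]

def get_package_data():
    # Maps package names to glob patterns, as required by setup().
    return {'mypkg.submod': ['data/*.dat']}
```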
def iter_setup_packages(srcdir, packages):
""" A generator that finds and imports all of the ``setup_package.py``
modules in the source packages.
Returns
-------
    modgen : generator
        A generator that yields the imported ``setup_package.py`` modules,
        one for each package that provides one.
"""
for packagename in packages:
package_parts = packagename.split('.')
package_path = os.path.join(srcdir, *package_parts)
setup_package = os.path.relpath(
os.path.join(package_path, 'setup_package.py'))
if os.path.isfile(setup_package):
module = import_file(setup_package,
name=packagename + '.setup_package')
yield module
def iter_pyx_files(package_dir, package_name):
"""
A generator that yields Cython source files (ending in '.pyx') in the
source packages.
Returns
-------
pyxgen : generator
A generator that yields (extmod, fullfn) where `extmod` is the
full name of the module that the .pyx file would live in based
on the source directory structure, and `fullfn` is the path to
the .pyx file.
"""
for dirpath, dirnames, filenames in walk_skip_hidden(package_dir):
for fn in filenames:
if fn.endswith('.pyx'):
fullfn = os.path.relpath(os.path.join(dirpath, fn))
# Package must match file name
extmod = '.'.join([package_name, fn[:-4]])
yield (extmod, fullfn)
break # Don't recurse into subdirectories
def get_cython_extensions(srcdir, packages, prevextensions=tuple(),
extincludedirs=None):
"""
Looks for Cython files and generates Extensions if needed.
Parameters
----------
    srcdir : str
        Path to the root of the source directory to search.
    packages : list of str
        The names of the packages to search for Cython files.
prevextensions : list of `~distutils.core.Extension` objects
The extensions that are already defined. Any .pyx files already here
will be ignored.
extincludedirs : list of str or None
Directories to include as the `include_dirs` argument to the generated
`~distutils.core.Extension` objects.
Returns
-------
exts : list of `~distutils.core.Extension` objects
The new extensions that are needed to compile all .pyx files (does not
include any already in `prevextensions`).
"""
    # Vanilla setuptools and old versions of distribute include Cython files
    # as .c files in the sources, not .pyx, so we cannot simply look for
    # existing .pyx sources among the previous sources; we also check for .c
    # files with the same base filename. So we look for .pyx and .c files,
    # and we strip the extension.
prevsourcepaths = []
ext_modules = []
for ext in prevextensions:
for s in ext.sources:
if s.endswith(('.pyx', '.c', '.cpp')):
sourcepath = os.path.realpath(os.path.splitext(s)[0])
prevsourcepaths.append(sourcepath)
for package_name in packages:
package_parts = package_name.split('.')
package_path = os.path.join(srcdir, *package_parts)
for extmod, pyxfn in iter_pyx_files(package_path, package_name):
sourcepath = os.path.realpath(os.path.splitext(pyxfn)[0])
if sourcepath not in prevsourcepaths:
ext_modules.append(Extension(extmod, [pyxfn],
include_dirs=extincludedirs))
return ext_modules
class DistutilsExtensionArgs(collections.defaultdict):
"""
A special dictionary whose default values are the empty list.
This is useful for building up a set of arguments for
`distutils.Extension` without worrying whether the entry is
already present.
"""
def __init__(self, *args, **kwargs):
def default_factory():
return []
super(DistutilsExtensionArgs, self).__init__(
default_factory, *args, **kwargs)
def update(self, other):
for key, val in other.items():
self[key].extend(val)
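# The defaultdict pattern above can be exercised in isolation: every key
# starts as an empty list, and update() extends rather than replaces.
# ExtArgs is an illustrative stand-in for DistutilsExtensionArgs:

```python
import collections

class ExtArgs(collections.defaultdict):
    def __init__(self, *args, **kwargs):
        # Every missing key defaults to an empty list.
        super(ExtArgs, self).__init__(list, *args, **kwargs)

    def update(self, other):
        # Merge by extending, so repeated updates accumulate values.
        for key, val in other.items():
            self[key].extend(val)

args = ExtArgs()
args['libraries'].append('m')
args.update({'libraries': ['z'], 'include_dirs': ['/usr/include']})
```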
def pkg_config(packages, default_libraries, executable='pkg-config'):
"""
Uses pkg-config to update a set of distutils Extension arguments
to include the flags necessary to link against the given packages.
If the pkg-config lookup fails, default_libraries is applied to
libraries.
Parameters
----------
packages : list of str
A list of pkg-config packages to look up.
default_libraries : list of str
A list of library names to use if the pkg-config lookup fails.
Returns
-------
config : dict
A dictionary containing keyword arguments to
`distutils.Extension`. These entries include:
- ``include_dirs``: A list of include directories
- ``library_dirs``: A list of library directories
- ``libraries``: A list of libraries
- ``define_macros``: A list of macro defines
- ``undef_macros``: A list of macros to undefine
- ``extra_compile_args``: A list of extra arguments to pass to
the compiler
"""
flag_map = {'-I': 'include_dirs', '-L': 'library_dirs', '-l': 'libraries',
'-D': 'define_macros', '-U': 'undef_macros'}
    command = "{0} --libs --cflags {1}".format(executable, ' '.join(packages))
    result = DistutilsExtensionArgs()
    try:
        # check_output raises CalledProcessError on a non-zero exit status,
        # so unlike a plain Popen the failure branches below can actually
        # trigger.
        output = subprocess.check_output(command, shell=True).strip()
    except subprocess.CalledProcessError as e:
        lines = [
            "{0} could not look up package(s) {1}. This may cause the "
            "build to fail below.".format(executable, ", ".join(packages)),
            "  command: {0}".format(e.cmd),
            "  returncode: {0}".format(e.returncode)]
        log.warn('\n'.join(lines))
        result['libraries'].extend(default_libraries)
    except OSError as e:
        lines = [
            "{0} failed to run. This may cause the build to fail below."
            .format(executable),
            "  error: {0}".format(e)]
        log.warn('\n'.join(lines))
        result['libraries'].extend(default_libraries)
    else:
        for token in output.split():
            # It's not clear what encoding the output of pkg-config will
            # come to us in. It will probably be some combination of pure
            # ASCII (for the compiler flags) and the filesystem encoding
            # (for any argument that includes directories or filenames),
            # but this is just conjecture, as the pkg-config documentation
            # doesn't seem to address it.
            arg = token[:2].decode('ascii')
            value = token[2:].decode(sys.getfilesystemencoding())
            if arg in flag_map:
                if arg == '-D':
                    value = tuple(value.split('=', 1))
                result[flag_map[arg]].append(value)
            else:
                # Pass unrecognised flags through whole, rather than with
                # their first two characters chopped off.
                result['extra_compile_args'].append(
                    token.decode(sys.getfilesystemencoding()))
return result
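# The token-to-keyword mapping pkg_config performs can be sketched on its
# own, operating on str for clarity (the real code decodes bytes);
# parse_pkg_config_output is an illustrative name, not part of this module:

```python
flag_map = {'-I': 'include_dirs', '-L': 'library_dirs', '-l': 'libraries',
            '-D': 'define_macros', '-U': 'undef_macros'}

def parse_pkg_config_output(output):
    result = {}
    for token in output.split():
        arg, value = token[:2], token[2:]
        if arg in flag_map:
            if arg == '-D':
                # Macro defines become (name, value) tuples.
                value = tuple(value.split('=', 1))
            result.setdefault(flag_map[arg], []).append(value)
        else:
            # Unrecognised flags go to extra_compile_args.
            result.setdefault('extra_compile_args', []).append(token)
    return result

parsed = parse_pkg_config_output('-I/usr/include -lm -DNDEBUG=1 -pthread')
```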
def add_external_library(library):
"""
Add a build option for selecting the internal or system copy of a library.
Parameters
----------
library : str
The name of the library. If the library is `foo`, the build
option will be called `--use-system-foo`.
"""
for command in ['build', 'build_ext', 'install']:
add_command_option(command, str('use-system-' + library),
'Use the system {0} library'.format(library),
is_bool=True)
def use_system_library(library):
"""
Returns `True` if the build configuration indicates that the given
library should use the system copy of the library rather than the
internal one.
For the given library `foo`, this will be `True` if
`--use-system-foo` or `--use-system-libraries` was provided at the
commandline or in `setup.cfg`.
Parameters
----------
library : str
The name of the library
Returns
-------
use_system : bool
`True` if the build should use the system copy of the library.
"""
return (
get_distutils_build_or_install_option('use_system_{0}'.format(library)) or
get_distutils_build_or_install_option('use_system_libraries'))
@extends_doc(_find_packages)
def find_packages(where='.', exclude=(), invalidate_cache=False):
"""
    This version of ``find_packages`` caches previous results to speed up
    subsequent calls. Use ``invalidate_cache=True`` to ignore cached results
    from previous ``find_packages`` calls and repeat the package search.
"""
if exclude:
warnings.warn(
"Use of the exclude parameter is no longer supported since it does "
"not work as expected. Use add_exclude_packages instead. Note that "
"it must be called prior to any other calls from setup helpers.",
AstropyDeprecationWarning)
# Calling add_exclude_packages after this point will have no effect
_module_state['excludes_too_late'] = True
if not invalidate_cache and _module_state['package_cache'] is not None:
return _module_state['package_cache']
packages = _find_packages(
where=where, exclude=list(_module_state['exclude_packages']))
_module_state['package_cache'] = packages
return packages
class FakeBuildSphinx(Command):
"""
A dummy build_sphinx command that is called if Sphinx is not
installed and displays a relevant error message
"""
# user options inherited from sphinx.setup_command.BuildDoc
user_options = [
('fresh-env', 'E', ''),
('all-files', 'a', ''),
('source-dir=', 's', ''),
('build-dir=', None, ''),
('config-dir=', 'c', ''),
('builder=', 'b', ''),
('project=', None, ''),
('version=', None, ''),
('release=', None, ''),
('today=', None, ''),
('link-index', 'i', '')]
# user options appended in astropy.setup_helpers.AstropyBuildSphinx
user_options.append(('warnings-returncode', 'w', ''))
user_options.append(('clean-docs', 'l', ''))
user_options.append(('no-intersphinx', 'n', ''))
user_options.append(('open-docs-in-browser', 'o', ''))
    def initialize_options(self):
        log.error('error: Sphinx and its dependencies must be installed '
                  'for build_docs.')
        sys.exit(1)
# astropy-healpix-0.5/astropy_helpers/astropy_helpers/sphinx/__init__.py (empty)
# astropy-healpix-0.5/astropy_helpers/astropy_helpers/sphinx/conf.py
import warnings
from sphinx_astropy.conf import *
warnings.warn("Note that astropy_helpers.sphinx.conf is deprecated - use sphinx_astropy.conf instead")
# astropy-healpix-0.5/astropy_helpers/astropy_helpers/utils.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import contextlib
import imp
import os
import sys
import glob
from importlib import machinery as import_machinery
# Note: The following Warning subclasses are simply copies of the Warnings in
# Astropy of the same names.
class AstropyWarning(Warning):
"""
The base warning class from which all Astropy warnings should inherit.
Any warning inheriting from this class is handled by the Astropy logger.
"""
class AstropyDeprecationWarning(AstropyWarning):
"""
A warning class to indicate a deprecated feature.
"""
class AstropyPendingDeprecationWarning(PendingDeprecationWarning,
AstropyWarning):
"""
A warning class to indicate a soon-to-be deprecated feature.
"""
def _get_platlib_dir(cmd):
    """
    Given a build command, return the name of the appropriate
    platform-specific build subdirectory (e.g. build/lib.linux-x86_64-3.7).
    """
    plat_specifier = '.{0}-{1}.{2}'.format(cmd.plat_name, *sys.version_info[:2])
    return os.path.join(cmd.build_base, 'lib' + plat_specifier)
def get_numpy_include_path():
"""
Gets the path to the numpy headers.
"""
# We need to go through this nonsense in case setuptools
# downloaded and installed Numpy for us as part of the build or
# install, since Numpy may still think it's in "setup mode", when
# in fact we're ready to use it to build astropy now.
import builtins
if hasattr(builtins, '__NUMPY_SETUP__'):
del builtins.__NUMPY_SETUP__
        import importlib
        import numpy
        importlib.reload(numpy)
try:
numpy_include = numpy.get_include()
except AttributeError:
numpy_include = numpy.get_numpy_include()
return numpy_include
class _DummyFile(object):
"""A noop writeable object."""
errors = ''
def write(self, s):
pass
def flush(self):
pass
@contextlib.contextmanager
def silence():
"""A context manager that silences sys.stdout and sys.stderr."""
old_stdout = sys.stdout
old_stderr = sys.stderr
sys.stdout = _DummyFile()
sys.stderr = _DummyFile()
exception_occurred = False
try:
yield
    except BaseException:
exception_occurred = True
# Go ahead and clean up so that exception handling can work normally
sys.stdout = old_stdout
sys.stderr = old_stderr
raise
if not exception_occurred:
sys.stdout = old_stdout
sys.stderr = old_stderr
if sys.platform == 'win32':
import ctypes
def _has_hidden_attribute(filepath):
"""
Returns True if the given filepath has the hidden attribute on
MS-Windows. Based on a post here:
http://stackoverflow.com/questions/284115/cross-platform-hidden-file-detection
"""
if isinstance(filepath, bytes):
filepath = filepath.decode(sys.getfilesystemencoding())
try:
attrs = ctypes.windll.kernel32.GetFileAttributesW(filepath)
assert attrs != -1
result = bool(attrs & 2)
except (AttributeError, AssertionError):
result = False
return result
else:
def _has_hidden_attribute(filepath):
return False
def is_path_hidden(filepath):
"""
Determines if a given file or directory is hidden.
Parameters
----------
filepath : str
The path to a file or directory
Returns
-------
hidden : bool
Returns `True` if the file is hidden
"""
name = os.path.basename(os.path.abspath(filepath))
if isinstance(name, bytes):
is_dotted = name.startswith(b'.')
else:
is_dotted = name.startswith('.')
return is_dotted or _has_hidden_attribute(filepath)
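# The portable core of the check above is just the leading-dot test on the
# basename; the Windows attribute check is layered on separately. A
# stand-alone sketch (is_dotted is an illustrative name):

```python
import os

def is_dotted(filepath):
    # A path is "dotted" if the basename of its absolute form starts
    # with '.', for both str and bytes paths.
    name = os.path.basename(os.path.abspath(filepath))
    if isinstance(name, bytes):
        return name.startswith(b'.')
    return name.startswith('.')
```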
def walk_skip_hidden(top, onerror=None, followlinks=False):
"""
A wrapper for `os.walk` that skips hidden files and directories.
This function does not have the parameter `topdown` from
`os.walk`: the directories must always be recursed top-down when
using this function.
See also
--------
os.walk : For a description of the parameters
"""
for root, dirs, files in os.walk(
top, topdown=True, onerror=onerror,
followlinks=followlinks):
# These lists must be updated in-place so os.walk will skip
# hidden directories
dirs[:] = [d for d in dirs if not is_path_hidden(d)]
files[:] = [f for f in files if not is_path_hidden(f)]
yield root, dirs, files
def write_if_different(filename, data):
"""Write `data` to `filename`, if the content of the file is different.
Parameters
----------
filename : str
The file name to be written to.
data : bytes
The data to be written to `filename`.
"""
assert isinstance(data, bytes)
if os.path.exists(filename):
with open(filename, 'rb') as fd:
original_data = fd.read()
else:
original_data = None
if original_data != data:
with open(filename, 'wb') as fd:
fd.write(data)
def import_file(filename, name=None):
"""
Imports a module from a single file as if it doesn't belong to a
particular package.
The returned module will have the optional ``name`` if given, or else
a name generated from the filename.
"""
# Specifying a traditional dot-separated fully qualified name here
# results in a number of "Parent module 'astropy' not found while
# handling absolute import" warnings. Using the same name, the
# namespaces of the modules get merged together. So, this
# generates an underscore-separated name which is more likely to
# be unique, and it doesn't really matter because the name isn't
# used directly here anyway.
    if name is None:
        basename = os.path.splitext(filename)[0]
        name = '_'.join(os.path.relpath(basename).split(os.sep)[1:])
    if not os.path.exists(filename):
        raise ImportError('Could not import file {0}'.format(filename))
    # import_machinery is a module and therefore always truthy, so the old
    # imp-based fallback branch was dead code; load the file directly.
    loader = import_machinery.SourceFileLoader(name, filename)
    mod = loader.load_module()
return mod
def resolve_name(name):
"""Resolve a name like ``module.object`` to an object and return it.
Raise `ImportError` if the module or name is not found.
"""
parts = name.split('.')
cursor = len(parts) - 1
module_name = parts[:cursor]
attr_name = parts[-1]
while cursor > 0:
try:
ret = __import__('.'.join(module_name), fromlist=[attr_name])
break
except ImportError:
if cursor == 0:
raise
cursor -= 1
module_name = parts[:cursor]
attr_name = parts[cursor]
ret = ''
for part in parts[cursor:]:
try:
ret = getattr(ret, part)
except AttributeError:
raise ImportError(name)
return ret
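# An importlib-based sketch of the lookup resolve_name performs: import the
# longest importable prefix, then walk the remaining attributes.
# resolve_dotted is an illustrative name, not part of this module:

```python
import importlib

def resolve_dotted(name):
    parts = name.split('.')
    # Try progressively shorter module prefixes until one imports.
    for cut in range(len(parts) - 1, 0, -1):
        try:
            obj = importlib.import_module('.'.join(parts[:cut]))
        except ImportError:
            continue
        try:
            # Walk the remaining dotted parts as attribute accesses.
            for part in parts[cut:]:
                obj = getattr(obj, part)
        except AttributeError:
            raise ImportError(name)
        return obj
    raise ImportError(name)
```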
def extends_doc(extended_func):
"""
A function decorator for use when wrapping an existing function but adding
additional functionality. This copies the docstring from the original
function, and appends to it (along with a newline) the docstring of the
wrapper function.
Examples
--------
>>> def foo():
... '''Hello.'''
...
>>> @extends_doc(foo)
... def bar():
... '''Goodbye.'''
...
>>> print(bar.__doc__)
Hello.
Goodbye.
"""
def decorator(func):
if not (extended_func.__doc__ is None or func.__doc__ is None):
func.__doc__ = '\n\n'.join([extended_func.__doc__.rstrip('\n'),
func.__doc__.lstrip('\n')])
return func
return decorator
def find_data_files(package, pattern):
"""
Include files matching ``pattern`` inside ``package``.
Parameters
----------
package : str
The package inside which to look for data files
pattern : str
Pattern (glob-style) to match for the data files (e.g. ``*.dat``).
        This supports the ``**`` recursive syntax. For example, ``**/*.fits``
matches all files ending with ``.fits`` recursively. Only one
instance of ``**`` can be included in the pattern.
"""
return glob.glob(os.path.join(package, pattern), recursive=True)
# astropy-healpix-0.5/astropy_helpers/astropy_helpers/version.py
# Autogenerated by Astropy-affiliated package astropy_helpers's setup.py on 2019-11-25 15:34:37 UTC
import datetime
version = "3.2.2"
githash = "ce42e6e238c200a4715785ef8c9d233f612d0c75"
major = 3
minor = 2
bugfix = 2
version_info = (major, minor, bugfix)
release = True
timestamp = datetime.datetime(2019, 11, 25, 15, 34, 37)
debug = False
astropy_helpers_version = ""
# astropy-healpix-0.5/astropy_helpers/astropy_helpers/version_helpers.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
Utilities for generating the version string for Astropy (or an affiliated
package) and the version.py module, which contains version info for the
package.
Within the generated astropy.version module, the `major`, `minor`, and `bugfix`
variables hold the respective parts of the version number (bugfix is '0' if
absent). The `release` variable is True if this is a release, and False if this
is a development version of astropy. For the actual version string, use::
from astropy.version import version
or::
from astropy import __version__
"""
import datetime
import os
import pkgutil
import sys
import time
import warnings
from distutils import log
from configparser import ConfigParser
import pkg_resources
from . import git_helpers
from .distutils_helpers import is_distutils_display_option
from .git_helpers import get_git_devstr
from .utils import AstropyDeprecationWarning, import_file
__all__ = ['generate_version_py']
def _version_split(version):
"""
Split a version string into major, minor, and bugfix numbers. If any of
those numbers are missing the default is zero. Any pre/post release
modifiers are ignored.
    Examples
    --------
>>> _version_split('1.2.3')
(1, 2, 3)
>>> _version_split('1.2')
(1, 2, 0)
>>> _version_split('1.2rc1')
(1, 2, 0)
>>> _version_split('1')
(1, 0, 0)
>>> _version_split('')
(0, 0, 0)
"""
parsed_version = pkg_resources.parse_version(version)
if hasattr(parsed_version, 'base_version'):
# New version parsing for setuptools >= 8.0
if parsed_version.base_version:
parts = [int(part)
for part in parsed_version.base_version.split('.')]
else:
parts = []
else:
parts = []
for part in parsed_version:
if part.startswith('*'):
# Ignore any .dev, a, b, rc, etc.
break
parts.append(int(part))
if len(parts) < 3:
parts += [0] * (3 - len(parts))
    # In principle a version could have more parts (like 1.2.3.4) but we only
    # support major.minor.bugfix here.
    return tuple(parts[:3])
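# The same major/minor/bugfix extraction can be sketched without the
# pkg_resources dependency, using a regex and zero-padding.
# simple_version_split is an illustrative stand-in for _version_split:

```python
import re

def simple_version_split(version):
    # Match up to three leading numeric components; anything after them
    # (rc1, .dev, etc.) is ignored, and missing components default to 0.
    match = re.match(r'(\d+)(?:\.(\d+))?(?:\.(\d+))?', version)
    if match is None:
        return (0, 0, 0)
    return tuple(int(group) if group else 0 for group in match.groups())
```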
# This is used by setup.py to create a new version.py - see that file for
# details. Note that the imports have to be absolute, since this is also used
# by affiliated packages.
_FROZEN_VERSION_PY_TEMPLATE = """
# Autogenerated by {packagetitle}'s setup.py on {timestamp!s} UTC
import datetime
{header}
major = {major}
minor = {minor}
bugfix = {bugfix}
version_info = (major, minor, bugfix)
release = {rel}
timestamp = {timestamp!r}
debug = {debug}
astropy_helpers_version = "{ahver}"
"""[1:]
_FROZEN_VERSION_PY_WITH_GIT_HEADER = """
{git_helpers}
_packagename = "{packagename}"
_last_generated_version = "{verstr}"
_last_githash = "{githash}"
# Determine where the source code for this module
# lives. If __file__ is not a filesystem path then
# it is assumed not to live in a git repo at all.
if _get_repo_path(__file__, levels=len(_packagename.split('.'))):
version = update_git_devstr(_last_generated_version, path=__file__)
githash = get_git_devstr(sha=True, show_warning=False,
path=__file__) or _last_githash
else:
# The file does not appear to live in a git repo so don't bother
# invoking git
version = _last_generated_version
githash = _last_githash
"""[1:]
_FROZEN_VERSION_PY_STATIC_HEADER = """
version = "{verstr}"
githash = "{githash}"
"""[1:]
def _get_version_py_str(packagename, version, githash, release, debug,
uses_git=True):
try:
from astropy_helpers import __version__ as ahver
except ImportError:
ahver = "unknown"
epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time()))
timestamp = datetime.datetime.utcfromtimestamp(epoch)
major, minor, bugfix = _version_split(version)
if packagename.lower() == 'astropy':
packagetitle = 'Astropy'
else:
packagetitle = 'Astropy-affiliated package ' + packagename
header = ''
if uses_git:
header = _generate_git_header(packagename, version, githash)
elif not githash:
        # _generate_git_header will already generate a new git hash for us, but
# for creating a new version.py for a release (even if uses_git=False)
# we still need to get the githash to include in the version.py
# See https://github.com/astropy/astropy-helpers/issues/141
githash = git_helpers.get_git_devstr(sha=True, show_warning=True)
if not header: # If _generate_git_header fails it returns an empty string
header = _FROZEN_VERSION_PY_STATIC_HEADER.format(verstr=version,
githash=githash)
return _FROZEN_VERSION_PY_TEMPLATE.format(packagetitle=packagetitle,
timestamp=timestamp,
header=header,
major=major,
minor=minor,
bugfix=bugfix,
ahver=ahver,
rel=release, debug=debug)
def _generate_git_header(packagename, version, githash):
"""
Generates a header to the version.py module that includes utilities for
probing the git repository for updates (to the current git hash, etc.)
These utilities should only be available in development versions, and not
in release builds.
If this fails for any reason an empty string is returned.
"""
loader = pkgutil.get_loader(git_helpers)
source = loader.get_source(git_helpers.__name__) or ''
source_lines = source.splitlines()
if not source_lines:
log.warn('Cannot get source code for astropy_helpers.git_helpers; '
'git support disabled.')
return ''
idx = 0
for idx, line in enumerate(source_lines):
if line.startswith('# BEGIN'):
break
git_helpers_py = '\n'.join(source_lines[idx + 1:])
verstr = version
new_githash = git_helpers.get_git_devstr(sha=True, show_warning=False)
if new_githash:
githash = new_githash
return _FROZEN_VERSION_PY_WITH_GIT_HEADER.format(
git_helpers=git_helpers_py, packagename=packagename,
verstr=verstr, githash=githash)
def generate_version_py(packagename=None, version=None, release=None, debug=None,
uses_git=None, srcdir='.'):
"""
Generate a version.py file in the package with version information, and
update developer version strings.
This function should normally be called without any arguments. In this case
the package name and version is read in from the ``setup.cfg`` file (from
the ``name`` or ``package_name`` entry and the ``version`` entry in the
``[metadata]`` section).
If the version is a developer version (of the form ``3.2.dev``), the
version string will automatically be expanded to include a sequential
number as a suffix (e.g. ``3.2.dev13312``), and the updated version string
will be returned by this function.
    Based on this updated version string, a ``version.py`` file will be
    generated inside the package, containing the version string as well as
    more detailed information (for example the major, minor, and bugfix
    version numbers, a ``release`` flag indicating whether the current
    version is a stable or developer version, and so on).
"""
if packagename is not None:
warnings.warn('The packagename argument to generate_version_py has '
'been deprecated and will be removed in future. Specify '
'the package name in setup.cfg instead', AstropyDeprecationWarning)
if version is not None:
warnings.warn('The version argument to generate_version_py has '
'been deprecated and will be removed in future. Specify '
'the version number in setup.cfg instead', AstropyDeprecationWarning)
if release is not None:
warnings.warn('The release argument to generate_version_py has '
'been deprecated and will be removed in future. We now '
'use the presence of the "dev" string in the version to '
'determine whether this is a release', AstropyDeprecationWarning)
# We use ConfigParser instead of read_configuration here because the latter
# only reads in keys recognized by setuptools, but we need to access
# package_name below.
conf = ConfigParser()
conf.read('setup.cfg')
    if conf.has_option('metadata', 'name'):
        packagename = conf.get('metadata', 'name')
    elif conf.has_option('metadata', 'package_name'):
        # The package-template used package_name instead of name for a while
        warnings.warn('Specifying the package name using the "package_name" '
                      'option in setup.cfg is deprecated - use the "name" '
                      'option instead.', AstropyDeprecationWarning)
        packagename = conf.get('metadata', 'package_name')
    elif packagename is not None:  # deprecated
        pass
    else:
        sys.stderr.write('ERROR: Could not read package name from setup.cfg\n')
        sys.exit(1)

    if conf.has_option('metadata', 'version'):
        version = conf.get('metadata', 'version')
        add_git_devstr = True
    elif version is not None:  # deprecated
        add_git_devstr = False
    else:
        sys.stderr.write('ERROR: Could not read package version from setup.cfg\n')
        sys.exit(1)

    if release is None:
        release = 'dev' not in version

    if not release and add_git_devstr:
        version += get_git_devstr(False)

    if uses_git is None:
        uses_git = not release

    # In some cases, packages have a - but this is a _ in the module. Since we
    # are only interested in the module here, we replace - by _
    packagename = packagename.replace('-', '_')

    try:
        version_module = get_pkg_version_module(packagename)

        try:
            last_generated_version = version_module._last_generated_version
        except AttributeError:
            last_generated_version = version_module.version

        try:
            last_githash = version_module._last_githash
        except AttributeError:
            last_githash = version_module.githash

        current_release = version_module.release
        current_debug = version_module.debug
    except ImportError:
        version_module = None
        last_generated_version = None
        last_githash = None
        current_release = None
        current_debug = None

    if release is None:
        # Keep whatever the current value is, if it exists
        release = bool(current_release)

    if debug is None:
        # Likewise, keep whatever the current value is, if it exists
        debug = bool(current_debug)

    package_srcdir = os.path.join(srcdir, *packagename.split('.'))
    version_py = os.path.join(package_srcdir, 'version.py')

    if (last_generated_version != version or current_release != release or
            current_debug != debug):
        if '-q' not in sys.argv and '--quiet' not in sys.argv:
            log.set_threshold(log.INFO)

        if is_distutils_display_option():
            # Always silence unnecessary log messages when display options are
            # being used
            log.set_threshold(log.WARN)

        log.info('Freezing version number to {0}'.format(version_py))

        with open(version_py, 'w') as f:
            # This overwrites the actual version.py
            f.write(_get_version_py_str(packagename, version, last_githash,
                                        release, debug, uses_git=uses_git))

    return version
def get_pkg_version_module(packagename, fromlist=None):
    """Returns the package's .version module generated by
    `astropy_helpers.version_helpers.generate_version_py`.  Raises an
    ImportError if the version module is not found.

    If ``fromlist`` is an iterable, return a tuple of the members of the
    version module corresponding to the member names given in ``fromlist``.
    Raises an `AttributeError` if any of these module members are not found.
    """
    version = import_file(os.path.join(packagename, 'version.py'), name='version')

    if fromlist:
        return tuple(getattr(version, member) for member in fromlist)
    else:
        return version
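
# A hedged, standalone sketch (not part of astropy-helpers) of the setup.cfg
# lookup logic used above, written against configparser directly. The helper
# name ``read_name_and_version`` is hypothetical and for illustration only.
#
# import configparser
#
# def read_name_and_version(cfg_text):
#     """Return (name, version) from setup.cfg text, preferring the 'name'
#     option over the deprecated 'package_name' spelling."""
#     conf = configparser.ConfigParser()
#     conf.read_string(cfg_text)
#     if conf.has_option('metadata', 'name'):
#         name = conf.get('metadata', 'name')
#     elif conf.has_option('metadata', 'package_name'):
#         name = conf.get('metadata', 'package_name')  # deprecated spelling
#     else:
#         raise ValueError('Could not read package name from setup.cfg')
#     if not conf.has_option('metadata', 'version'):
#         raise ValueError('Could not read package version from setup.cfg')
#     return name, conf.get('metadata', 'version')

```python
import configparser

def read_name_and_version(cfg_text):
    """Return (name, version) from setup.cfg text, preferring the 'name'
    option over the deprecated 'package_name' spelling.

    Hypothetical helper for illustration; not part of astropy-helpers.
    """
    conf = configparser.ConfigParser()
    conf.read_string(cfg_text)
    # Mirror the lookup order of generate_version_py above.
    if conf.has_option('metadata', 'name'):
        name = conf.get('metadata', 'name')
    elif conf.has_option('metadata', 'package_name'):
        name = conf.get('metadata', 'package_name')  # deprecated spelling
    else:
        raise ValueError('Could not read package name from setup.cfg')
    if not conf.has_option('metadata', 'version'):
        raise ValueError('Could not read package version from setup.cfg')
    return name, conf.get('metadata', 'version')
```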
astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/PKG-INFO:
Metadata-Version: 2.1
Name: astropy-helpers
Version: 3.2.2
Summary: Utilities for building and installing packages in the Astropy ecosystem
Home-page: https://github.com/astropy/astropy-helpers
Author: The Astropy Developers
Author-email: astropy.team@gmail.com
License: BSD 3-Clause License
Description: astropy-helpers
===============
.. image:: https://travis-ci.org/astropy/astropy-helpers.svg
:target: https://travis-ci.org/astropy/astropy-helpers
.. image:: https://ci.appveyor.com/api/projects/status/rt9161t9mhx02xp7/branch/master?svg=true
:target: https://ci.appveyor.com/project/Astropy/astropy-helpers
.. image:: https://codecov.io/gh/astropy/astropy-helpers/branch/master/graph/badge.svg
:target: https://codecov.io/gh/astropy/astropy-helpers
The **astropy-helpers** package includes many build, installation, and
documentation-related tools used by the Astropy project, but packaged separately
for use by other projects that wish to leverage this work. The motivation behind
this package and details of its implementation are in the accepted
`Astropy Proposal for Enhancement (APE) 4 `_.
Astropy-helpers is not a traditional package in the sense that it is not
intended to be installed directly by users or developers. Instead, it is meant
to be accessed when the ``setup.py`` command is run - see the "Using
astropy-helpers in a package" section in the documentation for how
to do this. For a real-life example of how to implement astropy-helpers in a
project, see the ``setup.py`` and ``setup.cfg`` files of the
`Affiliated package template `_.
For more information, see the documentation at http://astropy-helpers.readthedocs.io
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Framework :: Setuptools Plugin
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Topic :: Software Development :: Build Tools
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: System :: Archiving :: Packaging
Provides: astropy_helpers
Requires-Python: >=3.5
Provides-Extra: docs
astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/SOURCES.txt:
CHANGES.rst
LICENSE.rst
MANIFEST.in
README.rst
ah_bootstrap.py
setup.cfg
setup.py
astropy_helpers/__init__.py
astropy_helpers/conftest.py
astropy_helpers/distutils_helpers.py
astropy_helpers/git_helpers.py
astropy_helpers/openmp_helpers.py
astropy_helpers/setup_helpers.py
astropy_helpers/utils.py
astropy_helpers/version.py
astropy_helpers/version_helpers.py
astropy_helpers.egg-info/PKG-INFO
astropy_helpers.egg-info/SOURCES.txt
astropy_helpers.egg-info/dependency_links.txt
astropy_helpers.egg-info/not-zip-safe
astropy_helpers.egg-info/requires.txt
astropy_helpers.egg-info/top_level.txt
astropy_helpers/commands/__init__.py
astropy_helpers/commands/_dummy.py
astropy_helpers/commands/build_ext.py
astropy_helpers/commands/build_sphinx.py
astropy_helpers/commands/test.py
astropy_helpers/commands/src/compiler.c
astropy_helpers/sphinx/__init__.py
astropy_helpers/sphinx/conf.py
licenses/LICENSE_ASTROSCRAPPY.rst

astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/dependency_links.txt: (empty)

astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/not-zip-safe: (empty)

astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/requires.txt:
[docs]
sphinx-astropy
astropy-healpix-0.5/astropy_helpers/astropy_helpers.egg-info/top_level.txt:
astropy_helpers
astropy-healpix-0.5/astropy_helpers/licenses/LICENSE_ASTROSCRAPPY.rst:
# The OpenMP helpers include code heavily adapted from astroscrappy, released
# under the following license:
#
# Copyright (c) 2015, Curtis McCully
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice, this
# list of conditions and the following disclaimer in the documentation and/or
# other materials provided with the distribution.
# * Neither the name of the Astropy Team nor the names of its contributors may be
# used to endorse or promote products derived from this software without
# specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
# ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
# ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
astropy-healpix-0.5/cextern/.gitignore:
astrometry.net/test_healpix
astrometry.net/example
astropy-healpix-0.5/cextern/README.md:
# astropy-healpix/cextern/
The `astropy-healpix` Python package is a wrapper around a C library.
See http://astropy-healpix.readthedocs.io/en/latest/about.html
This README gives some technical details on the C code here.
- The main files are `healpix.h` and `healpix.c`; start reading there first.
- For the Python `astropy-healpix` package, the C code is built via `setup.py`.
- However, to help work on the C code and test it directly, a `Makefile`
is included here.
- For testing, a copy of `CuTest.h` and `CuTest.c` from here is bundled:
https://github.com/asimjalis/cutest
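
To build and exercise the C code directly (outside of `setup.py`), the bundled
`Makefile` can be used. A minimal sketch of the workflow, assuming GNU `make`
and a C compiler are available on your PATH:

```shell
cd cextern/astrometry.net
make test_healpix   # compile the HEALPix sources and the CuTest-based tests
./test_healpix      # run the C unit tests
```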
astropy-healpix-0.5/cextern/astrometry.net/CuTest.c:
#include <assert.h>
#include <setjmp.h>
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <math.h>
#include "CuTest.h"
/*-------------------------------------------------------------------------*
* CuStr
*-------------------------------------------------------------------------*/
char* CuStrAlloc(int size)
{
char* newStr = (char*) malloc( sizeof(char) * (size) );
return newStr;
}
char* CuStrCopy(const char* old)
{
int len = strlen(old);
char* newStr = CuStrAlloc(len + 1);
strcpy(newStr, old);
return newStr;
}
/*-------------------------------------------------------------------------*
* CuString
*-------------------------------------------------------------------------*/
void CuStringInit(CuString* str)
{
str->length = 0;
str->size = STRING_MAX;
str->buffer = (char*) malloc(sizeof(char) * str->size);
str->buffer[0] = '\0';
}
CuString* CuStringNew(void)
{
CuString* str = (CuString*) malloc(sizeof(CuString));
str->length = 0;
str->size = STRING_MAX;
str->buffer = (char*) malloc(sizeof(char) * str->size);
str->buffer[0] = '\0';
return str;
}
void CuStringDelete(CuString *str)
{
if (!str) return;
free(str->buffer);
free(str);
}
void CuStringResize(CuString* str, int newSize)
{
str->buffer = (char*) realloc(str->buffer, sizeof(char) * newSize);
str->size = newSize;
}
void CuStringAppend(CuString* str, const char* text)
{
int length;
if (text == NULL) {
text = "NULL";
}
length = strlen(text);
if (str->length + length + 1 >= str->size)
CuStringResize(str, str->length + length + 1 + STRING_INC);
str->length += length;
strcat(str->buffer, text);
}
void CuStringAppendChar(CuString* str, char ch)
{
char text[2];
text[0] = ch;
text[1] = '\0';
CuStringAppend(str, text);
}
void CuStringAppendFormat(CuString* str, const char* format, ...)
{
va_list argp;
char buf[HUGE_STRING_LEN];
va_start(argp, format);
vsprintf(buf, format, argp);
va_end(argp);
CuStringAppend(str, buf);
}
void CuStringInsert(CuString* str, const char* text, int pos)
{
int length = strlen(text);
if (pos > str->length)
pos = str->length;
if (str->length + length + 1 >= str->size)
CuStringResize(str, str->length + length + 1 + STRING_INC);
memmove(str->buffer + pos + length, str->buffer + pos, (str->length - pos) + 1);
str->length += length;
memcpy(str->buffer + pos, text, length);
}
/*-------------------------------------------------------------------------*
* CuTest
*-------------------------------------------------------------------------*/
void CuTestInit(CuTest* t, const char* name, TestFunction function)
{
t->name = CuStrCopy(name);
t->failed = 0;
t->ran = 0;
t->message = NULL;
t->function = function;
t->jumpBuf = NULL;
}
CuTest* CuTestNew(const char* name, TestFunction function)
{
CuTest* tc = CU_ALLOC(CuTest);
CuTestInit(tc, name, function);
return tc;
}
void CuTestDelete(CuTest *t)
{
if (!t) return;
free(t->name);
free(t);
}
void CuTestRun(CuTest* tc)
{
jmp_buf buf;
tc->jumpBuf = &buf;
if (setjmp(buf) == 0)
{
tc->ran = 1;
(tc->function)(tc);
}
tc->jumpBuf = 0;
}
static void CuFailInternal(CuTest* tc, const char* file, int line, CuString* string)
{
char buf[HUGE_STRING_LEN];
sprintf(buf, "%s:%d: ", file, line);
CuStringInsert(string, buf, 0);
tc->failed = 1;
tc->message = string->buffer;
if (tc->jumpBuf != 0) longjmp(*(tc->jumpBuf), 0);
}
void CuFail_Line(CuTest* tc, const char* file, int line, const char* message2, const char* message)
{
CuString string;
CuStringInit(&string);
if (message2 != NULL)
{
CuStringAppend(&string, message2);
CuStringAppend(&string, ": ");
}
CuStringAppend(&string, message);
CuFailInternal(tc, file, line, &string);
}
void CuAssert_Line(CuTest* tc, const char* file, int line, const char* message, int condition)
{
if (condition) return;
CuFail_Line(tc, file, line, NULL, message);
}
void CuAssertStrEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message,
const char* expected, const char* actual)
{
CuString string;
if ((expected == NULL && actual == NULL) ||
(expected != NULL && actual != NULL &&
strcmp(expected, actual) == 0))
{
return;
}
CuStringInit(&string);
if (message != NULL)
{
CuStringAppend(&string, message);
CuStringAppend(&string, ": ");
}
CuStringAppend(&string, "expected <");
CuStringAppend(&string, expected);
CuStringAppend(&string, "> but was <");
CuStringAppend(&string, actual);
CuStringAppend(&string, ">");
CuFailInternal(tc, file, line, &string);
}
void CuAssertIntEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message,
int expected, int actual)
{
char buf[STRING_MAX];
if (expected == actual) return;
sprintf(buf, "expected <%d> but was <%d>", expected, actual);
CuFail_Line(tc, file, line, message, buf);
}
void CuAssertDblEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message,
double expected, double actual, double delta)
{
char buf[STRING_MAX];
if (fabs(expected - actual) <= delta) return;
/* sprintf(buf, "expected <%lf> but was <%lf>", expected, actual); */
sprintf(buf, "expected <%f> but was <%f>", expected, actual);
CuFail_Line(tc, file, line, message, buf);
}
void CuAssertPtrEquals_LineMsg(CuTest* tc, const char* file, int line, const char* message,
void* expected, void* actual)
{
char buf[STRING_MAX];
if (expected == actual) return;
sprintf(buf, "expected pointer <0x%p> but was <0x%p>", expected, actual);
CuFail_Line(tc, file, line, message, buf);
}
/*-------------------------------------------------------------------------*
* CuSuite
*-------------------------------------------------------------------------*/
void CuSuiteInit(CuSuite* testSuite)
{
testSuite->count = 0;
testSuite->failCount = 0;
memset(testSuite->list, 0, sizeof(testSuite->list));
}
CuSuite* CuSuiteNew(void)
{
CuSuite* testSuite = CU_ALLOC(CuSuite);
CuSuiteInit(testSuite);
return testSuite;
}
void CuSuiteDelete(CuSuite *testSuite)
{
unsigned int n;
for (n=0; n < MAX_TEST_CASES; n++)
{
if (testSuite->list[n])
{
CuTestDelete(testSuite->list[n]);
}
}
free(testSuite);
}
void CuSuiteAdd(CuSuite* testSuite, CuTest *testCase)
{
assert(testSuite->count < MAX_TEST_CASES);
testSuite->list[testSuite->count] = testCase;
testSuite->count++;
}
void CuSuiteAddSuite(CuSuite* testSuite, CuSuite* testSuite2)
{
int i;
for (i = 0 ; i < testSuite2->count ; ++i)
{
CuTest* testCase = testSuite2->list[i];
CuSuiteAdd(testSuite, testCase);
}
}
void CuSuiteRun(CuSuite* testSuite)
{
int i;
for (i = 0 ; i < testSuite->count ; ++i)
{
CuTest* testCase = testSuite->list[i];
CuTestRun(testCase);
if (testCase->failed) { testSuite->failCount += 1; }
}
}
void CuSuiteSummary(CuSuite* testSuite, CuString* summary)
{
int i;
for (i = 0 ; i < testSuite->count ; ++i)
{
CuTest* testCase = testSuite->list[i];
CuStringAppend(summary, testCase->failed ? "F" : ".");
}
CuStringAppend(summary, "\n\n");
}
void CuSuiteDetails(CuSuite* testSuite, CuString* details)
{
int i;
int failCount = 0;
if (testSuite->failCount == 0)
{
int passCount = testSuite->count - testSuite->failCount;
const char* testWord = passCount == 1 ? "test" : "tests";
CuStringAppendFormat(details, "OK (%d %s)\n", passCount, testWord);
}
else
{
if (testSuite->failCount == 1)
CuStringAppend(details, "There was 1 failure:\n");
else
CuStringAppendFormat(details, "There were %d failures:\n", testSuite->failCount);
for (i = 0 ; i < testSuite->count ; ++i)
{
CuTest* testCase = testSuite->list[i];
if (testCase->failed)
{
failCount++;
CuStringAppendFormat(details, "%d) %s: %s\n",
failCount, testCase->name, testCase->message);
}
}
CuStringAppend(details, "\n!!!FAILURES!!!\n");
CuStringAppendFormat(details, "Runs: %d ", testSuite->count);
CuStringAppendFormat(details, "Passes: %d ", testSuite->count - testSuite->failCount);
CuStringAppendFormat(details, "Fails: %d\n", testSuite->failCount);
}
}
astropy-healpix-0.5/cextern/astrometry.net/CuTest.h:
#ifndef CU_TEST_H
#define CU_TEST_H
#include <setjmp.h>
#include <stdarg.h>
/* CuString */
char* CuStrAlloc(int size);
char* CuStrCopy(const char* old);
#define CU_ALLOC(TYPE) ((TYPE*) malloc(sizeof(TYPE)))
#define HUGE_STRING_LEN 8192
#define STRING_MAX 256
#define STRING_INC 256
typedef struct
{
int length;
int size;
char* buffer;
} CuString;
void CuStringInit(CuString* str);
CuString* CuStringNew(void);
void CuStringRead(CuString* str, const char* path);
void CuStringAppend(CuString* str, const char* text);
void CuStringAppendChar(CuString* str, char ch);
void CuStringAppendFormat(CuString* str, const char* format, ...);
void CuStringInsert(CuString* str, const char* text, int pos);
void CuStringResize(CuString* str, int newSize);
void CuStringDelete(CuString* str);
/* CuTest */
typedef struct CuTest CuTest;
typedef void (*TestFunction)(CuTest *);
struct CuTest
{
char* name;
TestFunction function;
int failed;
int ran;
const char* message;
jmp_buf *jumpBuf;
};
void CuTestInit(CuTest* t, const char* name, TestFunction function);
CuTest* CuTestNew(const char* name, TestFunction function);
void CuTestRun(CuTest* tc);
void CuTestDelete(CuTest *t);
/* Internal versions of assert functions -- use the public versions */
void CuFail_Line(CuTest* tc, const char* file, int line, const char* message2, const char* message);
void CuAssert_Line(CuTest* tc, const char* file, int line, const char* message, int condition);
void CuAssertStrEquals_LineMsg(CuTest* tc,
const char* file, int line, const char* message,
const char* expected, const char* actual);
void CuAssertIntEquals_LineMsg(CuTest* tc,
const char* file, int line, const char* message,
int expected, int actual);
void CuAssertDblEquals_LineMsg(CuTest* tc,
const char* file, int line, const char* message,
double expected, double actual, double delta);
void CuAssertPtrEquals_LineMsg(CuTest* tc,
const char* file, int line, const char* message,
void* expected, void* actual);
/* public assert functions */
#define CuFail(tc, ms) CuFail_Line( (tc), __FILE__, __LINE__, NULL, (ms))
#define CuAssert(tc, ms, cond) CuAssert_Line((tc), __FILE__, __LINE__, (ms), (cond))
#define CuAssertTrue(tc, cond) CuAssert_Line((tc), __FILE__, __LINE__, "assert failed", (cond))
#define CuAssertStrEquals(tc,ex,ac) CuAssertStrEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac))
#define CuAssertStrEquals_Msg(tc,ms,ex,ac) CuAssertStrEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac))
#define CuAssertIntEquals(tc,ex,ac) CuAssertIntEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac))
#define CuAssertIntEquals_Msg(tc,ms,ex,ac) CuAssertIntEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac))
#define CuAssertDblEquals(tc,ex,ac,dl) CuAssertDblEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac),(dl))
#define CuAssertDblEquals_Msg(tc,ms,ex,ac,dl) CuAssertDblEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac),(dl))
#define CuAssertPtrEquals(tc,ex,ac) CuAssertPtrEquals_LineMsg((tc),__FILE__,__LINE__,NULL,(ex),(ac))
#define CuAssertPtrEquals_Msg(tc,ms,ex,ac) CuAssertPtrEquals_LineMsg((tc),__FILE__,__LINE__,(ms),(ex),(ac))
#define CuAssertPtrNotNull(tc,p) CuAssert_Line((tc),__FILE__,__LINE__,"null pointer unexpected",(p != NULL))
#define CuAssertPtrNotNullMsg(tc,msg,p) CuAssert_Line((tc),__FILE__,__LINE__,(msg),(p != NULL))
/* CuSuite */
#define MAX_TEST_CASES 1024
#define SUITE_ADD_TEST(SUITE,TEST) CuSuiteAdd(SUITE, CuTestNew(#TEST, TEST))
typedef struct
{
int count;
CuTest* list[MAX_TEST_CASES];
int failCount;
} CuSuite;
void CuSuiteInit(CuSuite* testSuite);
CuSuite* CuSuiteNew(void);
void CuSuiteDelete(CuSuite *testSuite);
void CuSuiteAdd(CuSuite* testSuite, CuTest *testCase);
void CuSuiteAddSuite(CuSuite* testSuite, CuSuite* testSuite2);
void CuSuiteRun(CuSuite* testSuite);
void CuSuiteSummary(CuSuite* testSuite, CuString* summary);
void CuSuiteDetails(CuSuite* testSuite, CuString* details);
#endif /* CU_TEST_H */
astropy-healpix-0.5/cextern/astrometry.net/Makefile:
all: test_healpix
OBJS = healpix-utils.o healpix.o starutil.o permutedsort.o mathutil.o bl.o qsort_reentrant.o
HEADERS = healpix-utils.h healpix.h
$(OBJS): %.o: %.c $(HEADERS)
	$(CC) -o $@ -c $<
%.o: %.c
	$(CC) -o $@ -c $<
test_healpix: test_healpix-main.c test_healpix.c $(OBJS) CuTest.o
example: example.c $(OBJS)
astropy-healpix-0.5/cextern/astrometry.net/an-bool.h:
/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
#ifndef AN_BOOL_H
#define AN_BOOL_H
#ifdef _MSC_VER
#if _MSC_VER >= 1600
#include <stdint.h>
#else
#include <stdint.h>
#endif
#else
#include <stdint.h>
#endif
#ifndef TRUE
#define TRUE 1
#endif
#ifndef FALSE
#define FALSE 0
#endif
// This helps unconfuse SWIG; it doesn't seem to like uint8_t
typedef unsigned char anbool;
#endif
astropy-healpix-0.5/cextern/astrometry.net/bl-nl.c:
/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
/**
Defined:
--nl
--number
--NL_PRINT(x) prints number 'x'
Note:
--You can't declare multiple "number" variables like this:
number n1, n2;
Instead, do:
number n1;
number n2;
This is because "number" may be a pointer type.
*/
#include "bl-nl.ph"
#define NODE_NUMDATA(node) ((number*)NODE_DATA(node))
number* NLF(to_array)(nl* list) {
number* arr;
size_t N;
if (!list)
return NULL;
N = NLF(size)(list);
arr = malloc(N * sizeof(number));
bl_copy(list, 0, N, arr);
return arr;
}
#define InlineDefine InlineDefineC
#include "bl-nl.inc"
#undef InlineDefine
static int NLF(compare_ascending)(const void* v1, const void* v2) {
number i1 = *(number*)v1;
number i2 = *(number*)v2;
if (i1 > i2) return 1;
else if (i1 < i2) return -1;
else return 0;
}
static int NLF(compare_descending)(const void* v1, const void* v2) {
number i1 = *(number*)v1;
number i2 = *(number*)v2;
if (i1 > i2) return -1;
else if (i1 < i2) return 1;
else return 0;
}
void NLF(reverse)(nl* list) {
bl_reverse(list);
}
void NLF(append_array)(nl* list, const number* data, size_t ndata) {
size_t i;
for (i=0; i<ndata; i++)
NLF(append)(list, data[i]);
}
nl* NLF(merge_ascending)(nl* list1, nl* list2) {
nl* res;
size_t N1, N2, i1, i2;
number v1;
number v2;
int getv1, getv2;
res = NLF(new)(list1->blocksize);
N1 = NLF(size)(list1);
N2 = NLF(size)(list2);
i1 = i2 = 0;
getv1 = getv2 = 1;
while (i1 < N1 && i2 < N2) {
if (getv1) {
v1 = NLF(get)(list1, i1);
getv1 = 0;
}
if (getv2) {
getv2 = 0;
v2 = NLF(get)(list2, i2);
}
if (v1 <= v2) {
NLF(append)(res, v1);
i1++;
getv1 = 1;
} else {
NLF(append)(res, v2);
i2++;
getv2 = 1;
}
}
for (; i1<N1; i1++)
NLF(append)(res, NLF(get)(list1, i1));
for (; i2<N2; i2++)
NLF(append)(res, NLF(get)(list2, i2));
return res;
}
number NLF(pop)(nl* nlist) {
number ret = NLF(get)(nlist, nlist->N-1);
bl_remove_index(nlist, nlist->N-1);
return ret;
}
nl* NLF(dupe)(nl* nlist) {
nl* ret = NLF(new)(nlist->blocksize);
size_t i;
for (i=0; i<nlist->N; i++)
NLF(push)(ret, NLF(get)(nlist, i));
return ret;
}
ptrdiff_t NLF(remove_value)(nl* nlist, const number value) {
bl* list = nlist;
bl_node *node, *prev;
size_t istart = 0;
for (node=list->head, prev=NULL;
node;
prev=node, node=node->next) {
int i;
number* idat;
idat = NODE_DATA(node);
for (i=0; i<node->N; i++)
if (idat[i] == value) {
bl_remove_from_node(list, node, prev, i);
list->last_access = prev;
list->last_access_n = istart;
return istart + i;
}
istart += node->N;
}
return BL_NOT_FOUND;
}
void NLF(remove_all)(nl* list) {
bl_remove_all(list);
}
void NLF(remove_index_range)(nl* list, size_t start, size_t length) {
bl_remove_index_range(list, start, length);
}
void NLF(set)(nl* list, size_t index, const number value) {
bl_set(list, index, &value);
}
/*
void dl_set(dl* list, int index, double value) {
int i;
int nadd = (index+1) - list->N;
if (nadd > 0) {
// enlarge the list to hold 'nadd' more entries.
for (i=0; i<nadd; i++) {
dl_append(list, 0.0);
}
}
bl_set(list, index, &value);
}
*/
// find the index of the last element in the node that is <= n (or -1).
static ptrdiff_t NLF(binarysearch)(bl_node* node, const number n) {
number* iarray = NODE_NUMDATA(node);
ptrdiff_t lower = -1;
ptrdiff_t upper = node->N;
ptrdiff_t mid;
while (lower < (upper-1)) {
mid = (upper + lower) / 2;
if (n >= iarray[mid])
lower = mid;
else
upper = mid;
}
return lower;
}
// find the first node for which n <= the last element.
static bl_node* NLF(findnodecontainingsorted)(const nl* list, const number n,
size_t* p_nskipped) {
bl_node *node;
size_t nskipped;
//bl_node *prev;
//int prevnskipped;
// check if we can use the jump accessor or if we have to start at
// the beginning...
if (list->last_access && list->last_access->N &&
// is the value we're looking for >= the first element?
(n >= *NODE_NUMDATA(list->last_access))) {
node = list->last_access;
nskipped = list->last_access_n;
} else {
node = list->head;
nskipped = 0;
}
/*
// find the first node for which n < the first element. The
// previous node will contain the value (if it exists).
for (prev=node, prevnskipped=nskipped;
node && (n < *NODE_NUMDATA(node));) {
prev=node;
prevnskipped=nskipped;
nskipped+=node->N;
node=node->next;
}
if (prev && n <= NODE_NUMDATA(prev)[prev->N-1]) {
if (p_nskipped)
*p_nskipped = prevnskipped;
return prev;
}
if (node && n <= NODE_NUMDATA(node)[node->N-1]) {
if (p_nskipped)
*p_nskipped = nskipped;
return node;
}
return NULL;
*/
/*
if (!node && prev && n > NODE_NUMDATA(prev)[prev->N-1])
return NULL;
if (p_nskipped)
*p_nskipped = prevnskipped;
return prev;
*/
for (; node && (n > NODE_NUMDATA(node)[node->N-1]); node=node->next)
nskipped += node->N;
if (p_nskipped)
*p_nskipped = nskipped;
return node;
}
static ptrdiff_t NLF(insertascending)(nl* list, const number n, int unique) {
bl_node *node;
size_t ind;
size_t nskipped;
node = NLF(findnodecontainingsorted)(list, n, &nskipped);
if (!node) {
NLF(append)(list, n);
return list->N-1;
}
/*
for (; node && (n > NODE_NUMDATA(node)[node->N-1]); node=node->next)
nskipped += node->N;
if (!node) {
// either we're adding the first element, or we're appending since
// n is bigger than the largest element in the list.
NLF(append)(list, n);
return list->N-1;
}
*/
// find where in the node it should be inserted...
ind = 1 + NLF(binarysearch)(node, n);
// check if it's a duplicate...
if (unique && ind > 0 && (n == NODE_NUMDATA(node)[ind-1]))
return BL_NOT_FOUND;
// set the jump accessors...
list->last_access = node;
list->last_access_n = nskipped;
// ... so that this runs in O(1).
bl_insert(list, nskipped + ind, &n);
return nskipped + ind;
}
size_t NLF(insert_ascending)(nl* list, const number n) {
return NLF(insertascending)(list, n, 0);
}
ptrdiff_t NLF(insert_unique_ascending)(nl* list, const number n) {
return NLF(insertascending)(list, n, 1);
}
size_t NLF(insert_descending)(nl* list, const number n) {
return bl_insert_sorted(list, &n, NLF(compare_descending));
}
void NLF(insert)(nl* list, size_t indx, const number data) {
bl_insert(list, indx, &data);
}
void NLF(copy)(nl* list, size_t start, size_t length, number* vdest) {
bl_copy(list, start, length, vdest);
}
void NLF(print)(nl* list) {
bl_node* n;
for (n=list->head; n; n=n->next) {
int i;
printf("[ ");
for (i=0; i<n->N; i++) {
if (i > 0)
printf(", ");
NL_PRINT(NODE_NUMDATA(n)[i]);
}
printf("] ");
}
}
ptrdiff_t NLF(index_of)(nl* list, const number data) {
bl_node* n;
number* idata;
size_t npast = 0;
for (n=list->head; n; n=n->next) {
int i;
idata = NODE_NUMDATA(n);
for (i=0; i<n->N; i++)
if (idata[i] == data)
return npast + i;
npast += n->N;
}
return BL_NOT_FOUND;
}
int NLF(contains)(nl* list, const number data) {
return (NLF(index_of)(list, data) != BL_NOT_FOUND);
}
int NLF(sorted_contains)(nl* list, const number n) {
return NLF(sorted_index_of)(list, n) != BL_NOT_FOUND;
}
ptrdiff_t NLF(sorted_index_of)(nl* list, const number n) {
bl_node *node;
ptrdiff_t lower;
size_t nskipped;
node = NLF(findnodecontainingsorted)(list, n, &nskipped);
if (!node)
return BL_NOT_FOUND;
//if (!node && (n > NODE_NUMDATA(prev)[prev->N-1]))
//return -1;
//node = prev;
/*
// find the first node for which n <= the last element. That node
// will contain the value (if it exists)
for (; node && (n > NODE_NUMDATA(node)[node->N-1]); node=node->next)
nskipped += node->N;
if (!node)
return -1;
*/
// update jump accessors...
list->last_access = node;
list->last_access_n = nskipped;
// find within the node...
lower = NLF(binarysearch)(node, n);
if (lower == BL_NOT_FOUND)
return BL_NOT_FOUND;
if (n == NODE_NUMDATA(node)[lower])
return nskipped + lower;
return BL_NOT_FOUND;
}
#undef NLF
#undef NODE_NUMDATA
astropy-healpix-0.5/cextern/astrometry.net/bl-nl.h:
/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
/**
Common header for lists of numerical types.
Expects "nl" to be #defined to the list type.
Expects "number" to be #defined to the numerical type.
*/
#include "bl-nl.ph"
typedef bl nl;
// The "const number"s in here are mostly for pl.
Malloc nl* NLF(new)(int blocksize);
Pure InlineDeclare size_t NLF(size)(const nl* list);
void NLF(new_existing)(nl* list, int blocksize);
void NLF(init)(nl* list, int blocksize);
void NLF(reverse)(nl* list);
void NLF(remove_all)(nl* list);
void NLF(remove_all_reuse)(nl* list);
void NLF(free)(nl* list);
number* NLF(append)(nl* list, const number data);
void NLF(append_list)(nl* list, nl* list2);
void NLF(append_array)(nl* list, const number* data, size_t ndata);
void NLF(merge_lists)(nl* list1, nl* list2);
void NLF(push)(nl* list, const number data);
number NLF(pop)(nl* list);
int NLF(contains)(nl* list, const number data);
// Assuming the list is sorted in ascending order,
// does it contain the given number?
int NLF(sorted_contains)(nl* list, const number data);
// Or -1 if not found.
ptrdiff_t NLF(sorted_index_of)(nl* list, const number data);
#if DEFINE_SORT
void NLF(sort)(nl* list, int ascending);
#endif
Malloc number* NLF(to_array)(nl* list);
// Returns the index in the list of the given number, or -1 if it
// is not found.
ptrdiff_t NLF(index_of)(nl* list, const number data);
InlineDeclare number NLF(get)(nl* list, size_t n);
InlineDeclare number NLF(get_const)(const nl* list, size_t n);
InlineDeclare number* NLF(access)(nl* list, size_t n);
/**
Copy from the list, starting at index "start" for length "length",
into the provided array.
*/
void NLF(copy)(nl* list, size_t start, size_t length, number* vdest);
nl* NLF(dupe)(nl* list);
void NLF(print)(nl* list);
void NLF(insert)(nl* list, size_t indx, const number data);
size_t NLF(insert_ascending)(nl* list, const number n);
size_t NLF(insert_descending)(nl* list, const number n);
// Returns the index at which the element was added, or -1 if it's a duplicate.
ptrdiff_t NLF(insert_unique_ascending)(nl* list, const number p);
void NLF(set)(nl* list, size_t ind, const number value);
void NLF(remove)(nl* list, size_t ind);
void NLF(remove_index_range)(nl* list, size_t start, size_t length);
// See also sorted_index_of, which should be faster.
// Or -1 if not found
ptrdiff_t NLF(find_index_ascending)(nl* list, const number value);
nl* NLF(merge_ascending)(nl* list1, nl* list2);
// returns the index of the removed value, or -1 if it didn't
// exist in the list.
ptrdiff_t NLF(remove_value)(nl* list, const number value);
int NLF(check_consistency)(nl* list);
int NLF(check_sorted_ascending)(nl* list, int isunique);
int NLF(check_sorted_descending)(nl* list, int isunique);
astropy-healpix-0.5/cextern/astrometry.net/bl-nl.inc
/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
#include "bl-nl.ph"
InlineDefine
number NLF(get)(nl* list, size_t n) {
number* ptr = (number*)bl_access(list, n);
return *ptr;
}
InlineDefine
number NLF(get_const)(const nl* list, size_t n) {
number* ptr = (number*)bl_access_const(list, n);
return *ptr;
}
InlineDefine
size_t NLF(size)(const nl* list) {
return bl_size(list);
}
InlineDefine
number* NLF(access)(nl* list, size_t j) {
return (number*)bl_access(list, j);
}
astropy-healpix-0.5/cextern/astrometry.net/bl-nl.ph
/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
//#define NODE_NUMDATA(node) ((number*)NODE_DATA(node))
#define NLFGLUE2(n,f) n ## _ ## f
#define NLFGLUE(n,f) NLFGLUE2(n,f)
#define NLF(func) NLFGLUE(nl, func)
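The two-step glue above (`NLFGLUE2` doing the paste, `NLFGLUE` adding a level of indirection) exists so that `nl` is macro-expanded to its current value before `##` concatenates tokens. A standalone demonstration of the same trick; `GLUE`, `STR`, and the generated `il_append` are illustrative names, not part of this library:

```c
#include <assert.h>
#include <string.h>

/* Indirection forces argument expansion before token pasting. */
#define GLUE2(a,b) a ## _ ## b
#define GLUE(a,b) GLUE2(a,b)

#define nl il
#define NLF(func) GLUE(nl, func)

/* NLF(append) expands to il_append: */
static int NLF(append)(int x) { return x + 1; }

/* Stringize after expansion, to inspect the generated name. */
#define STR2(x) #x
#define STR(x) STR2(x)
```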
astropy-healpix-0.5/cextern/astrometry.net/bl.c
/*
# This file is part of the Astrometry.net suite.
# Licensed under a 3-clause BSD style license - see LICENSE
*/
#define _GNU_SOURCE /* for GNU extension vasprintf() */
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <assert.h>
#include <stdarg.h>
#include "bl.h"
#include "keywords.h"
#include "bl.ph"
static bl_node* bl_new_node(bl* list);
static void bl_remove_from_node(bl* list, bl_node* node,
bl_node* prev, int index_in_node);
// NOTE: this should be replaced by a proper implementation!
#ifdef _MSC_VER
int vasprintf(char **strp, const char *fmt, va_list ap) {return -1;}
#endif
// Defined in bl.ph (private header):
// free_node
// NODE_DATA
// NODE_CHARDATA
// NODE_INTDATA
// NODE_DOUBLEDATA
// Defined in bl.inc (inlined functions):
// bl_size
// bl_access
// il_size
// il_get
// NOTE!, if you make changes here, also see bl-sort.c !
//#define DEFINE_SORT 1
#define DEFINE_SORT 0
#define nl il
#define number int
#define NL_PRINT(x) printf("%i", x)
#include "bl-nl.c"
#undef nl
#undef number
#undef NL_PRINT
#define nl ll
#define number int64_t
#define NL_PRINT(x) printf("%lli", (long long int)x)
#include "bl-nl.c"
#undef nl
#undef number
#undef NL_PRINT
#define nl fl
#define number float
#define NL_PRINT(x) printf("%f", (float)x)
#include "bl-nl.c"
#undef nl
#undef number
#undef NL_PRINT
#define nl dl
#define number double
#define NL_PRINT(x) printf("%g", x)
#include "bl-nl.c"
#undef nl
#undef number
#undef NL_PRINT
#undef DEFINE_SORT
#define DEFINE_SORT 0
#define nl pl
#define number void*
#define NL_PRINT(x) printf("%p", x)
#include "bl-nl.c"
#undef nl
#undef number
#undef NL_PRINT
#undef DEFINE_SORT
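The block above stamps out one concrete list type per element type (il, ll, fl, dl, pl) by redefining `nl` and `number` and re-including the bl-nl.c template. A single-file sketch of the same pattern, with a macro standing in for the included template file; `DEFINE_SUM` and the generated `il_sum`/`dl_sum` are hypothetical names for illustration:

```c
#include <assert.h>
#include <stddef.h>

/* One-macro stand-in for the bl-nl.c template body, reduced to a
 * single function: sum over an array of 'number'. */
#define DEFINE_SUM(prefix, number)                          \
    static number prefix##_sum(const number* a, size_t n) { \
        number total = 0;                                   \
        size_t i;                                           \
        for (i = 0; i < n; i++)                             \
            total += a[i];                                  \
        return total;                                       \
    }

/* "Instantiate" once per element type, as bl.c does by redefining
 * nl/number and re-including bl-nl.c. */
DEFINE_SUM(il, int)     /* defines il_sum */
DEFINE_SUM(dl, double)  /* defines dl_sum */
```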
Pure int bl_datasize(const bl* list) {
if (!list)
return 0;
return list->datasize;
}
void bl_split(bl* src, bl* dest, size_t split) {
bl_node* node;
size_t nskipped;
size_t ind;
size_t ntaken = src->N - split;
node = find_node(src, split, &nskipped);
ind = split - nskipped;
if (ind == 0) {
// this whole node belongs to "dest".
if (split) {
// we need to get the previous node...
bl_node* last = find_node(src, split-1, NULL);
last->next = NULL;
src->tail = last;
} else {
// we've removed everything from "src".
src->head = NULL;
src->tail = NULL;
}
} else {
// create a new node to hold the second half of the items in "node".
bl_node* newnode = bl_new_node(dest);
newnode->N = (node->N - ind);
newnode->next = node->next;
memcpy(NODE_CHARDATA(newnode),
NODE_CHARDATA(node) + (ind * src->datasize),
newnode->N * src->datasize);
node->N -= (node->N - ind);
node->next = NULL;
src->tail = node;
// to make the code outside this block work...
node = newnode;
}
// append it to "dest".
if (dest->tail) {
dest->tail->next = node;
dest->N += ntaken;
} else {
dest->head = node;
dest->tail = node;
dest->N += ntaken;
}
// adjust "src".
src->N -= ntaken;
src->last_access = NULL;
}
void bl_init(bl* list, int blocksize, int datasize) {
list->head = NULL;
list->tail = NULL;
list->N = 0;
list->blocksize = blocksize;
list->datasize = datasize;
list->last_access = NULL;
list->last_access_n = 0;
}
bl* bl_new(int blocksize, int datasize) {
bl* rtn;
rtn = malloc(sizeof(bl));
if (!rtn) {
printf("Couldn't allocate memory for a bl.\n");
return NULL;
}
bl_init(rtn, blocksize, datasize);
return rtn;
}
void bl_free(bl* list) {
if (!list) return;
bl_remove_all(list);
free(list);
}
void bl_remove_all(bl* list) {
bl_node *n, *lastnode;
lastnode = NULL;
for (n=list->head; n; n=n->next) {
if (lastnode)
bl_free_node(lastnode);
lastnode = n;
}
if (lastnode)
bl_free_node(lastnode);
list->head = NULL;
list->tail = NULL;
list->N = 0;
list->last_access = NULL;
list->last_access_n = 0;
}
void bl_remove_all_but_first(bl* list) {
bl_node *n, *lastnode;
lastnode = NULL;
if (list->head) {
for (n=list->head->next; n; n=n->next) {
if (lastnode)
bl_free_node(lastnode);
lastnode = n;
}
if (lastnode)
bl_free_node(lastnode);
list->head->next = NULL;
list->head->N = 0;
list->tail = list->head;
} else {
list->head = NULL;
list->tail = NULL;
}
list->N = 0;
list->last_access = NULL;
list->last_access_n = 0;
}
static void bl_remove_from_node(bl* list, bl_node* node,
bl_node* prev, int index_in_node) {
// if we're removing the last element at this node, then
// remove this node from the linked list.
if (node->N == 1) {
// if we're removing the first node...
if (prev == NULL) {
list->head = node->next;
// if it's the first and only node...
if (list->head == NULL) {
list->tail = NULL;
}
} else {
// if we're removing the last element from
// the tail node...
if (node == list->tail) {
list->tail = prev;
}
prev->next = node->next;
}
bl_free_node(node);
} else {
int ncopy;
// just remove this element...
ncopy = node->N - index_in_node - 1;
if (ncopy > 0) {
memmove(NODE_CHARDATA(node) + index_in_node * list->datasize,
NODE_CHARDATA(node) + (index_in_node+1) * list->datasize,
ncopy * list->datasize);
}
node->N--;
}
list->N--;
}
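When a node holds more than one element, removal is a `memmove` that shifts the tail of the block down over the removed slot, exactly as in the `else` branch above. The same pattern on a bare array; `remove_at` is an illustrative helper, not part of the bl API:

```c
#include <assert.h>
#include <string.h>
#include <stddef.h>

/* Remove element 'index' from an array of 'n' elements of size
 * 'elemsize' by shifting the tail down over the hole.
 * Returns the new element count. */
static int remove_at(void* data, int n, int index, size_t elemsize) {
    int ncopy = n - index - 1;   /* elements sitting above the hole */
    if (ncopy > 0)
        memmove((char*)data + index * elemsize,
                (char*)data + (index + 1) * elemsize,
                ncopy * elemsize);
    return n - 1;
}
```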
void bl_remove_index(bl* list, size_t index) {
// find the node (and previous node) at which element 'index'
// can be found.
bl_node *node, *prev;
size_t nskipped = 0;
for (node=list->head, prev=NULL;
node;
prev=node, node=node->next) {
if (index < (nskipped + node->N))
break;
nskipped += node->N;
}
assert(node);
bl_remove_from_node(list, node, prev, index-nskipped);
list->last_access = NULL;
list->last_access_n = 0;
}
void bl_remove_index_range(bl* list, size_t start, size_t length) {
// find the node (and previous node) at which element 'start'
// can be found.
bl_node *node, *prev;
size_t nskipped = 0;
list->last_access = NULL;
list->last_access_n = 0;
for (node=list->head, prev=NULL;
node;
prev=node, node=node->next) {
if (start < (nskipped + node->N))
break;
nskipped += node->N;
}
// begin by removing any indices that are at the end of a block.
if (start > nskipped) {
// we're not removing everything at this node.
size_t istart;
size_t n;
istart = start - nskipped;
if ((istart + length) < node->N) {
// we're removing a chunk of elements from the middle of this
// block. move elements from the end into the removed chunk.
memmove(NODE_CHARDATA(node) + istart * list->datasize,
NODE_CHARDATA(node) + (istart + length) * list->datasize,
(node->N - (istart + length)) * list->datasize);
// we're done!
node->N -= length;
list->N -= length;
return;
} else {
// we're removing everything from 'istart' to the end of this
// block. just change the "N" values.
n = (node->N - istart);
node->N -= n;
list->N -= n;
length -= n;
start += n;
nskipped = start;
prev = node;
node = node->next;
}
}
// remove complete blocks.
for (;;) {
size_t n;
bl_node* todelete;
if (length == 0 || length < node->N)
break;
// we're skipping this whole block.
n = node->N;
length -= n;
start += n;
list->N -= n;
nskipped += n;
todelete = node;
node = node->next;
bl_free_node(todelete);
}
if (prev)
prev->next = node;
else
list->head = node;
if (!node)
list->tail = prev;
// remove indices from the beginning of the last block.
// note that we may have removed everything from the tail of the list,
// so "node" may be null.
if (node && length>0) {
//printf("removing %i from end.\n", length);
memmove(NODE_CHARDATA(node),
NODE_CHARDATA(node) + length * list->datasize,
(node->N - length) * list->datasize);
node->N -= length;
list->N -= length;
}
}
static void clear_list(bl* list) {
list->head = NULL;
list->tail = NULL;
list->N = 0;
list->last_access = NULL;
list->last_access_n = 0;
}
void bl_append_list(bl* list1, bl* list2) {
list1->last_access = NULL;
list1->last_access_n = 0;
if (list1->datasize != list2->datasize) {
printf("Error: cannot append bls with different data sizes!\n");
assert(0);
exit(0);
}
if (list1->blocksize != list2->blocksize) {
printf("Error: cannot append bls with different block sizes!\n");
assert(0);
exit(0);
}
// if list1 is empty, then just copy over list2's head and tail.
if (list1->head == NULL) {
list1->head = list2->head;
list1->tail = list2->tail;
list1->N = list2->N;
// remove everything from list2 (to avoid sharing nodes)
clear_list(list2);
return;
}
// if list2 is empty, then do nothing.
if (list2->head == NULL)
return;
// otherwise, append list2's head to list1's tail.
list1->tail->next = list2->head;
list1->tail = list2->tail;
list1->N += list2->N;
// remove everything from list2 (to avoid sharing nodes)
clear_list(list2);
}
static bl_node* bl_new_node(bl* list) {
bl_node* rtn;
// merge the mallocs for the node and its data into one malloc.
rtn = malloc(sizeof(bl_node) + list->datasize * list->blocksize);
if (!rtn) {
printf("Couldn't allocate memory for a bl node!\n");
return NULL;
}
//rtn->data = (char*)rtn + sizeof(bl_node);
rtn->N = 0;
rtn->next = NULL;
return rtn;
}
static void bl_append_node(bl* list, bl_node* node) {
node->next = NULL;
if (!list->head) {
// first node to be added.
list->head = node;
list->tail = node;
} else {
list->tail->next = node;
list->tail = node;
}
list->N += node->N;
}
/*
* Append an item to this bl node. If this node is full, then create a new
* node and insert it into the list.
*
* Returns the location where the new item was copied.
*/
void* bl_node_append(bl* list, bl_node* node, const void* data) {
void* dest;
if (node->N == list->blocksize) {
// create a new node and insert it after the current node.
bl_node* newnode;
newnode = bl_new_node(list);
newnode->next = node->next;
node->next = newnode;
if (list->tail == node)
list->tail = newnode;
node = newnode;
}
// space remains at this node. add item.
dest = NODE_CHARDATA(node) + node->N * list->datasize;
if (data)
memcpy(dest, data, list->datasize);
node->N++;
list->N++;
return dest;
}
void* bl_append(bl* list, const void* data) {
if (!list->tail)
// empty list; create a new node.
bl_append_node(list, bl_new_node(list));
// append the item to the tail. if the tail node is full, a new tail node may be created.
return bl_node_append(list, list->tail, data);
}
void* bl_push(bl* list, const void* data) {
return bl_append(list, data);
}
void bl_pop(bl* list, void* into) {
assert(list->N > 0);
bl_get(list, list->N-1, into);
bl_remove_index(list, list->N-1);
}
void bl_print_structure(bl* list) {
bl_node* n;
printf("bl: head %p, tail %p, N %zu\n", list->head, list->tail, list->N);
for (n=list->head; n; n=n->next) {
printf("[N=%i] ", n->N);
}
printf("\n");
}
void bl_get(bl* list, size_t n, void* dest) {
char* src;
assert(list->N > 0);
src = bl_access(list, n);
memcpy(dest, src, list->datasize);
}
static void bl_find_ind_and_element(bl* list, const void* data,
int (*compare)(const void* v1, const void* v2),
void** presult, ptrdiff_t* pindex) {
ptrdiff_t lower, upper;
int cmp = -2;
void* result;
lower = -1;
upper = list->N;
while (lower < (upper-1)) {
ptrdiff_t mid;
mid = (upper + lower) / 2;
cmp = compare(data, bl_access(list, mid));
if (cmp >= 0) {
lower = mid;
} else {
upper = mid;
}
}
if (lower == -1 || compare(data, (result = bl_access(list, lower)))) {
*presult = NULL;
if (pindex)
*pindex = -1;
return;
}
*presult = result;
if (pindex)
*pindex = lower;
}
/**
* Finds a node for which the given compare() function
* returns zero when passed the given 'data' pointer
* and elements from the list.
*/
void* bl_find(bl* list, const void* data,
int (*compare)(const void* v1, const void* v2)) {
void* rtn;
bl_find_ind_and_element(list, data, compare, &rtn, NULL);
return rtn;
}
ptrdiff_t bl_find_index(bl* list, const void* data,
int (*compare)(const void* v1, const void* v2)) {
void* val;
ptrdiff_t ind;
bl_find_ind_and_element(list, data, compare, &val, &ind);
return ind;
}
size_t bl_insert_sorted(bl* list, const void* data,
int (*compare)(const void* v1, const void* v2)) {
ptrdiff_t lower, upper;
lower = -1;
upper = list->N;
while (lower < (upper-1)) {
ptrdiff_t mid;
int cmp;
mid = (upper + lower) / 2;
cmp = compare(data, bl_access(list, mid));
if (cmp >= 0) {
lower = mid;
} else {
upper = mid;
}
}
bl_insert(list, lower+1, data);
return lower+1;
}
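The binary search in bl_insert_sorted maintains the invariant that `lower` ends at the last index whose element compares <= data, so the insertion point is `lower+1` and equal elements land after their existing duplicates. A sketch of the same invariant on a plain sorted int array; `insertion_point` is an illustrative helper:

```c
#include <assert.h>
#include <stddef.h>

/* Find where 'value' should be inserted to keep 'a' sorted, placing
 * it after any elements equal to it. */
static size_t insertion_point(const int* a, size_t n, int value) {
    ptrdiff_t lower = -1, upper = (ptrdiff_t)n;
    while (lower < upper - 1) {
        ptrdiff_t mid = (upper + lower) / 2;
        if (value >= a[mid])
            lower = mid;   /* a[mid] <= value: point is at or right of mid */
        else
            upper = mid;   /* a[mid] > value: point is left of mid */
    }
    return (size_t)(lower + 1);
}
```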
ptrdiff_t bl_insert_unique_sorted(bl* list, const void* data,
int (*compare)(const void* v1, const void* v2)) {
// This is just straightforward binary search - really should
// use the block structure...
ptrdiff_t lower, upper;
lower = -1;
upper = list->N;
while (lower < (upper-1)) {
ptrdiff_t mid;
int cmp;
mid = (upper + lower) / 2;
cmp = compare(data, bl_access(list, mid));
if (cmp >= 0) {
lower = mid;
} else {
upper = mid;
}
}
if (lower >= 0) {
if (compare(data, bl_access(list, lower)) == 0) {
return BL_NOT_FOUND;
}
}
bl_insert(list, lower+1, data);
return lower+1;
}
void bl_set(bl* list, size_t index, const void* data) {
bl_node* node;
size_t nskipped;
void* dataloc;
node = find_node(list, index, &nskipped);
dataloc = NODE_CHARDATA(node) + (index - nskipped) * list->datasize;
memcpy(dataloc, data, list->datasize);
// update the last_access member...
list->last_access = node;
list->last_access_n = nskipped;
}
/**
* Insert the element "data" into the list, such that its index is "index".
* All elements that previously had indices "index" and above are moved
* one position to the right.
*/
void bl_insert(bl* list, size_t index, const void* data) {
bl_node* node;
size_t nskipped;
if (list->N == index) {
bl_append(list, data);
return;
}
node = find_node(list, index, &nskipped);
list->last_access = node;
list->last_access_n = nskipped;
// if the node is full:
// if we're inserting at the end of this node, then create a new node.
// else, shift all but the last element, add in this element, and
// add the last element to a new node.
if (node->N == list->blocksize) {
int localindex, nshift;
bl_node* next = node->next;
bl_node* destnode;
localindex = index - nskipped;
// if the next node exists and is not full, then insert the overflowing
// element at the front. otherwise, create a new node.
if (next && (next->N < list->blocksize)) {
// shift the existing elements up by one position...
memmove(NODE_CHARDATA(next) + list->datasize,
NODE_CHARDATA(next),
next->N * list->datasize);
destnode = next;
} else {
// create and insert a new node.
bl_node* newnode = bl_new_node(list);
newnode->next = next;
node->next = newnode;
if (!newnode->next)
list->tail = newnode;
destnode = newnode;
}
if (localindex == node->N) {
// the new element becomes the first element in the destination node.
memcpy(NODE_CHARDATA(destnode), data, list->datasize);
} else {
// the last element in this node is added to the destination node.
memcpy(NODE_CHARDATA(destnode), NODE_CHARDATA(node) + (node->N-1)*list->datasize, list->datasize);
// shift the end portion of this node up by one...
nshift = node->N - localindex - 1;
memmove(NODE_CHARDATA(node) + (localindex+1) * list->datasize,
NODE_CHARDATA(node) + localindex * list->datasize,
nshift * list->datasize);
// insert the new element...
memcpy(NODE_CHARDATA(node) + localindex * list->datasize, data, list->datasize);
}
destnode->N++;
list->N++;
} else {
// shift...
int localindex, nshift;
localindex = index - nskipped;
nshift = node->N - localindex;
memmove(NODE_CHARDATA(node) + (localindex+1) * list->datasize,
NODE_CHARDATA(node) + localindex * list->datasize,
nshift * list->datasize);
// insert...
memcpy(NODE_CHARDATA(node) + localindex * list->datasize,
data, list->datasize);
node->N++;
list->N++;
}
}
void* bl_access_const(const bl* list, size_t n) {
bl_node* node;
size_t nskipped;
node = find_node(list, n, &nskipped);
// grab the element.
return NODE_CHARDATA(node) + (n - nskipped) * list->datasize;
}
void bl_copy(bl* list, size_t start, size_t length, void* vdest) {
bl_node* node;
size_t nskipped;
char* dest;
if (length <= 0)
return;
node = find_node(list, start, &nskipped);
// we've found the node containing "start". keep copying elements and
// moving down the list until we've copied all "length" elements.
dest = vdest;
while (length > 0) {
size_t take, avail;
char* src;
// number of elements we want to take.
take = length;
// number of elements available at this node.
avail = node->N - (start - nskipped);
if (take > avail)
take = avail;
src = NODE_CHARDATA(node) + (start - nskipped) * list->datasize;
memcpy(dest, src, take * list->datasize);
dest += take * list->datasize;
start += take;
length -= take;
nskipped += node->N;
node = node->next;
}
// update the last_access member...
list->last_access = node;
list->last_access_n = nskipped;
}
int bl_check_consistency(bl* list) {
bl_node* node;
size_t N;
int tailok = 1;
int nempty = 0;
int nnull = 0;
// if one of head or tail is NULL, they had both better be NULL!
if (!list->head)
nnull++;
if (!list->tail)
nnull++;
if (nnull == 1) {
fprintf(stderr, "bl_check_consistency: head is %p, and tail is %p.\n",
list->head, list->tail);
return 1;
}
N = 0;
for (node=list->head; node; node=node->next) {
N += node->N;
if (!node->N) {
// this block is empty.
nempty++;
}
// are we at the last node?
if (!node->next) {
tailok = (list->tail == node) ? 1 : 0;
}
}
if (!tailok) {
fprintf(stderr, "bl_check_consistency: tail pointer is wrong.\n");
return 1;
}
if (nempty) {
fprintf(stderr, "bl_check_consistency: %i empty blocks.\n", nempty);
return 1;
}
if (N != list->N) {
fprintf(stderr, "bl_check_consistency: list->N is %zu, but sum of blocks is %zu.\n",
list->N, N);
return 1;
}
return 0;
}
int bl_check_sorted(bl* list,
int (*compare)(const void* v1, const void* v2),
int isunique) {
size_t i, N;
size_t nbad = 0;
void* v2 = NULL;
N = bl_size(list);
if (N)
v2 = bl_access(list, 0);
for (i=1; i<N; i++) {
void* v1 = v2;
int cmp;
v2 = bl_access(list, i);
cmp = compare(v1, v2);
if (isunique) {
if (cmp >= 0) {
nbad++;
}
} else {
if (cmp > 0) {
nbad++;
}
}
}
if (nbad) {
fprintf(stderr, "bl_check_sorted: %zu are out of order.\n", nbad);
return 1;
}
return 0;
}
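bl_check_sorted scans adjacent pairs, requiring strictly ascending order when `isunique` is set and non-descending order otherwise. The same rule on a flat int array; `count_unsorted` is an illustrative helper:

```c
#include <assert.h>
#include <stddef.h>

/* Return how many adjacent pairs violate the ordering rule:
 * strict ascending if isunique, else non-descending. */
static size_t count_unsorted(const int* a, size_t n, int isunique) {
    size_t i, nbad = 0;
    for (i = 1; i < n; i++) {
        /* three-way compare of a[i-1] vs a[i]: -1, 0 or +1 */
        int cmp = (a[i-1] > a[i]) - (a[i-1] < a[i]);
        if (isunique ? (cmp >= 0) : (cmp > 0))
            nbad++;
    }
    return nbad;
}
```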
static void memswap(void* v1, void* v2, int len) {
unsigned char tmp;
unsigned char* c1 = v1;
unsigned char* c2 = v2;
int i;
for (i=0; i<len; i++) {
tmp = *c1;
*c1 = *c2;
*c2 = tmp;
c1++;
c2++;
}
}
void bl_reverse(bl* list) {
// reverse each block, and reverse the order of the blocks.
pl* blocks;
bl_node* node;
bl_node* lastnode;
ptrdiff_t i;
blocks = pl_new(256);
// reverse each block
for (node=list->head; node; node=node->next) {
for (i=0; i<(node->N/2); i++) {
memswap(NODE_CHARDATA(node) + i * list->datasize,
NODE_CHARDATA(node) + (node->N - 1 - i) * list->datasize,
list->datasize);
}
pl_append(blocks, node);
}
// reverse the blocks
lastnode = NULL;
for (i=pl_size(blocks)-1; i>=0; i--) {
node = pl_get(blocks, i);
if (lastnode)
lastnode->next = node;
lastnode = node;
}
if (lastnode)
lastnode->next = NULL;
pl_free(blocks);
// swap head and tail
node = list->head;
list->head = list->tail;
list->tail = node;
list->last_access = NULL;
list->last_access_n = 0;
}
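bl_reverse works in two phases: reverse the elements inside each block, then reverse the order of the blocks themselves. A sketch of the same two-phase idea on a flat array split into equal-sized blocks; `blocked_reverse` and `reverse_range` are illustrative helpers:

```c
#include <assert.h>

/* Reverse a[lo..hi] in place. */
static void reverse_range(int* a, int lo, int hi) {
    while (lo < hi) {
        int t = a[lo]; a[lo] = a[hi]; a[hi] = t;
        lo++; hi--;
    }
}

/* Reverse an array of nblocks equal-sized blocks: first within each
 * block, then the block order. The net effect is a full reversal. */
static void blocked_reverse(int* a, int nblocks, int blocksz) {
    int b, i;
    /* phase 1: reverse within each block */
    for (b = 0; b < nblocks; b++)
        reverse_range(a, b * blocksz, (b + 1) * blocksz - 1);
    /* phase 2: reverse the order of the blocks */
    for (b = 0; b < nblocks / 2; b++) {
        int other = nblocks - 1 - b;
        for (i = 0; i < blocksz; i++) {
            int t = a[b * blocksz + i];
            a[b * blocksz + i] = a[other * blocksz + i];
            a[other * blocksz + i] = t;
        }
    }
}
```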
void* bl_extend(bl* list) {
return bl_append(list, NULL);
}
// special-case pointer list accessors...
int bl_compare_pointers_ascending(const void* v1, const void* v2) {
void* p1 = *(void**)v1;
void* p2 = *(void**)v2;
if (p1 > p2) return 1;
else if (p1 < p2) return -1;
else return 0;
}
void pl_free_elements(pl* list) {
size_t i;
for (i=0; i<pl_size(list); i++)
free(pl_get(list, i));
}
size_t pl_insert_sorted(pl* list, const void* data,
int (*compare)(const void* v1, const void* v2)) {
ptrdiff_t lower, upper;
lower = -1;
upper = list->N;
while (lower < (upper-1)) {
ptrdiff_t mid;
int cmp;
mid = (upper + lower) / 2;
cmp = compare(data, pl_get(list, mid));
if (cmp >= 0) {
lower = mid;
} else {
upper = mid;
}
}
bl_insert(list, lower+1, &data);
return lower+1;
}
/*
void pl_set(pl* list, int index, void* data) {
int i;
int nadd = (index+1) - list->N;
if (nadd > 0) {
// enlarge the list to hold 'nadd' more entries.
for (i=0; i<nadd; i++) {
pl_append(list, NULL);
}
}
bl_set(list, index, &data);
}
*/
ptrdiff_t sl_last_index_of(sl* lst, const char* str) {
ptrdiff_t i;
for (i=sl_size(lst)-1; i>=0; i--) {
char* s = sl_get(lst, i);
if (strcmp(s, str) == 0)
return i;
}
return BL_NOT_FOUND;
}
// Returns 0 if the string is not in the sl, 1 otherwise.
// (same as sl_index_of(lst, str) > -1)
int sl_contains(sl* lst, const char* str) {
return (sl_index_of(lst, str) > -1);
}
void sl_reverse(sl* list) {
bl_reverse(list);
}
char* sl_append(sl* list, const char* data) {
char* copy;
if (data) {
copy = strdup(data);
assert(copy);
} else
copy = NULL;
pl_append(list, copy);
return copy;
}
void sl_append_array(sl* list, const char**strings, size_t n) {
size_t i;
for (i=0; i<n; i++)
sl_append(list, strings[i]);
}
void sl_set(sl* list, size_t index, const char* value) {
char* copy;
assert(index >= 0);
copy = strdup(value);
if (index < list->N) {
// we're replacing an existing value - free it!
free(sl_get(list, index));
bl_set(list, index, &copy);
} else {
// pad
size_t i;
for (i=list->N; i<index; i++)
sl_append(list, NULL);
bl_append(list, &copy);
}
}
void sl_remove_index_range(sl* lst, size_t start, size_t length) {
size_t i;
assert(lst);
assert(start + length <= lst->N);
assert(start >= 0);
assert(length >= 0);
for (i=0; i<length; i++)
free(sl_get(lst, start + i));
bl_remove_index_range(lst, start, length);
}
void sl_print(sl* list) {
bl_node* n;
size_t i;
for (n=list->head; n; n=n->next) {
printf("[\n");
for (i=0; i<n->N; i++)
printf(" \"%s\"\n", ((char**)NODE_DATA(n))[i]);
printf("]\n");
}
}
static char* sljoin(sl* list, const char* join, int forward) {
size_t start, end, inc;
size_t len = 0;
size_t i, N;
char* rtn;
size_t offset;
size_t JL;
if (sl_size(list) == 0)
return strdup("");
// step through the list forward or backward?
if (forward) {
start = 0;
end = sl_size(list);
inc = 1;
} else {
start = sl_size(list) - 1;
end = -1;
inc = -1;
}
JL = strlen(join);
N = sl_size(list);
for (i=0; i